Facebook deletes fake news accounts ahead of U.K. election

Steven Loeb · May 8, 2017 · Short URL: https://vator.tv/n/4986

If you're read but not shared, you're possibly fake

While Mark Zuckerberg initially dismissed Facebook's problem with fake news, and its role in potentially affecting the outcome of the U.S. election last year, the company is now taking that possibility very seriously when it comes to the upcoming general election in the U.K. in June. 

The company had already begun cracking down on the problem by introducing new tools to help its users identify and report news that may not be real. Now it has revealed some of the actions it has taken over the last month or so. 

If you're not read, you're not shared

According to a report from the BBC on Monday, Facebook has deleted tens of thousands of accounts in the United Kingdom that it found were disseminating fake news. Facebook also updated its system to flag sites that repeatedly post the same content, or that see "a sharp increase in messaging."

That also includes decreasing the ranking of stories that are read but not shared. If an article is shared, it's assumed it must have been read and, more importantly, considered real or worthy; if it's read but not shared, it's more likely to be fake.
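Facebook hasn't published the formula behind this signal, but the basic logic can be sketched in a few lines. The Python below is a toy illustration only; the function name, thresholds, and penalty values are assumptions made for demonstration, not anything drawn from Facebook's actual ranking system.

    # Illustrative sketch: a toy ranking signal inspired by the
    # "read but not shared" heuristic described above. All thresholds
    # and penalty values are made-up assumptions, not Facebook's system.

    def engagement_penalty(reads: int, shares: int,
                           min_reads: int = 1000,
                           max_penalty: float = 0.5) -> float:
        """Return a multiplier in (0, 1] that down-ranks stories
        with many reads but few shares."""
        if reads < min_reads:
            return 1.0  # too little data to judge
        share_rate = shares / reads
        # The lower the share rate, the closer the penalty gets to max_penalty.
        penalty = max_penalty * (1.0 - min(share_rate * 100, 1.0))
        return 1.0 - penalty

    # Example: a story read 50,000 times but shared only 20 times
    # gets its baseline relevance score scaled down.
    score = 0.87  # hypothetical score from an upstream ranker
    adjusted = score * engagement_penalty(reads=50_000, shares=20)
    print(f"adjusted score: {adjusted:.3f}")

In this toy version, a widely read story that almost nobody shares ends up with roughly half its original score, while a story with a healthy share rate passes through untouched.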

In addition, the company ran a full-page ad in U.K. newspapers on Monday with tips on how to spot fake news. These include: "be skeptical of headlines," "watch for unusual formatting," "check the date," and "investigate the source."

The timing of these actions is not random: they are being taken in advance of the general election set to take place in the United Kingdom on June 8, 2017.

It's a strategy that Facebook has deployed before, having taken out similar ads in both Germany and France ahead of their elections; France's presidential election took place this month.

Fake news on Facebook

The issue of fake news, or websites set up to deliberately spread false information to affect political outcomes around the world, first came up in the aftermath of the U.S. election in November. Following Donald Trump's surprise win, numerous articles began stating outright that Facebook was responsible, accusing the company of handing Trump his win.

Zuckerberg dismissed these claims.

"I’ve seen some of the stories you are talking about around this election. Personally, I think the idea that fake news on Facebook, of which it’s a very small amount of the content, influenced the election in any way is a pretty crazy idea," he said at Techonomy16 just a few days later, also noting that "voters make decisions based on their lived experience."

However, only a couple of days later, he announced that the company would indeed be cracking down on these sites.

In April, Facebook finally unveiled its plan to combat the spread of disinformation on its site. That included making it harder to sell likes on the platform, while also blocking bots from making fake Facebook accounts. The company began "identifying patterns of activity — without assessing the content itself."
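Facebook didn't detail what those activity patterns look like, but pattern-based flagging of this kind can be sketched without touching content semantics at all. The Python below is a hypothetical illustration; the function, thresholds, and time window are invented for demonstration and aren't taken from Facebook's announcement.

    # Illustrative sketch: flagging an account by activity patterns alone
    # (many identical posts, or a sudden burst of messages), without
    # assessing what the content says. All thresholds are assumptions.

    import hashlib
    from collections import Counter

    def looks_suspicious(posts: list[str],
                         timestamps: list[float],
                         repeat_threshold: int = 10,
                         burst_window: float = 60.0,
                         burst_threshold: int = 20) -> bool:
        """Return True if an account repeats the same post many times
        or sends a burst of messages within a short time window."""
        # Repetition check: hash each post so we compare bytes, not meaning.
        digests = Counter(hashlib.sha256(p.encode()).hexdigest() for p in posts)
        if digests and max(digests.values()) >= repeat_threshold:
            return True
        # Burst check: count messages inside a sliding time window (seconds).
        ts = sorted(timestamps)
        left = 0
        for right in range(len(ts)):
            while ts[right] - ts[left] > burst_window:
                left += 1
            if right - left + 1 >= burst_threshold:
                return True
        return False

The point of the sketch is that both checks operate purely on behavior, hashes of posts and message timing, which matches the spirit of identifying patterns "without assessing the content itself."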

Now we know that Facebook isn't just doing more to identify the problem, but also to actually fix it. However, it's also clear that simply deleting fake accounts isn't the whole answer: people need to be educated on how to spot false stories. A Pew study released in November of last year found that nearly a quarter of American adults had shared a fake news story, while only 39 percent said they were "very confident" in their ability to recognize fake news.

Similarly, a survey by Common Sense Media found that over 30 percent of young people had shared fake news.

VatorNews reached out to Facebook for confirmation of this report. We will update this story if we learn more. 

(Image source: mediaite.com)
