Facebook's Trending Topics algorithm didn't really get duped

Steven Loeb · August 29, 2016 · Short URL: https://vator.tv/n/46fa

It was a human review team that allowed a fake news story on Fox News' Megyn Kelly to slip through

One thing I've long admired about Facebook is that it has routinely added a human element, alongside machine learning, when making recommendations. That was the case with M, the Siri-like service it released last year, and with the event recommendations it launched in June. Machine learning is great, but it's still a machine, one that can't reason things out the way a person can.

However, humans are just as likely to mess things up, as Facebook learned on Monday, when a fake news story about Fox News anchor Megyn Kelly became a trending topic for many hours, much to the company's embarrassment.

The inaccuracy became big news because, just last Friday, Facebook had updated Trending Topics to remove the people who were curating that content, leading many to put the blame for this gaffe on the Trending Topics algorithm.

The simple fact is, though, that this was actually a human error, not a machine error. 

When Facebook announced that change, the company made it clear that it wasn't removing humans entirely from the equation.

"There are still people involved in this process to ensure that the topics that appear in Trending remain high-quality — for example, confirming that a topic is tied to a current news event in the real world," the company said.

That team screwed up on Monday, when Megyn Kelly became a Trending Topic on Facebook thanks to a story claiming that the Fox News anchor had been fired for being a "traitor," and for her falling ratings. The source of the report was that great bastion of news, endingthefed.com (I know that's where I go when I want to get the truth).

The story was fake, because of course it was. If it were true, the source probably would have been a bit more reputable. The story stayed up for a number of hours before Facebook finally removed it at around 9:30 Eastern time this morning.

So was this the fault of a computer? Actually, no. It was the Trending review team, which accepted the topic over the weekend, a Facebook spokesperson told me.

The team has review guidelines, and, based on those, the topic met the conditions for acceptance at the time because there were a sufficient number of relevant articles and posts around it (never mind that it was almost certainly not true).

It was only when the team re-reviewed the topic that it was marked to be revisited, and therefore no longer shown live, based on the likely inaccuracy of the articles behind it.

The company is now working to make its detection of hoax and satirical stories more accurate, the spokesperson said.

One would think that a human curating the content would have known that the story wasn't true, or at least been able to look into it more deeply to see if it was. The algorithm that picked the story couldn't do that; the review team could have, but didn't.

The really ironic thing about this incident is that Facebook got rid of human curation of Trending Topics in the first place because of conservatives fretting over reports of systematic bias.

Earlier this year, Facebook was accused, by a former contractor, of routinely preventing stories from conservative news outlets from appearing in its Trending Topics sidebar. The company denied any wrongdoing, and tried to smooth things over by meeting with conservative leaders.

It even went so far as to force its employees to take a class on getting rid of political bias, and to open up about what people were seeing on their News Feeds, and why, all in an effort to tamp down the story.

It seemed for a while that Facebook wasn't going to give in to the pressure to remove humans from the equation, but, late last week, the company finally caved. Its stated reasons for that decision focused on how difficult it was for humans to curate the news, given how much content there is to sift through. 

"Our goal is to enable Trending for as many people as possible, which would be hard to do if we relied solely on summarizing topics by hand. A more algorithmically driven process allows us to scale Trending to cover more topics and make it available to more people globally over time. This is something we always hoped to do but we are making these changes sooner given the feedback we got from the Facebook community earlier this year," Facebook wrote.

However, the company could not deny that the controversy had played a role as well.

"Earlier this year, we shared more information about Trending in response to questions about alleged political bias in the product. We looked into these claims and found no evidence of systematic bias. Still, making these changes to the product allows our team to make fewer individual decisions about topics. Facebook is a platform for all ideas, and we’re committed to maintaining Trending as a way for people to access a breadth of ideas and commentary about a variety of topics."

Still, there is a human team that is supposed to catch these kinds of problems. The twist is that it wound up letting a fake story about a prominent conservative trend, completely by accident.

(Image source: slideshare.net)
