Facebook seems to have an interesting habit of first denying a problem exists, but then solving it anyway.
That's what it did when it was accused of preventing stories from conservative news outlets from appearing in its Trending Topics sidebar earlier this year. The company denied any wrongdoing but eventually updated Trending Topics to remove the people who were curating that content.
Now, the same thing is happening with its "fake news" problem.
Just days after Mark Zuckerberg denied the notion that fake news on Facebook could have swayed the Presidential election, calling it "a crazy idea" and "pretty out there," he wrote a Facebook post in which he said that, actually, the company would be cracking down on such stories after all.
While reiterating his belief that fake stories are a "very small" amount of the content on the site, he pledged to purge them anyway since "we don't want any hoaxes on Facebook."
"Our goal is to show people the content they will find most meaningful, and people want accurate news. We have already launched work enabling our community to flag hoaxes and fake news, and there is more we can do here. We have made progress, and we will continue to work on this to improve further," he wrote.
Of course, what counts as the "truth" can be complicated and dependent on point of view, so Zuckerberg pledged to "proceed very carefully" with this mission.
"While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted," he said.
"An even greater volume of stories express an opinion that many will disagree with and flag as incorrect even when factual. I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves."
This controversy started after Donald Trump won the election last week, when numerous articles began stating outright that Facebook was responsible, accusing the company of handing Trump the election thanks to "fake news."
Speaking at Techonomy16 last week, Zuckerberg dismissed these claims.
"I’ve seen some of the stories you are talking about around this election. Personally, I think the idea that fake news on Facebook, of which it’s a very small amount of the content, influenced the election in any way is a pretty crazy idea," Zuckerberg responded, also noting that "voters make decisions based on their lived experience."
However, he knows that Facebook has a fake news problem; the company itself was the victim of a hoax not long ago.
In the wake of the conservative news scandal, Facebook removed the people who were curating its Trending Topics content, replacing it with a machine learning algorithm, which very quickly began spreading a false story about Fox News anchor Megyn Kelly.
To be clear, it wasn't actually the machine's fault: Facebook still has a Trending review team to vet each topic, and that team let the fake story through. The irony is that the company wound up publishing a fake story about a prominent conservative completely by accident.
Facebook has made a real effort to clean up the News Feed, so it's actually a little surprising that it took this long for the company to target fake news.
A few years ago, Facebook announced that it would be highlighting "high quality content" over other types, including memes. The company has also taken aim at spam on the Feed, giving the boot to "like-baiting," overly reshared content, and spammy links; clickbait headlines of the "You'll never believe what happened next!" variety; and promotional posts, such as those that push people to buy a product or install an app, or that reuse the exact same content from ads.
Despite Zuckerberg's claims to the contrary, there is some well-earned criticism to be levied at Facebook for pushing fake news, and it is finally going to do something about it.
(Image source: anaren.com)