The company will make it easier for users to report, while also using third-party fact checkers
It didn't take long for Facebook to go from denying the impact of fake news to agreeing to fix the problem to now actually outlining how it will go about it. This all happened in the span of roughly a week and a half. Maybe that's what happens when even the President of the United States starts calling you out.
Over the weekend, in a post on Facebook, Mark Zuckerberg outlined his plan for combating misinformation on the site, saying that the company does "take misinformation seriously."
"Our goal is to connect people with the stories they find most meaningful, and we know people want accurate information. We've been working on this problem for a long time and we take this responsibility seriously. We've made significant progress, but there is more work to be done."
The company, he said, used to rely on its users to report fake news, but that apparently isn't working anymore, not when more fake news was shared this past election season than real news.
So, he outlined a bunch of actions Facebook will be taking to stop the spread of misinformation, including being able to classify it better so it can "detect what people will flag as false before they do it themselves."
It will also make it easier to report fake news, something that even frequent Facebook users may not realize they can already do, so stepping up its game in that department is welcome. The company will also get some outside help, including from fact-checking organizations, to help verify what's true and what's not.
Users might also get warnings about stories that have been marked as fake, not that that will stop most people from still reading them. The company will also crack down on spam, and get more input from journalists "to better understand their fact checking systems and learn from them."
Ultimately, while Facebook is owning up to its own responsibility to give people accurate information, the company obviously doesn't see itself as a media company, but simply as a platform, and one that does not want to become the ultimate arbiter of truth versus fiction.
As Zuckerberg pointed out previously, what is truth and what is fiction may sometimes come down to a point of view. If Facebook does start cracking down, it may be accused of censoring certain political viewpoints, as it was earlier this year.
He reiterated his reluctance to wade into those waters again in his most recent post.
"The problems here are complex, both technically and philosophically. We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible," he wrote.
"We need to be careful not to discourage sharing of opinions or to mistakenly restrict accurate content. We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties."
The controversy over what gets posted on Facebook started after Donald Trump won the presidential election earlier this month, leading to numerous articles which stated outright that Facebook was responsible. They accused the company of handing Trump the election.
Speaking at Techonomy16, Zuckerberg dismissed these claims.
"I’ve seen some of the stories you are talking about around this election. Personally, I think the idea that fake news on Facebook, of which it’s a very small amount of the content, influenced the election in any way is a pretty crazy idea," Zuckerberg responded, also noting that "voters make decisions based on their lived experience."
However, only a couple of days later, he announced that the company would indeed be cracking down on these sites.
"Our goal is to show people the content they will find most meaningful, and people want accurate news. We have already launched work enabling our community to flag hoaxes and fake news, and there is more we can do here. We have made progress, and we will continue to work on this to improve further," Zuckerberg wrote in a Facebook note.
Facebook should know full well how easily fake news can spread across the site; the company itself was a victim of a hoax not that long ago.
In the wake of the controversy earlier this year, in which the company was accused of suppressing conservative news sites, Facebook removed the people who were curating its Trending Topics content and replaced them with a machine learning algorithm. That algorithm very quickly began spreading a false story about Fox News anchor Megyn Kelly.