
Facebook is stepping up efforts to curb violence on Live

The company is hiring 3,000 people to respond to reports of people hurting themselves or others

Financial trends and news by Steven Loeb
May 3, 2017 | Comments
Short URL: http://vator.tv/n/4981

There's a lot of talk out there right now about automation, machine learning and artificial intelligence, with the subtext being that human beings will no longer be necessary to carry out certain tasks. Oftentimes it feels like the need for the human element is dismissed, but I give a lot of credit to Facebook for not forgetting how important that part of the equation can be.

Having a real person in place behind the scenes seems to be part of the company's ethos. When it launched its Siri-like service, called M, back in 2015, it had people behind the scenes to assist. When its event recommendations feature was unveiled last year, it too was human curated. Now, in the midst of a bit of a crisis with its Live video feed, the company is once again calling on people to step up to help.

On Wednesday, Mark Zuckerberg revealed in a Facebook post that the company will be hiring 3,000 new workers over the next year, with the explicit job of responding to reports of violence and hate speech on Facebook Live. The company already has 4,500 people doing this job today, so the hires will increase the size of that team by roughly 66 percent.

The move comes in response to a series of incidents over the past few months, in which people using the feature did harm to themselves, or to others, while broadcasting it to the world. In many cases not only were the acts themselves not stopped, but the videos also remained up for many hours following the incidents.

One of the first reported incidents occurred in December, when a 12-year-old girl in Atlanta committed suicide on Facebook Live. A similar incident took place in January. Last month, two violent acts made major headlines: first, a man in Cleveland used the platform to broadcast a murder. That was followed by an even more horrific incident, in which a man in Thailand streamed himself killing his 11-month-old daughter before killing himself.

Despite all of these horrific incidents, Facebook had mostly stayed mum until today's announcement, though Zuckerberg did say last month that the company was going to take responsibility for what was broadcast on Facebook Live. Now, the company is finally taking some more concrete steps to help curb the problem. 

"Over the last few weeks, we've seen people hurting themselves and others on Facebook -- either live or in video posted later. It's heartbreaking, and I've been reflecting on how we can do better for our community," Zuckerberg wrote.

While Facebook has previously touted the use of artificial intelligence to help with suicide prevention on its platform, using pattern recognition to help identify which posts might contain people thinking of hurting themselves, Zuckerberg seems to recognize that, for now at least, actually having a person to respond to an alert would be quicker than relying on a machine.

"If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner -- whether that's responding quickly when someone needs help or taking a post down," he said.

In addition, Facebook will work more closely with community groups and law enforcement, while also adding better tools for reporting these types of incidents.

The human element is, of course, far from infallible, as Facebook knows all too well. Last year, a former contractor accused the company of routinely preventing stories from conservative news outlets from appearing in its Trending Topics sidebar. As a result, the company removed its human curators, though it did keep on a team to verify that the topics appearing in Trending are real. That team also messed up pretty badly.

In this case, though, it's hard to see a downside to having real people who can quickly and easily respond, and hopefully prevent any more tragedies from occurring.

(Image source: work.chron.com)

