
Facebook cleans up Live Video, bans fake posts

The company updates its policy to disallow videos of static, animated or looping images

Technology trends and news by Steven Loeb
May 15, 2017
Short URL: http://vator.tv/n/4991

As Facebook continues to deal with the fallout from its fake news scandal, it also has to contend with fake content that, while less damaging to its overall reputation, may actually have an impact on a more important aspect of its business. I'm speaking of live videos that aren't actually live and aren't actually videos.

Facebook has updated its Platform Policy to go after videos that only show images, whether they be static or animated, as well as those that are used to take user polls.

"Don't use the API to publish only images (ex: don't publish static, animated, or looping images), or to live-stream polls associated with unmoving or ambient broadcasts," the company now warns developers. 

These are not the only types of live videos that aren't live, of course; I've actually seen the same video on Facebook Live being rebroadcast on a loop, but the company makes it clear that this isn't a problem, only asking that developers, "Ensure any pre-recorded content is clearly distinguishable from live content."

So, why are static images more of a problem? Simply put, users don't like them, and that may hurt Live Video numbers over time, causing fewer people to want to tune in. 

In December, Facebook revealed that it had gotten feedback from users who said that "they don’t find graphics-only polls to be an interesting type of Live content." Those are the types of videos that simply ask people for reactions to a static image of a poll or question. In response, the company said it would "reduce the visibility of Live streams that consist entirely of graphics with voting," ranking them lower on News Feeds. 

Now it has taken the additional step of actually making those videos a violation of policy, essentially banning them. According to Ubergizmo, Facebook will be lenient at first with those who break this policy, only making their posts less visible, but may restrict access to Facebook Live for repeat offenders. 

Anything that hinders video on Facebook is a major problem for the company, as its strategy going forward revolves around video, and that includes everything from artificial intelligence to virtual reality.

"Photos and video are becoming more common than text, so the camera is becoming more central than the text box in all of our apps. In the Facebook app, you can now swipe right from News Feed to access our new camera with masks, frames, and filters. We've developed new computer vision tools that can apply the style of a painting to a photo or video, and we can do that in real time on your phone for the first time," Mark Zuckerberg said in a conference call following Facebook's Q1 earnings last week.

"This is part of making the camera the first augmented reality platform. We want to give developers the power to build all kinds of AR tools in the camera so more people can experience augmented reality on their phone. Creating the first open camera platform is a huge step forward, and we're excited to keep pushing augmented reality forward."

Live video has become especially important to Facebook, as the platform has been growing rapidly. One in every five Facebook videos is now a live broadcast and, over the past year, daily watch time for Facebook Live broadcasts has grown by more than four times.

Of course, Facebook has bigger problems when it comes to live video than just bad content. There has also been a series of incidents over the past few months in which people using the feature did harm to themselves, or to others, while broadcasting it to the world. In many cases, not only were the acts themselves not stopped, but the videos also remained up for many hours following the incidents.

One of the first reported incidents occurred in December, when a 12-year-old girl in Atlanta committed suicide on Facebook Live. A similar incident then took place in January. Last month, two violent acts made major headlines: first, a man in Cleveland used the platform to commit a murder. That was followed by an even more horrific incident, when a man in Thailand streamed himself killing his 11-month-old daughter, before killing himself.

Recently, Facebook announced that it will be hiring 3,000 new workers over the next year with the explicit job of responding to reports of violence and hate speech on Facebook Live, to help curb these types of incidents.

This change was first reported by TechCrunch on Monday. 

VatorNews reached out to Facebook for more information. We will update this story if we learn more. 

(Image source: entrepreneur.com)
