Facebook gets hit with $1B lawsuit for being soft on terror

Steven Loeb · July 11, 2016 · Short URL: https://vator.tv/n/4671

The families of terror victims have accused the social network of providing a platform for Hamas

The world is a pretty scary place these days, and it seems to be getting scarier all the time. There are so many terrorist attacks and shootings that it's almost becoming commonplace. Now, of course, there are a lot of things you can blame for that, but one of the most unlikely targets of anger has become social media.

Social networks have been hit with a number of lawsuits recently over their response, or apparent lack thereof, to terror groups using their platforms. The latest comes from relatives of victims of terrorist attacks in Israel, spanning over two years, who are suing Facebook for allegedly allowing Palestinian terrorists to use the network to spread hate and violence without any repercussions, ABC reported on Monday. 

"Facebook has knowingly provided material support and resources to Hamas in the form of Facebook’s online social network and communication services," the suit says. "Hamas has used and relied on Facebook’s online social network platform and communications services as among its most important tools to facilitate and carry out its terrorist activity, including the terrorist attacks in which Hamas murdered and injured victims and their families in this case."

The lawsuit was brought by the families of five people who were injured or killed in terrorist attacks in Israel. That includes Yaakov Naftali Fraenkel, a 16-year-old U.S. citizen who was kidnapped and killed in June 2014, and Taylor Force, a 29-year-old U.S. Army veteran who was stabbed to death in March of this year. 

"For years, Hamas, its leaders, spokesmen and members have openly maintained and used official Facebook accounts with little or no interference," the suit says.

In total, the families are seeking $1 billion in damages from the company. 

VatorNews has reached out to Facebook for comment on the lawsuit. As of the publication of ABC's story, the company had said that it had not yet received the lawsuit and could not comment on its merits.

"We have a set of Community Standards to help people understand what is allowed on Facebook, and we urge people to use our reporting tools if they find content that they believe violates our standards so we can investigate and take swift action," a spokesperson told ABC.

We will update this story if we receive any further comment from Facebook about this lawsuit. 

Terrorism on social media

Social media has taken a lot of heat in recent months in regards to terrorism, and the role these networks play in cracking down on hate speech.

This isn't the only lawsuit of this nature that Facebook is dealing with currently; last month the father of a woman killed in the terrorist attack in Paris in November of last year sued not only Facebook, but Google and Twitter as well, for allegedly providing "material support" to ISIS.

In January, Twitter also faced a similar lawsuit from the family of a Florida defense contractor, who was killed in a terrorist attack in Jordan in November. The company was accused of having "knowingly permitted the terrorist group ISIS to use its social network as a tool for spreading extremist propaganda, raising funds and attracting new recruits."

It's not just ordinary people who are taking the companies to task; even one of the leading candidates for President of the United States has specifically asked them to do more.

"The threat from radical jihadism has metastasized and become more complex and challenging. We are seeing the results of the radicalization, not just in far off lands, but right here at home, fueled by the Internet. It's the nexus of terrorism and technology," Hillary Clinton said in a speech earlier this year.

"They are using websites, social media, chat rooms and other platforms to celebrate beheadings, recruit future terrorists and call for attacks. We should work with host companies to shut them down," Clinton said.

In response, both companies have attempted to take a stronger stand on the issue. Facebook launched a new anti-hate speech initiative in Europe, while also pledging over 1 million euros (or $1.09 million) to support non-governmental organizations in their efforts to rid its platform of racist and xenophobic posts.

Twitter, meanwhile, has updated its rules on hate speech, including adding this statement: "You may not make threats of violence or promote violence, including threatening or promoting terrorism."

More importantly, it also announced that it had suspended over 125,000 accounts for threatening or promoting terrorist acts, primarily related to ISIS.

Whose responsibility is it, anyway?

These lawsuits bring up a very interesting, and important, question: how much responsibility does a social network actually have when it comes to policing what goes up on its platform? And is it culpable for anything that happens afterwards due to the nature of those posts?

Take Nextdoor, for example. The company has come under scrutiny for its users going to the platform to racially profile people in their neighborhoods, something the company has taken steps to address. 

At Vator Splash Spring in May, Nirav Tolia, co-founder and CEO of Nextdoor, talked to Vator founder and CEO Bambi Francisco about this issue, and how much responsibility Nextdoor has in potentially promoting such behavior.

Specifically, Francisco asked him to clarify what the responsibility is for Nextdoor to fix this problem, and what the government's responsibility is. His answer: Nextdoor can't fully control what people do, but it can attempt to steer them in the right direction.

"In regards to how much can we control, we can't control, broadly speaking, what people do in their neighborhoods, but we are a mirror of those conversations, in Oakland, and many parts of the country. We have over 60 percent of the country's neighborhoods using Nextdoor, and, in the top 100 cities, it's over 90 percent of those cities' neighborhoods," he said.

"So we tend to be a mirror of the conversations that are going on in these neighborhoods, and we use technology to try to educate people and make them aware of things. Then we have a chance to be, if not a small part of the solution, at least a positive force to move things in the right direction."

To really put in perspective just how much responsibility social networks actually should have for the actions of their users, there's this quote from Harmeet Dhillon, a lawyer and vice-chairman of the California Republican Party:

"The same argument could be used against phone companies for allowing alleged terrorists to place phone calls, or FedEx for allowing alleged terrorists to mail pamphlets," she told The Guardian regarding the lawsuit against Twitter.

Basically, just because a technology is somewhat new, doesn’t mean it should be held to a different standard.

(Image source: pcworld.com)
