Families of ISIS victims sue Twitter for being 'weapon for terrorism'
- 11 January, 2017 08:31
The families of three Americans killed in ISIS terror attacks are suing Twitter for allegedly knowingly providing support for the terrorist group and acting as a "powerful weapon for terrorism."
The suit was filed over the weekend in a federal court in New York City on behalf of the relatives of three U.S. nationals who were killed by ISIS in the March 22, 2016, terrorist attacks in Brussels and the Nov. 13, 2015, terrorist attacks in Paris. At least 32 people died in the Brussels attack and about 130 in the attack in Paris.
The suit alleges that Twitter has violated, and continues to violate, the U.S. Anti-Terrorism Act. The plaintiffs are asking for a jury trial and monetary damages to be determined at trial.
Twitter did not reply to a request for comment.
"Twitter's social media platform and services provide tremendous utility and value to ISIS as a tool to connect its members and to facilitate the terrorist group's ability to communicate, recruit members, plan and carry out attacks, and strike fear in its enemies," the suit alleges. "ISIS has used Twitter to cultivate and maintain an image of brutality, to instill greater fear and intimidation, and to appear unstoppable ..."
The lawsuit also contends that specifically for the Brussels and Paris attacks, ISIS used Twitter to issue threats, as well as to announce and celebrate the attacks.
The lawsuit was filed by the family of siblings Alexander Pinczowski and Sascha Pinczowski, who were killed in Brussels, and the family of Nohemi Gonzalez, who was killed in Paris.
Last year, another lawsuit was filed by Gonzalez's father against Twitter, Facebook and YouTube for allegedly knowingly allowing ISIS to "use their social networks as a tool for spreading extremist propaganda, raising funds and attracting new recruits."
In December, the families of three victims of the June shooting at the Pulse nightclub in Orlando, Florida, sued Facebook, Twitter and Google, the owner of YouTube, for allegedly "providing support to the Islamic State." Forty-nine people were killed in the attack.
The question, if any of these cases goes to trial, is whether a social network can be held responsible for the actions of its users.
"While I certainly can sympathize with the families, it's hard for me to see how Twitter can be held responsible for the rise of ISIS and their terror activities," said Dan Olds, an analyst with OrionX. "Let's imagine the world a few decades ago, before the internet. Would someone try to hold AT&T responsible for criminal activities that were planned over the telephone? Or is the printing press manufacturer responsible for magazines that encourage terrorism that were printed using presses they built and sold?"
In response to the attacks, Twitter took steps to prevent terrorists from using its network.
In August, the company reported that in the previous six months, it had suspended 235,000 accounts for violating its policies related to the promotion of terrorism.
That was in addition to 125,000 accounts that had been suspended since mid-2015, bringing the total number of terrorism-related suspended accounts to 360,000.
"We strongly condemn these acts and remain committed to eliminating the promotion of violence or terrorism on our platform," the company said in a blog post at the time.
Judith Hurwitz, an analyst with Hurwitz & Associates, said it would be a significant challenge for Twitter to keep terrorists completely off its site.
"Perhaps Twitter could do a better job identifying users who are terrorists," she said, saying the company would likely need advanced machine learning tools to weed out the bad players. "Of course, it would have to be advanced… Remember that terrorists are very good at adapting. If they are thrown off of the system, they can come back with a different persona and try to game the system."
Brad Shimmin, an analyst with Current Analysis, said social networks like Twitter, Facebook and Google can't be held responsible for their users' actions.
"There is no way of effectively policing those sites based upon affiliation or behavior," Shimmin said. "Twitter itself has gone to some extreme measures to single out and remove accounts engaged in this sort of thing. That will help, and I think such efforts are a moral responsibility for Twitter and other social networking vendors, but those actions can't rule out future misuse."
Olds said it would be impossible for Twitter to keep terrorists from using its site 100% of the time, but the company could do a better job of curtailing such use.
"Terrorist messages should be able to be rooted out with some solid language processing software," Olds said. "I'd like to see them do more along these lines. The technology is there, they just need to adapt it to anti-terrorist tasks."
If Twitter loses the lawsuit and is ordered to pay significant damages, the impact on other social networks would be chilling, he said.
"Social networks would be forced to keep a much closer eye on user activities and crack down on anything that could be interpreted as 'bad,'" Olds said. "The end result would be self-imposed censorship on the part of the nets, which would greatly upset many users. But I just don't see this happening – at least not with this case."