Supreme Court rejects lawsuit claiming Facebook provided terrorist forum support

The case accused Facebook of providing material support to terrorists through user-generated content.
Written by Charlie Osborne, Contributing Writer

The Supreme Court has rejected a lawsuit against Facebook that accused the company of providing support to terrorists by failing to control user-generated content. 

On Monday, the court declined to hear the case (.PDF), leaving in place the dismissal issued by the United States Court of Appeals for the Second Circuit. 

Force v. Facebook, as reported by The Verge, was first filed in 2016 by the families of US citizens killed in Palestinian attacks in Israel, alongside one survivor. 

The plaintiffs claimed that Hamas -- a militant group designated a terrorist organization by the US -- posted content on the social network encouraging attacks in Israel, and that Facebook 'assisted' the group by permitting this activity and by providing a communication forum. 

Section 230 of the US Communications Decency Act, designed to protect freedom of expression online, generally does not allow online platforms to be sued for user-generated content that the companies themselves have no hand in creating. 

The legislation states that "no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider," and it separately shields platforms that act in good faith to restrict access to "obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable" content. 

The plaintiffs argued that Facebook's algorithms may have connected users who interacted with the alleged Hamas material -- through likes or comments, for example -- with related content, and that this, in turn, should strip the company of its immunity from liability. 

The case also sought civil remedies under 18 U.S.C. § 2333 of the Anti-Terrorism Act, which permits US nationals injured by acts of international terrorism, or their heirs, to claim damages.

However, the appeals court disagreed, upholding Facebook's protected status and ruling that neither the algorithms' role as 'editorial' tools nor the matches and connections they forged between users and content created liability.

"We do not mean that Section 230 requires algorithms to treat all types of content the same," the appeals court said at the time. "To the contrary, Section 230 would plainly allow Facebook's algorithms to, for example, de‐promote or block content it deemed objectionable."

The plaintiffs petitioned the Supreme Court to reconsider the case, which was dismissed by the appeals court last year, but that ruling now stands.

Facebook's case now joins previously dismissed lawsuits against companies including Google and Twitter, which were found not liable over similar claims relating to ISIS activity and a terrorist-linked nightclub shooting.

Facebook has chosen not to comment beyond the court filings.

"Holding online platforms liable for what terrorists and their supporters post online -- and the violence they ultimately perpetrate -- would have dire repercussions: if online platforms no longer have Section 230 immunity in this context, those forums and services will take aggressive action to screen their users, review and censor content, and potentially prohibit anonymous speech," the EFF said in previous comments on the lawsuit. 
