
EU can force Facebook and social media platforms to remove content globally

This includes content that is posted by users outside of the European Union.
Written by Campbell Kwan, Contributor

The European Court of Justice has ruled that users of Facebook, or any other host provider, will be able to request that content be taken down if it is considered unlawful. The resulting blocks will be enforced not only in the individual's country of residence, but worldwide.

The court said on Thursday that host providers can be forced to block a piece of content globally if any of the European Union's national courts come to the decision that the content in question is defamatory or unlawful.

As the decision was made at Europe's highest court, it cannot be appealed.

The ruling initially arose from a dispute between Facebook and the Austrian politician Eva Glawischnig-Piesczek regarding content that was distributed on the social network. Glawischnig-Piesczek had requested that Facebook take down a piece of content that she believed was harmful to her reputation, but the social network failed to do so. She then sued Facebook in an Austrian court, which resulted in the content being blocked in Austria after it was found to be defamatory in nature.

With confirmation that the content on Facebook was defamatory, Glawischnig-Piesczek then demanded that Facebook erase the content worldwide, not just within the country, as well as any posts with "equivalent" remarks, which has culminated in the current decision.
 
Any takedown requests, according to the court, will be limited to information containing the elements specified within an injunction made by European courts, as well as any defamatory content of an equivalent nature.

To meet these requests, the court said host providers like Facebook will be required to block the content in question while also using their respective automated search tools and technologies to block any other content that is of equivalent nature.

"That injunction must be able to extend to information, the content of which, whilst essentially conveying the same message, is worded slightly differently, because of the words used or their combination, compared with the information whose content was declared to be illegal," the court said.

Host providers will not be required to monitor information generally though, nor will they be required to seek out illegal activity that is beyond the scope of the takedown requests, the court added.

In coming to its decision, the European Court of Justice said it believed that the EU rules are consistent with the rules applicable at the international level. As such, EU member states that wish to enforce any takedown requests will be required to adopt measures that take international rules into account.

The decision ups the ante for how the internet can be governed as European countries will now have the power to take down content internationally. It also places more responsibility on social media platforms like Facebook to patrol their sites for inappropriate content as regulation against harmful content continues to mount.

Facebook said in an emailed statement that the court decision "undermines the long-standing principle that one country does not have the right to impose its laws on speech on another country".

"It also opens the door to obligations being imposed on internet companies to proactively monitor content and then interpret if it is 'equivalent' to content that has been found to be illegal ...  We hope the courts take a proportionate and measured approach, to avoid having a chilling effect on freedom of expression," Facebook said.

While Facebook has pushed the narrative that it should not be held legally responsible for the material posted on its social media platforms, it has taken steps to control its content to address the growing scrutiny of governments around the world. 

A fortnight ago, it implemented various policy changes aimed at preventing the spread of terrorist and extremist content. The policy changes included an updated definition of terrorist organisations, improved use of technology to detect harmful online content, and an expansion of its content reviewing process. The changes to Facebook's content policies were prompted by the Christchurch Call that took place in May.

Last week, it rolled out new measures to curb foreign interference in elections in Singapore and also made the decision -- alongside other tech giants like Microsoft, Twitter, and YouTube -- to restructure its online counter-terrorism forum into an independent organisation overseen by various governments.

On the same day as the court decision, US Attorney General William Barr, along with officials from the UK and Australia, called for Facebook to delay its plans for end-to-end encryption.
