
Social media giants face €50m fines under new German law

New law requires hate speech to be taken down in 24 hours or networks will face huge fines.
Written by David Meyer, Contributor
Image: The Bundestag, Germany's parliament. (Getty Images)

Germany's parliament has passed a law forcing social networks to delete hate-speech postings and misinformation within 24 hours. The decision came on Friday, just ahead of the Bundestag's summer recess.

Digital rights activists and tech giants had furiously opposed the Network Enforcement Act (NetzDG), claiming it amounted to a clampdown on free expression because platforms such as Facebook would have too much incentive to take down posts without properly considering their context.

If the platforms don't remove objectionable material within a day, or within a week in particularly complex cases, they now face fines of up to €50 million.

Legal experts have suggested that the law may be unconstitutional, while further criticism has come from sources including the United Nations' special rapporteur on free expression, David Kaye.

Germany has strict hate speech laws that are, much like its tough privacy regulation, partly a legacy of the country's past. Chancellor Angela Merkel's government has long been struggling with the question of how to apply these laws online, where it is so easy for people to express their opinions.

At the same time, facing cases such as the false report that recently circulated about Arab migrants sexually assaulting German women, it has been trying to tackle the rise of "fake news" on social media.

Last year, the administration managed to get the big online players to agree to take down illegal content within 24 hours, but only on a self-regulatory basis. Although Facebook ploughed money into expanding its German content moderation team, the government decided that too many objectionable posts were staying up for too long, so it came up with this new law.

The government, which comprises an all-powerful coalition of Germany's two largest parties, made several amendments to the NetzDG a week ago to address some of the critics' concerns. The law no longer calls for automated content filters to stop illegal content being successfully posted in the first place. It now also anticipates the creation of a self-regulatory industry body, independent of the government and paid for by the tech companies, that could handle decisions about what should and shouldn't be taken down.

Social media companies fear there is still too much incentive to take down content that isn't clearly illegal, and point to a recent analysis by the European Commission, which suggested that they have been getting better at removing illegal content without new legislation and the threat of heavy fines.

A Facebook spokesperson noted that the social network has "made substantial progress in removing illegal content" and was "building better tools to keep our community safe and make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact the police if someone needs help".

"We believe the best solutions will be found when government, civil society and industry work together and that this law as it stands now will not improve efforts to tackle this important societal problem," the Facebook spokesperson said. "We feel that the lack of scrutiny and consultation do not do justice to the importance of the subject. We will continue to do everything we can to ensure safety for the people on our platform."

Activists aren't convinced either. European Digital Rights (EDRi) said just ahead of the vote that the oversight and accountability of this new industry body remained unclear. "In practice, the [NetzDG law] will keep promoting 'voluntary' measures by private companies to take down contents, with liability rules that incentivise such restrictions, without them being obligatory," it said. "This means that the results will be similar to those of the original proposal, but just more difficult to challenge in court."

"In the current version, upload and content filters would not be mandatory, but whether or not mandatory, they are likely to be applied by big companies like Facebook," EDRi warned. "These companies are, quite rationally, driven by the motivation to avoid liability, using the cheapest options available, and to exploit the political legitimisation of their restrictive measures for profit. This can only lead to privatised, unpredictable online censorship."
