EU's terrorism filter plans: The problems just keep coming

European authorities have discovered that creating rules to keep terrorist content off the internet is not easy.
Written by Cathrin Schaer, Contributor

A few weeks ago, German internet users discovered that their country's authorities had been keeping closer tabs on them than they realized.

In late April, in reply to a parliamentary question, the federal police – Germany's version of the FBI – revealed that they had quietly established a database for online terrorism referrals last October.

Almost immediately, the database – with its lack of oversight and definition – was held up as yet another illustration of where the EU is going wrong in its ongoing attempts to police the internet.

The German federal police explained that if they found content that could be classified as terrorist, or encouraging terrorism, they would ask the website hosting it to remove it.

Broadly, this process is what is known as a 'referral'. But websites are not legally obliged to remove the offending content, and just under half of them actually complied, the German authority noted.

The German federal police also kept a record of illicit content and requests to take it down in their new database, to help identify future reuploads of the same material and to track the progress of any removal. And since January this year, they have also passed all of that on to the EU Agency for Law Enforcement Cooperation, better known as Europol.

Since 2015, Europol has run a platform – the Internet Referral Management Application, or IRMa – to manage the various referrals coming in from European nations, so that their status can be tracked and duplicate referrals avoided.

All this activity was part of a pilot project developed in preparation for new European rules, the interior ministry, which oversees the federal police, explained in its reply to the official enquiry from left-wing German MPs.

Earlier in April, the European Parliament had taken another step toward finalizing those rules with a new "proposal for preventing the dissemination of terrorist content online".

The draft rules, "necessary to establish a legislative framework for cross-border cooperation", were first formulated in September 2018. Although they were supposed to be finalized by now, they are unlikely to be ready before 2020.

Part of these new rules involves the referrals system. The rules would formalize the process, allowing security authorities in all the member states, rather than just Europol, to send referrals to websites.

"So the different member states could send a referral to the host, saying, 'Hey, we think this is illegal. So please do whatever you think is necessary'," explains Diego Naranjo, a senior policy adviser at the umbrella European Digital Rights, or EDRi, network, based in Brussels.

"This outsources the monitoring work and puts pressure on the hosting companies, which will most likely block the content – just in case." That leads to what's known as 'over-blocking' and an erosion of online rights, critics like the EDRi argue.

Other wide-reaching aspects of the original draft EU proposal, some of them connected to the referrals system, have also come in for criticism. One complaint was that the proposed rules do not clearly define what terrorist content is.

So how did the German federal police decide what to refer during their pilot project, German politicians wanted to know. The almost 6,000 German referrals gathered between October 2018 and March 2019 were "based mainly on the federal police's own legal assessments," the officials answered in their written reply.

"There is certainly content that everyone agrees is abhorrent," Elisabeth Niekrenz, a researcher at German internet rights association, Digitale Gesellschaft, or Digital Society, told an audience at Europe's largest internet conference, re:publica, in Berlin recently.

"But we should not forget that terrorism is a very political description, something that states decide on, based on their own self-interest."

In its written statement, the interior ministry noted how much jihadist content had been passed on to Europol. On the face of it, this focus seems logical. After all, one of the original motivations for the new EU regulation was the devastating way the extremist group known as the Islamic State had harnessed social-media platforms to recruit members from all around the world.

But the document's jihadist focus could also be construed as worrying given the German interior ministry's recent record. Although German security agencies certainly do pursue right-wing extremist groups, there are also suspicions that there are right-wing sympathizers within their ranks.

Last September, the head of another agency overseen by the interior ministry was forced out of his job after he appeared to sympathize with right-wingers during anti-migrant violence in Chemnitz.

This has been something of an ongoing worry, with a major scandal in the 2000s about neo-Nazi murderers who went unpursued until far too late and, more recently, discoveries of right-wing cells inside the police and armed forces.

Besides a lack of definition, various civil- and digital-rights groups had plenty of other criticisms of the draft regulation. They said the rules were unworkable because of their rigidity on certain aspects – such as forcing sites to take offending content down within an hour and making the use of upload filters mandatory.

They also said the rules were too vague on other points, such as the lack of clarity about which online services were really responsible for hosting terrorist content. 

Originally, online forums, messenger services and even cloud infrastructure services could all be held responsible. Also, any national security agency could demand that whatever it considered terrorist content be blocked across Europe.

"Authoritarian EU countries – for example, Hungary – could ask to have unfavorable content deleted from online platforms in countries like Germany," Martin Scheinin, a law professor at the European University Institute in Florence, told Spiegel.

The upshot was that later in April, EU parliamentarians took heed of mounting criticism and adapted the proposal again. Changes included making upload filters no longer compulsory, removing the ability of one country to block content throughout the whole of the EU, and defining responsibility more narrowly to exclude cloud servers.

Additionally, only publicly available content can now be targeted, a change that removes private messenger services from the mix. The one-hour limit still applies, although it's been slightly softened. And the referrals system that the German federal police had apparently been preparing for was also removed.

However, the changes have done little to allay fears about the federal police's new database. As Niekrenz points out, in this case, and in other recent controversies – such as a recent take-down order sent by French authorities to the US-based Internet Archive, asking it to remove, among other things, Grateful Dead lyrics – the various services and states were all acting independently.

"The new measures are not going to make their databanks any more transparent either," she tells ZDNet. More oversight is needed, either by politicians or researchers, and we still need more information about what's in the databases, who puts it there and who has access to it, Niekrenz argues.

In fact, as Germany's Bitkom, which represents over 2,600 companies working in the digital economy, has argued, the new terrorist content rules might not even be necessary. Well-engineered, balanced e-commerce laws already cover a lot of illicit behavior, the organization said in a statement.

"Back in September, we put out a statement about exactly this too, asking, 'Why is the EU proposing yet another layer of rules?'," EDRi's Naranjo tells ZDNet. "And by the way, [we asked] 'Doesn't this all seem very close to the European elections?'."

For Niekrenz, the newest addition to the draft rules – where the definition of offensive content is both clarified and somewhat expanded to include 'presenting' terrorist content online – shows just how political, emotional, and potentially poorly considered the proposed new regulations are.

Niekrenz believes that the line about "presentation of terrorist content" was added as a reaction to the live-streaming of the mosque massacre in Christchurch, New Zealand.

"That is understandable," she says. But if you think about it, she continued, anyone who shared video of the planes crashing into New York's twin towers on September 11 would also be guilty of such an offence.
