
EU considers 60-minute deadline for social networks to remove terrorist content

The commission says that not enough progress has been made in stamping out extremist content.
Written by Charlie Osborne, Contributing Writer

No longer the carrot, now the stick: the European Commission is considering imposing an hour-long deadline for social networks to remove terrorist and extremist content after voluntary measures appear to have failed.

As reported by the Financial Times on Sunday, Facebook, Twitter, and YouTube, as well as smaller businesses, are all within the EU's sights.

This is the first time that technology firms and online services could be held directly responsible for how long terrorism-related content is allowed to circulate on social media.

On pain of heavy fines, social networks would need to detect and remove content such as videos, images, audio, and potentially live streams that promote or encourage extremism.

EU security commissioner Julian King told the publication that not enough progress has been made to clamp down on terrorist content, and "we cannot afford to relax or become complacent in the face of such a shadowy and destructive phenomenon."

Currently, the removal of extremist content is based on voluntary guidelines issued by the commission.

Back in March, the European Commission recommended operational measures for the removal of "terrorist content, incitement to hatred and violence, child sexual abuse material, counterfeit products and copyright infringement."

See also: Former Microsoft engineer sent behind bars for role in ransomware extortion scheme

The recommendations included easy and transparent systems for users to report illegal content, proactive tools for detection and removal, "trusted" reporters of extremist content, safeguards including human oversight to protect "fundamental rights," and voluntary tools to control the spread of extremist content.

At the time, the commission also suggested that, "as a general rule," illegal and extremist material should be removed within one hour.

CNET: Here's how Facebook defines terrorism -- and how it's responding

It appears that social networks may not have done enough to satisfy the commission, as these guidelines are now being considered for formal legislation. Failure to meet such standards could trigger stringent fines.

According to a senior EU official, the draft legislation is still being drawn up but is likely to include the one-hour rule for content flagged by law enforcement.

The crackdown comes at a time when many technology vendors claim that automated tools are handling the issue well.

Google, for example, claims that 90 percent of terrorism-related content uploaded to YouTube is detected and removed automatically, according to the FT. However, a recent study from the Counter Extremism Project (CEP) suggests that ISIS content is still being uploaded and remains available for hours afterward.

TechRepublic: Facebook's secret weapon for fighting terrorists: Human experts and AI working together

Over a three-month period, 1,348 ISIS videos were uploaded to the platform, garnering 163,391 views.

Facebook says it has removed 1.9 million pieces of content related to ISIS, al-Qaeda, and their affiliates; Twitter has closed 1.2 million accounts for promoting extremist content to date.

The new rules would need to be approved by the European Parliament and a majority of member states to pass into law.
