Security agency points out flaws in EU's 'right to be forgotten' plan

The EU’s proposals for helping people get their data completely deleted from Google, Facebook and others face technical challenges — especially in the era of big data, says ENISA.
Written by Karen Friar, Contributor

Technical problems could scupper Europe's proposed 'right to be forgotten', especially in an era of big data, according to a report from the EU's security advisory agency.

The right to be forgotten, put forward in the European Commission's upcoming Data Protection Regulation (PDF), would allow people who use social networks and other online services to get their photos, posts and other personal data completely deleted on request. On Tuesday, ENISA published a technical assessment of the proposals, identifying a whole range of problems likely to make the right difficult to put into practice.


"For any reasonable interpretation of the right to be forgotten, a purely technical and comprehensive solution to enforce the right in the open internet is generally impossible," the agency said in its report. "An interdisciplinary approach is needed, and policy makers should be aware of this fact."

While specific data such as images and publicly viewable posts are easily deleted, the situation becomes trickier when it comes to big data — that is, once information has been analysed and worked on to provide insights. It is possible to reconstruct personally identifiable information from data in large sets, according to ENISA.

"[One] question is how aggregated and derived forms of information (eg statistics) should be affected when some of the raw data from which statistics are derived are forgotten. Removing forgotten information from all aggregated or derived forms may present a significant technical challenge," the agency said.

"On the other hand, not removing such information from aggregated forms is risky, because it may be possible to infer the forgotten raw information by correlating different aggregated forms."
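The inference risk ENISA describes can be sketched with a toy example (the figures and scenario below are hypothetical, not from the report): if a service keeps publishing an aggregate statistic before and after a record is deleted, correlating the two releases can reveal the 'forgotten' value exactly.

```python
# Toy illustration of leaking a 'forgotten' record through retained
# aggregates. A service publishes the average and count of a data set;
# one record is then deleted and fresh aggregates are published.

salaries = [42_000, 55_000, 61_000, 48_000]  # hypothetical raw records

# Aggregates published before deletion
count_before = len(salaries)
avg_before = sum(salaries) / count_before

# One person exercises the right to be forgotten
forgotten = salaries.pop(0)

# Aggregates published after deletion
count_after = len(salaries)
avg_after = sum(salaries) / count_after

# Correlating the two aggregate releases recovers the deleted value
inferred = round(avg_before * count_before - avg_after * count_after)
assert inferred == forgotten  # the 'forgotten' salary is fully exposed
```

This is why the report argues that simply deleting raw records, while leaving derived statistics untouched, may not be enough to honour a deletion request.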

Google and Facebook, which sell ads targeted at users after analysing user data, will have to comply with people's requests to delete data, the European Commission said in February. The rules will apply to social networks and search engines, it said, but not to platforms that host content without processing it. The statement came after Google complained that the proposals made unreasonable demands on online service providers.

Too broad

According to ENISA, the main difficulty with the regulation is that it is too broad in its terms. For example, it doesn't say exactly who has the right to request removal, or what the acceptable ways of 'forgetting' data are, the agency noted.

The proposals take a couple of stabs at saying what 'personal data' is, describing it broadly as information that can be uniquely linked to an identifiable natural person. However, the definitions are not clear enough — something that is essential for enforcing the right to be forgotten, according to ENISA.

"They leave to interpretation whether it includes information that can be used to identify a person with high probability but not with certainty, eg a picture of a person or an account of a person's history, actions or performance," it said.

"Neither is it clear whether it includes information that identifies a person not uniquely, but as a member of a more or less small set of individuals, such as a family."

ENISA said policymakers and data protection authorities should work together to come up with clearer definitions, and to assess the cost of enforcing the right once those definitions are in place. It also recommended forcing search engines and other "sharing services" within the EU to filter references to 'forgotten' data stored outside the EU, and urged regulators to take an "interdisciplinary" approach to fine-tuning the proposals.
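The filtering idea ENISA recommends could work roughly like this minimal sketch (the URLs and data structures are hypothetical, for illustration only): an EU-based search service keeps a list of references to 'forgotten' data hosted outside the EU and strips matching entries from its results.

```python
# Minimal sketch of filtering references to 'forgotten' data.
# The URLs below are placeholders, not real forgotten content.

forgotten_urls = {
    "http://example.com/old-profile",
    "http://example.org/archived-post",
}

def filter_results(results):
    """Drop any search result whose URL refers to forgotten data."""
    return [r for r in results if r["url"] not in forgotten_urls]

results = [
    {"title": "Recent news", "url": "http://example.net/news"},
    {"title": "Old profile", "url": "http://example.com/old-profile"},
]

filtered = filter_results(results)
# Only the result that does not reference forgotten data remains
```

Even under this scheme, the underlying data still exists outside the EU; the filter only makes it harder to reach through EU services, which is part of why the report calls a purely technical solution impossible.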

Having been introduced in January, the Data Protection Regulation is now under discussion by European lawmakers before the proposals go before the full European Parliament.
