Big data: How firms can exploit it without breaking privacy laws

Summary: Many organisations are sitting on masses of personal data they can't use without falling foul of data-protection laws. Anonymising the data is a way round the problem, but doing it properly has proved tricky — until the recent release of new guidelines.

Steve Wood, head of policy at UK data watchdog the ICO, recently argued that "anonymisation has a crucial role to play in [the] data revolution". Indeed, a great deal has been written about the opportunities offered by big data — from fraud prevention and medical research to political canvassing — but the data-protection challenges it raises have received too little analysis and regulatory guidance.

Given the significant legal restrictions on handling data deemed 'personal', the area is a potential regulatory minefield for those who fail to appreciate the risks.

The ICO wants to offer pragmatic guidance to facilitate better data protection and recently published its code of practice, Anonymisation: managing data protection risk.

The aim of the Code is to provide guidance for organisations on how personal data can be anonymised successfully — and how to assess the risk of individuals being identified using data that has been anonymised.

The attraction of anonymised data is that it's no longer personal data and therefore falls outside the scope of the Data Protection Act 1998. Ensuring that data is properly anonymised, and not just masked, can be difficult to achieve in practice, particularly as technology is constantly evolving. The key challenge is ensuring anonymised data cannot be reidentified.
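The gap between masking and genuine anonymisation can be made concrete. In this illustrative sketch (not drawn from the Code; the records and field names are invented), direct identifiers such as names have been stripped, yet the remaining quasi-identifiers can still single out an individual — a simple group-size (k-anonymity) check exposes the problem:

```python
from collections import Counter

# Hypothetical 'masked' records: names removed, but quasi-identifiers
# (postcode district, birth year, sex) remain in the data.
records = [
    {"postcode": "SW1A", "birth_year": 1980, "sex": "F"},
    {"postcode": "SW1A", "birth_year": 1980, "sex": "F"},
    {"postcode": "EC2R", "birth_year": 1975, "sex": "M"},
]

def k_anonymity(rows, quasi_ids):
    """Smallest group size over the quasi-identifier combination.
    k == 1 means at least one record is unique on those fields,
    so masking alone has not anonymised the data."""
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in rows)
    return min(groups.values())

print(k_anonymity(records, ["postcode", "birth_year", "sex"]))  # 1: a unique record exists
```

A motivated intruder who knows one person's postcode district, birth year and sex could match them to the unique record, which is precisely the reidentification risk the Code asks organisations to assess.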

The anonymisation process

Organisations are also often uncertain about the legal basis for the anonymisation process itself, and whether anonymised data might constitute personal data.

The Code covers a number of these issues. Of particular interest is the guidance on how to deal with the risk of reidentification of anonymised data. Given that these risks may change over time, particularly with advances in technology, the ICO suggests that this risk is assessed periodically.

It recommends that organisations should apply the motivated-intruder test. This test requires organisations to consider whether individuals could be reidentified from the anonymised data by someone who is "reasonably competent, has access to resources, such as the internet" and would employ "investigative techniques, such as making enquiries of people who may have additional knowledge of the identity of the data subject". The motivated intruder is not assumed to have any specialist knowledge.

If an organisation reidentifies individuals from anonymised data without their knowledge or consent, the processing will probably be unlawful and may be subject to enforcement action, including a monetary penalty of up to £500,000.

Moreover, in borderline cases, where the consequences of reidentification may be significant — for example, where reidentification may leave an individual open to damage or distress — the Code urges organisations to seek consent from data subjects for the disclosure of data, explaining its possible consequences, and adopt a more rigorous form of risk analysis and a stronger anonymisation technique.

As a reminder that the ICO's enforcement powers extend beyond security breaches, it is worth noting that the ICO recently fined a company £50,000 not for a security breach, but for failing to maintain accurate data records.

It's also worth noting that, provided the anonymisation process does not cause unwarranted damage or distress, the Code concludes that consent is not required, a position that diverges from that of other European data-protection authorities.

Information governance and safeguards

The Code provides detailed guidance on the information governance structure and safeguards that organisations are expected to implement where they utilise anonymisation techniques, recommending the appointment of a senior information risk owner, staff training, procedures for identifying where anonymisation may be difficult, and privacy impact assessments.

In this context, the Code refers to the role trusted third parties may play: several organisations each anonymise their own personal data, yet the resulting data sets can still be linked using identifiers applied by the trusted third party, and then shared across the group.
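One common way such a trusted-third-party arrangement is implemented (a hedged sketch, not a method prescribed by the Code; the key, identifier format and field names are all assumptions) is keyed pseudonymisation: the third party replaces each direct identifier with a keyed hash, so contributors cannot reverse the mapping, yet identical identifiers always yield the same pseudonym and the data sets remain linkable:

```python
import hashlib
import hmac

# Assumption: this secret key is held ONLY by the trusted third party,
# never by the contributing organisations.
SECRET_KEY = b"held-only-by-the-trusted-third-party"

def pseudonym(identifier: str) -> str:
    """Deterministic keyed pseudonym for a direct identifier."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

# Two organisations submit records about the same (hypothetical) individual:
org_a = {"id": pseudonym("NHS-1234567"), "diagnosis": "asthma"}
org_b = {"id": pseudonym("NHS-1234567"), "prescription": "salbutamol"}

# The pseudonyms match, so the records can be linked and shared
# without either organisation seeing the underlying identifier.
assert org_a["id"] == org_b["id"]
```

The design point is that linkability is separated from identifiability: without the key, a contributor cannot recover the original identifier, which bears directly on the caution in the Code discussed below.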

These strategies need to be considered carefully: where one of the contributors is able to reidentify the data, it will remain personal data in their hands, and the usual data-protection obligations will need to be met.

The ICO's guidance sets out good practice recommendations that should be adopted by organisations to provide a "reasonable degree of confidence" that the publication and sharing of anonymised data will not lead to an "inappropriate disclosure of personal data."

While not legally binding, compliance with the Code will be considered a "best practice" and may influence enforcement. For those looking to take advantage of any personal data they have at their disposal, it is certainly a useful resource.

About

Bridget Treacy leads the UK Privacy and Information Management practice at international law firm Hunton & Williams. Her practice focuses on privacy, data protection, information governance and e-commerce issues for multinationals across a range of industries.
