Coronavirus contact-tracing apps: What are the privacy concerns?

Special smartphone apps could help to reduce the spread of COVID-19, but such moves could also have profound implications for individual privacy in the long term.
Written by Danny Palmer, Senior Writer

As the challenge of containing the coronavirus outbreak continues, governments around the world are looking to technology and smartphone apps to help trace the spread of the pandemic, in an effort to reduce the number of people who fall ill with COVID-19.

But while such efforts could be a key tool for governments in slowing coronavirus outbreaks, there are also concerns that gathering people's geolocation and other personal data to help manage the pandemic risks infringing on individual privacy more than ever before.

There are a number of technologies in development that may help track the virus. For example, Apple and Google recently revealed a joint initiative to develop contact-tracing technology for government health agencies. Using Bluetooth, contact-tracing applications are designed to identify potential COVID-19 hotspots – and alert people if they've been in close contact with someone known to be displaying coronavirus symptoms. The idea is that if people know they might have the virus, they'll take the appropriate action and isolate before potentially spreading it to others.
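The general flow – phones broadcasting rotating random identifiers over Bluetooth, and exposure matching happening on the device rather than a central server – can be sketched roughly as follows. This is a simplified illustration only, not the actual Apple/Google or NHS protocol; the `Phone` class and the token scheme are assumptions made for the sake of the example.

```python
import secrets

class Phone:
    """Highly simplified model of a decentralised contact-tracing app."""

    def __init__(self):
        self.my_tokens = []        # random identifiers this phone has broadcast
        self.heard_tokens = set()  # identifiers overheard from nearby phones

    def broadcast(self) -> str:
        # A fresh random token per broadcast: nothing identifying is sent.
        token = secrets.token_hex(16)
        self.my_tokens.append(token)
        return token

    def hear(self, token: str):
        # Record tokens from phones within Bluetooth range.
        self.heard_tokens.add(token)

    def report_positive(self) -> list:
        # On diagnosis, the phone publishes only its own broadcast tokens.
        return self.my_tokens

    def check_exposure(self, published_tokens) -> bool:
        # Matching happens locally; a server never learns who met whom.
        return not self.heard_tokens.isdisjoint(published_tokens)

alice, bob = Phone(), Phone()
bob.hear(alice.broadcast())           # Alice and Bob are in Bluetooth range
published = alice.report_positive()   # Alice later reports symptoms
print(bob.check_exposure(published))  # True: Bob would be alerted
```

In this decentralised design, the published list reveals only random tokens, not who their owner met – which is one reason the debate below over centralised versus decentralised systems matters.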

The UK government has also confirmed that the National Health Service is working with technology firms to develop a voluntary contact-tracing app to help control the spread of coronavirus.


"If you become unwell with the symptoms of coronavirus, you can securely tell this new NHS app," said Matt Hancock, secretary of state for health and social care, speaking at Number 10's daily coronavirus press conference, "and the app will then send an alert anonymously to other app users that you've been in significant contact with over the past few days, even before you had symptoms, so that they know and can act accordingly," he added.

Hancock explained that all data involved would only be used for NHS care and research and would be "handled according to the highest ethical and security standards" – and not held any longer than required.

However, security experts and privacy campaigners have voiced concerns about the use of contact tracing as a means of controlling the outbreak.

"Personally I feel conflicted. I recognise the overwhelming force of the public-health arguments for a centralised system, but I also have 25 years' experience of the NHS being incompetent at developing systems and repeatedly breaking their privacy promises when they do manage to collect some data of value to somebody else," wrote Ross Anderson, professor of security engineering at the University of Cambridge, in a blog post.

"This is why I'm really uneasy about collecting lots of lightly-anonymised data in a system that becomes integrated into a whole-of-government response to the pandemic. We might never get rid of it," he said.

One potential problem here is that even when data is anonymised, it can still be possible to analyse the information and trace it back to an individual.

Not only could this reveal whether someone has suffered from the virus – potentially something they wouldn't want to share with the outside world – but location tracking could detail where people go, how long they stay and who they interact with, in a way that could be considered active surveillance.
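A toy example shows why removing names alone does not guarantee anonymity: a pseudonymous trace of regularly visited places can act as a fingerprint when joined against auxiliary data. The records below are entirely hypothetical, invented to illustrate the linkage technique.

```python
# Hypothetical "anonymised" record: names stripped, but habitual
# locations retained.
anonymised_trace = {
    "pseudonym": "user_7f3a",
    "night_location": "Elm Street",      # where the device rests overnight
    "day_location": "Acme Ltd office",   # where it spends working hours
}

# An attacker with auxiliary knowledge (who lives where, who works where)
# can join the two datasets and undo the anonymisation.
directory = [
    {"name": "A. Smith", "home": "Elm Street",  "employer": "Acme Ltd office"},
    {"name": "B. Jones", "home": "Oak Avenue",  "employer": "Widget Co office"},
]

matches = [person["name"] for person in directory
           if person["home"] == anonymised_trace["night_location"]
           and person["employer"] == anonymised_trace["day_location"]]
print(matches)  # ['A. Smith'] -- the pseudonym is re-identified
```

The fewer people who share a given home/workplace pair, the more precisely a trace pins down one individual – which is why location data is considered especially hard to anonymise.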

"We are looking at technology that needs to be able to track your exact geographic location and the individuals you interact with in real time. This significantly intrudes into people's fundamental human right to a private life that is not monitored or controlled by the state," said Camilla Winlo, director of data protection and privacy at consultancy DQM GRC.

"There is always a need to balance a state's duty to protect its citizens with its duty to protect other freedoms, such as freedom of association, freedom of speech and the right to a private life – and tools that normalise intrusion into these areas change that balance," she added.

There are also concerns that intrusion could become normalised, especially if the crisis goes on for a long period and people become used to being monitored. Some governments could then be reluctant to give up the powers they're currently acquiring.

"While the public health value of looser privacy restrictions is clearly apparent, so are the risks," said Logan Finucan, senior manager of data and trust at Access Partnership, a global public policy firm specialising in technology.

"Once provided with extraordinary access, governments don't often want to give up new tools and authorities, creating the possibility for a new normal. Over time, there is a potential risk that law enforcement or surveillance authorities will seek to tap into such fonts of information."


Currently, the UK government's contact-tracing app is voluntary, and while it could help those who use it stay safe, it does create potential problems – particularly if the app doesn't contain security measures to prevent abuse. There's no obvious way to stop it being misused by pranksters, or by those wishing to create fear by pretending to have the virus when they don't.

Then there's also the risk that the technology companies helping to develop these applications won't be fully transparent about how they collect and use data: trust in big tech companies is not exactly high.

However, with technology, applications and surveillance currently in the limelight because of coronavirus, there's a potential benefit beyond halting the virus: the chance for a genuine debate over privacy versus convenience.

"It's extremely promising to see so much discussion around privacy and the amount of thought given to the impacts of Privacy by Design. This has been a unique opportunity for organisations to clearly see and experience the extra work involved in using technology that doesn't have privacy enabled by default," said Winlo.

"It's an opportunity for more people to see the advantages of technology and the potential it offers. But the risks it brings are also very real, and it could take months before we realise where they crystallised and who was affected by them," she added.

