
New rules on police use of facial recognition don't go far enough, experts say

The UK government has updated its code of practice on the use of surveillance cameras. Will it be enough to protect citizens' rights?
Written by Daphne Leprince-Ringuet, Contributor

The UK government has updated the rules surrounding the use of surveillance cameras by local authorities and the police, in an effort to address some of the concerns raised in recent years -- particularly about facial recognition technology. 

The new Surveillance Camera Code of Practice revises the version first published in 2013, marking the first time the document has been amended in eight years.

The code provides guidance on the use of camera systems that record or view images for surveillance purposes. CCTV cameras, for example, are included, along with automatic number plate recognition (ANPR) technologies, and any system that's used to store or process the information obtained by those devices.


Surveillance cameras are deployed extensively across the country, and the code of practice is designed to ensure that they are always used for the right reasons, and without impacting citizens' privacy. 

The new code is broadly similar to its predecessor: surveillance camera systems should always be used for legitimate purposes, such as national security or preventing disorder or crime; they should come with transparency and accountability imperatives; and the technology has to be regularly audited to make sure it is up to standard. 

The most significant change to the code concerns the use of live facial recognition (LFR) technologies by police officers. Where the previous rules provided little guidance on LFR, the new version clarifies some of the criteria that should govern police deployment of the technology.

LFR is typically used by the police to find criminals they are seeking, by comparing live camera feeds of faces against a predetermined watchlist. When the technology identifies a possible match to a person of interest, it generates an alert, enabling the police to deploy officers in real time.

The new code says that officers should only deploy LFR for lawful policing purposes, following an authorisation process -- and that they should publish both the categories of people included on the watchlist and the criteria on which the deployment of the technology is based.  

Any data that does not produce an alert against someone on the watchlist should be deleted instantaneously. Police officers also need to consider any potential adverse impact that LFR may have on protected groups.
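
As a rough illustration of the match-or-discard logic the code describes, the sketch below shows a hypothetical watchlist check in Python. Every name, vector and threshold is invented for illustration; real LFR systems rely on trained face-recognition models rather than toy distance comparisons.

# Illustrative sketch only: match a detected face against a watchlist,
# raise an alert on a hit, and retain nothing otherwise. All values are
# hypothetical and chosen purely to make the example runnable.
from dataclasses import dataclass
from math import dist
from typing import List, Optional


@dataclass
class WatchlistEntry:
    person_id: str           # identifier for the person of interest
    embedding: List[float]   # hypothetical face embedding


def process_face(face_embedding: List[float],
                 watchlist: List[WatchlistEntry],
                 threshold: float = 0.5) -> Optional[str]:
    """Return a matched person_id to raise an alert, or None."""
    for entry in watchlist:
        if dist(face_embedding, entry.embedding) < threshold:
            return entry.person_id  # alert: officers decide what happens next
    # No match: under the updated code, this biometric data should be
    # deleted immediately, so nothing is stored or logged here.
    return None


if __name__ == "__main__":
    watchlist = [WatchlistEntry("person-of-interest-001", [0.1, 0.9, 0.3])]
    print(process_face([0.12, 0.88, 0.31], watchlist))  # close match -> alert
    print(process_face([0.90, 0.10, 0.80], watchlist))  # no match -> None

Returning nothing and keeping no record for non-matches mirrors the code's requirement that data which does not produce an alert be deleted straight away.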

LFR challenges

According to the Surveillance Camera Commissioner, the guidance on LFR was added to reflect a recent challenge against the use of the technology by South Wales Police. 

Last year, Cardiff resident Ed Bridges won a court case against South Wales Police (SWP) after complaining that he had been filmed without his consent by a facial recognition van. According to Liberty, which acted as solicitors for Bridges, SWP has used the technology on more than 60 occasions since 2017 and may have captured sensitive facial biometric data from 500,000 people without their consent. 

The court found that the use of LFR breached privacy rights, data protection laws and equality laws, and that tighter rules were needed to manage the deployment of facial recognition technologies. 

Bridges' case was not the first to challenge police deployments of LFR. In 2019, the Information Commissioner's Office (ICO) launched an investigation into the use of facial recognition technology at King's Cross in London, citing the potential threat to privacy posed by the LFR-equipped cameras. 

At the time, Information Commissioner Elizabeth Denham wrote that the adoption of LFR by police forces should slow down and stated that existing laws were not sufficient to ensure an ethical deployment of the technology.


The updated Surveillance Camera Code of Practice has not gone unnoticed. "The recent update to the Surveillance Camera Code of Practice provides important additional guidance around the use of biometric technologies such as live facial recognition in surveillance cameras," independent research organisation the Ada Lovelace Institute said in a statement. 

"However, far more still needs to be done to ensure that the rules concerning the use of a technology as powerful and controversial as live facial recognition are clear, comprehensive and provide adequate protections from potential harms." 

A recent survey of 4,000 UK adults carried out by the Ada Lovelace Institute found that while many support the use of facial recognition in contexts such as policing, more than half (55%) want the government to limit how it can be used.  

The updated code of practice doesn't go far enough to clarify the boundaries of the technology, the Institute said, arguing that this shows the existing legal framework governing the use of LFR in the UK is not fit for purpose and has failed to keep pace with technological advances.

Tamara Quinn, a partner specialising in data privacy at international law firm Osborne Clarke, concurs. "The wording in the draft Code of Practice doesn't provide anywhere near the sort of detail that police forces need to consider when deploying live facial recognition," Quinn tells ZDNet, although she adds that other helpful guidance exists, such as the Opinions issued by the Information Commissioner.

Blanket ban?

Some advocacy groups have called for the technology to be banned entirely. Earlier this year, for example, the College of Policing opened a public consultation on the use of LFR, to which a coalition comprising Privacy International, Liberty, Defend Digital Me, Open Rights Group and Big Brother Watch responded by calling for a blanket ban on the technology. The coalition argued that LFR can 'never' be deployed safely in public spaces. 

And non-profit organisation Access Now has long advocated for a global ban on facial recognition and other biometric recognition tools, arguing that these technologies enable mass surveillance and pose a threat to human rights and civil liberties. More than 200 civil society organisations, activists and experts have now joined Access Now's 'Ban Biometric Surveillance' campaign. 

A Home Office spokesperson told ZDNet: "The Government is committed to empowering the police to use new technology to keep the public safe, whilst maintaining public trust, and we are currently consulting on the Surveillance Camera Code." 

"All users of surveillance camera systems including live facial recognition (LFR) are required to comply with strict data protection legislation." 

Around the world, some governments are also looking at implementing stronger regulation of LFR. Last April, for example, the EU published draft regulations on the use of artificial intelligence, which included a ban on facial recognition when used in public spaces, in real time, and by law enforcement agencies -- although some exceptions were criticised as loopholes. 

The UK government is now welcoming comments on the updated code of practice, with the intention of laying the document before Parliament in late autumn. 
