
Facial recognition could be most invasive policing technology ever, warns watchdog

UK's Information Commissioner's Office challenges the interpretation of a court ruling that gave the green light for using facial recognition on the public.
Written by Liam Tung, Contributing Writer

Police forces should be subject to a code of practice if they want to use live facial recognition technology on the public, according to the UK's Information Commissioner's Office (ICO). 

ICO commissioner Elizabeth Denham has released her opinion on the use of live facial recognition on the public by police in response to a recent High Court ruling that South Wales Police didn't violate human rights or UK law by deploying the technology in a public space. 

Denham argues facial recognition should be restricted to targeted deployments that are informed by intelligence and time-limited, rather than ongoing. She also reckons the High Court's decision "should not be seen as a blanket authorisation for police forces to use [live facial recognition] systems in all circumstances".


The case concerned police using live CCTV feeds to extract individuals' facial biometric information and match it against a watchlist of people of interest to police.

Large-scale trials of facial recognition tech by South Wales Police and the Metropolitan Police Service (Met) for public safety have irked some people who fear a dystopian future of mass surveillance combined with automated identification.

The ICO kicked off an investigation in August into the use of surveillance cameras to track commuters and passersby in London. Denham raised concerns over people being identified in public without their consent.

Surveillance cameras themselves make some people uncomfortable, but technology that automatically identifies people raises new questions for privacy in public spaces. The Met began trialling the tech on shoppers in London last Christmas.

Denham said live facial recognition was a significant change in policing techniques that raises "serious concerns". 

"Never before have we seen technologies with the potential for such widespread invasiveness. The results of that investigation raise serious concerns about the use of a technology that relies on huge amounts of sensitive personal information," she said. 


Denham argues the UK needs "a statutory and binding code of practice" for the technology's deployment, because current laws fail to manage the risks it poses.

The privacy watchdog will be pushing the idea of a code of practice with the UK's chief surveillance bodies, including policing bodies, the Home Office and the Investigatory Powers Commissioner. 

Denham argues in her opinion statement that for police to use facial recognition, they need to meet the threshold of "strict necessity" and also consider proportionality. She believes this threshold is likelier to be met in small-scale operations, such as when "police have specific intelligence showing that suspects are likely to be present at a particular location at a particular time." Another example is at airports, where live facial recognition supports "tailored security measures".
