
Police must stop facial recognition trials until better rules are in place, say lawmakers

MPs call for police to stop all live facial recognition trial deployments until proper regulations and oversight are in place.
Written by Liam Tung, Contributing Writer

The UK House of Commons Science and Technology Committee has called on police and other authorities to stop all trials of automatic facial recognition in public spaces. 

The committee of MPs also said there should be no further trials until regulations, guidance on trial rules, and a system for oversight and evaluation are in place.

"The legal basis for automatic facial recognition has been called into question, yet the Government has not accepted that there's a problem. It must," said MP, Norman Lamb, chair of the committee

"A legislative framework on the use of these technologies is urgently needed. Current trials should be stopped and no further trials should take place until the right legal framework is in place." 

The Metropolitan Police Service kicked off trials in central London last Christmas using NEC's Neoface facial recognition system. When people pass by cameras, their images are matched against a so-called 'watch list'. Images that match those in the watch list are said to be retained for 30 days, while all other images are supposedly deleted immediately.    
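In pseudocode terms, the retention policy described above amounts to keeping only watch-list matches and discarding everything else. The minimal Python sketch below illustrates that logic only; the function names, similarity threshold, and data structures are assumptions for illustration and are not details of NEC's Neoface system or the Met's actual implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Illustrative sketch of the retention policy described in the article:
# captures matching a watch-list entry are kept for up to 30 days,
# all other captures are discarded immediately.
MATCH_THRESHOLD = 0.8              # assumed similarity cut-off
RETENTION_PERIOD = timedelta(days=30)

@dataclass
class Capture:
    embedding: list[float]         # face embedding extracted from the camera feed
    taken_at: datetime

def similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two face embeddings."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def process_capture(capture: Capture, watch_list: list[list[float]],
                    retained: list[Capture]) -> None:
    """Retain a capture only if it matches a watch-list entry."""
    if any(similarity(capture.embedding, w) >= MATCH_THRESHOLD for w in watch_list):
        retained.append(capture)   # kept for up to 30 days
    # non-matching captures are simply never stored

def purge_expired(retained: list[Capture], now: datetime) -> list[Capture]:
    """Drop retained matches once the 30-day window has passed."""
    return [c for c in retained if now - c.taken_at <= RETENTION_PERIOD]
```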

Last week Home Secretary Sajid Javid said it was "right" that Met Police were trialling the controversial tech, even after a study of six live trials by researchers at the University of Essex found that matches were only correct in 20% of cases, as reported by The Guardian.   

There are concerns over the lack of legislation governing how police use facial recognition. 

UK Information Commissioner Elizabeth Denham warned this month that her office will take regulatory action if it finds that police deployments of live facial recognition do not comply with the UK's GDPR-based data protection laws. She launched an investigation into police use of the technology in December.

The Home Office last year released its 27-page "Biometrics Strategy" paper, in which it outlined its intent to "ensure that standards are in place to regulate the use of [automatic facial recognition] in identification before it is widely adopted for mainstream law enforcement purposes."

However, for now the Home Office believes the existing legal framework is sufficient for the public trials of automatic facial recognition to proceed.

The Commons committee slammed the Home Office's 2018 strategy document. 

"We would argue it is not really a strategy at all, lacking a coherent, forward looking vision and failing to address the legislative vacuum around new biometrics," it said. 

The committee wants the UK Government to follow the Scottish Government's approach and commission an independent review into the use and retention of biometric data, followed by a public consultation and then legislation. 

The committee is also concerned that police have "stalled on ensuring custody images of unconvicted people are weeded and deleted". It said police should allocate more resources to this process and that the government should invest in automatic deletion software.

"It is unclear whether police forces are unaware of the requirement to review custody images every six years, or if they are simply 'struggling to comply'. What is clear, however, is that they have not been afforded any earmarked resources to assist with the manual review and weeding process," the committee said.  

As it is, the MPs note that "the burden remains on individuals to know that they have the right to request deletion of their image."

"As we stated in 2018, this approach is unacceptable and we agree with the Biometrics Commissioner that its lawfulness requires further assessment."
