London's police will be testing out live facial recognition technology on Christmas shoppers today and tomorrow.
The Metropolitan Police Service said the test, which will cover areas in Soho, Piccadilly Circus and Leicester Square, is part of its ongoing trial of the technology.
When people pass through the area covered by the cameras, their images are streamed directly to a database in the police facial recognition system. This database contains a watch list of offenders wanted by the police and the courts for various offences.
The system measures the structure of each face, including the distances between the eyes, nose, mouth and jaw, to create facial data. When the system detects a face, it creates a digital version and searches it against the watch list. If a match is made, it sends an alert to an officer on the scene.
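The matching step described above can be sketched in a few lines of Python. This is an illustrative toy only, not the Met's actual system: it assumes each face has already been reduced to a numeric feature vector (an "embedding"), and the function name, similarity measure and 0.6 threshold are all assumptions for the sake of the example.

```python
import numpy as np

def match_against_watchlist(face_embedding, watchlist, threshold=0.6):
    """Return the best watch-list match for a detected face, or None.

    face_embedding: 1-D feature vector for the detected face.
    watchlist: dict mapping a person ID to their stored embedding.
    The cosine-similarity threshold of 0.6 is illustrative only.
    """
    best_id, best_score = None, threshold
    for person_id, stored in watchlist.items():
        # Cosine similarity between the live face and the stored template.
        score = np.dot(face_embedding, stored) / (
            np.linalg.norm(face_embedding) * np.linalg.norm(stored))
        if score > best_score:
            best_id, best_score = person_id, score
    # None means no alert; an ID would trigger an alert to an officer.
    return best_id

# Toy usage: two stored faces, one live detection.
watchlist = {"suspect_a": np.array([1.0, 0.0, 0.0]),
             "suspect_b": np.array([0.0, 1.0, 0.0])}
live = np.array([0.9, 0.1, 0.0])
print(match_against_watchlist(live, watchlist))  # suspect_a
```

Real systems use high-dimensional embeddings from trained neural networks, and the choice of threshold directly trades false alerts against missed matches, which is why the false-positive statistics discussed below matter so much.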
The police said the system will store faces matching the watch list for 30 days; all others are deleted immediately.
The test will run for eight hours each day. It's a public trial, so uniformed officers will hand out information leaflets, and posters about the technology will be displayed in the area.
This is not the first time the Met Police has tested facial recognition technology. It has been used at the Notting Hill Carnival in 2016 and 2017, on Remembrance Day 2017, and earlier this year at the Port of Hull docks and at Stratford station. The police want to run a total of 10 trials before the end of the year; future deployments under consideration include football matches and other sporting events, music festivals and stations.
Facial recognition is an emerging but highly controversial technology, especially given its rapid roll-out in China to monitor public spaces. In particular, the idea that everyone walking down a street should be scanned and matched against a database of suspects is seen by many as another erosion of their civil liberties. During the Met trial, anyone can refuse to be scanned, and actively avoiding the cameras is not an offence and will not be treated as 'obstruction'.
Civil liberties campaigners have described live facial recognition surveillance as authoritarian, dangerous, and of little practical use: according to statistics obtained through Freedom of Information requests in May, 98 percent of the facial recognition 'matches' were inaccurate.
Tech companies are also concerned about the uncontrolled use of facial recognition. Microsoft has called for legislation around its use and said police should only use it in public places with a court order, or in an emergency, such as where there is a risk of death or serious injury.
PREVIOUS AND RELATED COVERAGE
Microsoft advocates for government regulation of facial-recognition technology
Microsoft is advocating that Congress become involved in regulating facial-recognition technology, on the heels of criticism of potentially negative impacts of its own work in that area.
Passengers to clear customs using facial recognition tech at Japan's busiest airport
Those clearing customs at Japan's Narita International Airport will soon be able to use their face to prove their identity thanks to the rollout of facial recognition technology.
Lenovo announces unmanned convenience store to test AI, facial recognition
Lenovo said it will use the store to trial and hone its tablets, facial recognition, artificial intelligence, and e-payment technologies in a live retail environment.
Facial recognition tech to verify age for alcohol sales in UK supermarkets: report
The technology is touted as a means to cut down staff intervention at self-checkouts.
Facial recognition's failings: Coping with uncertainty in the age of machine learning (TechRepublic)
Why some machine-learning tech is falling short, and how we need to recalibrate our expectations.
Microsoft calls for regulation of facial-recognition technology (CNET)
The tech giant warns the technology could be used to create a 1984-like dystopia.