UK watchdog to investigate King's Cross facial recognition tech used to spy on public

Thousands of people pass through the busy London area on a daily basis.


The UK's Information Commissioner's Office (ICO) is launching an investigation into the use of facial recognition technology at King's Cross following media reports of the covert practice.

On Thursday, UK Information Commissioner Elizabeth Denham said the use of cameras to track commuters and passers-by in the busy London area "is a potential threat to privacy that should concern us all."

As first reported by the Financial Times, tens of thousands of people going about their daily lives are being identified through surveillance cameras in the area, without any notification that facial recognition technology is in use and without their consent or permission.

The King's Cross estate covers 67 acres and hosts a busy train and underground station. The area is also home to St. Pancras, which facilitates train journeys across the UK as well as the Eurostar service. 

The property developer for the area, Argent, said the cameras "use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public."


It is not known how many cameras are in use, how long facial recognition technology has been active, or what happens to the visitor information harvested through the surveillance web. 

The publication says that Canary Wharf developers are also in talks to use facial recognition-ready cameras. 

However, in the UK, such technology requires the consent of the individuals being monitored, and so in both cases the rules may have been flouted. 

The Metropolitan Police Service has also trialed facial recognition systems in central London. The purpose of the test, piloted in 2018, was to track offenders wanted by law enforcement and the courts. 


"Facial recognition technology is a priority area for the ICO and when necessary, we will not hesitate [to] use our investigative and enforcement powers to protect people's legal rights," Denham says. "As well as requiring detailed information from the relevant organizations about how the technology is used, we will also inspect the system and its operation on-site to assess whether or not it complies with data protection law."

The commissioner added that on a personal level, she remains "deeply concerned" about the enthusiastic adoption of facial recognition technologies not only by law enforcement but also by private companies. 


According to Denham, the ICO is also examining whether or not current legal frameworks are enough to balance adoption and "people's expectations about how their most sensitive personal data is used."

"Put simply, any organizations wanting to use facial recognition technology must comply with the law -- and they must do so in a fair, transparent and accountable way," the commissioner says. "They must have documented how and why they believe their use of the technology is legal, proportionate and justified."

Have a tip? Get in touch securely via WhatsApp | Signal at +447713 025 499, or over at Keybase: charlie0