
Controversial facial recognition tech firm Clearview AI inks deal with ICE

US Immigration and Customs Enforcement has spent $224,000 on Clearview licenses.
Written by Charlie Osborne, Contributing Writer

The US Department of Homeland Security (DHS) has signed a contract with Clearview AI to give Immigration and Customs Enforcement (ICE) access to the controversial facial recognition firm's technology. 

Tech Inquiry, a non-profit technology watchdog and rights outfit, spotted documents revealing the deal last week.

The $224,000 purchase order, signed on August 12, 2020, is for "Clearview licenses" relating to "information technology components," but no further information has been made public. The contract will last until September 4, 2021. 

Tech Inquiry has submitted a Freedom of Information Act (FOIA) request for the contracts and communication between Clearview AI and ICE relating to the award. According to the non-profit, ICE received four bids for the contract, and Clearview was selected. 


Putting facial recognition searches in the hands of ICE, a DHS agency already surrounded by controversy over its detention centers, its treatment of detained children, and 17 detainee deaths this year, could be an explosive combination. 

However, this is not the first time ICE has leaned on machine learning and facial recognition systems. Both the FBI and ICE have used state DMV records as a "goldmine" in the search for undocumented immigrants. 

New York-based Clearview AI provides a search engine built on a database of billions of photos scraped from publicly accessible sources across the internet. Clearview AI claims the service is only for "identifying perpetrators and victims of crimes" and has been used to track down "hundreds" of criminals.

"Clearview AI is not a surveillance system and is not built like one," the company says. "For example, analysts upload images from crime scenes and compare them to publicly available images."

Clearview AI CEO Hoan Ton-That told Business Insider that the technology is used by Homeland Security Investigations' (HSI) Child Exploitation Investigations Unit and that this has "enabled HSI to rescue children across the country from sexual abuse and exploitation."

While the tool is not available to the general public, regulators and privacy advocates alike have raised concerns that it crosses ethical lines. 


In May, the American Civil Liberties Union (ACLU) filed a lawsuit alleging that Clearview AI is violating the Illinois Biometric Information Privacy Act (BIPA) and "represent[s] an unprecedented threat to our security and safety."

Technology companies including Google, Microsoft, and Facebook have also sent cease-and-desist letters to the company, demanding that Clearview AI stop scraping images from their platforms and services.

IBM, Microsoft, and Amazon have pledged to stop selling facial recognition software to law enforcement agencies due to privacy and surveillance concerns. 


In July, the UK Information Commissioner's Office (ICO) and the Office of the Australian Information Commissioner (OAIC) announced a joint investigation into the startup and a data breach that occurred in February this year. 

The security incident exposed Clearview AI's client list, the majority of which are law enforcement agencies across the United States. Customer names, accounts, and the number of searches clients have made were leaked. 

In related news, researchers have developed a tool that makes subtle, pixel-level changes to the photos we post publicly online. Dubbed Fawkes, the software introduces tweaks invisible to the naked eye but substantial enough to prevent machine learning algorithms from linking the photos to individual identities. 
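For readers curious what an "invisible" tweak can look like, here is a minimal, hypothetical sketch in Python. It is not Fawkes' actual method, which computes targeted perturbations against a facial-recognition feature extractor; it only illustrates how a change of a few intensity levels per pixel stays below what the eye typically notices. The file names and the epsilon value are illustrative assumptions.

```python
# Conceptual sketch only -- NOT the Fawkes algorithm. Fawkes computes targeted
# "cloaks" against a feature extractor; this merely shows how a perturbation of
# a few intensity levels per pixel remains imperceptible to the naked eye.
import numpy as np
from PIL import Image

def add_small_perturbation(path_in: str, path_out: str, epsilon: int = 3) -> None:
    """Add bounded, low-amplitude noise (within ±epsilon of 255 levels) to a photo."""
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)

    # Uniform noise in [-epsilon, +epsilon]; at epsilon=3 each pixel shifts by
    # roughly 1% of the 8-bit range, well below normal visual perception.
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)

    perturbed = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(perturbed).save(path_out)

# Hypothetical file names for illustration.
add_small_perturbation("profile_photo.jpg", "protected_photo.jpg")
```

Random noise like this would not actually defeat a modern face-matching system; Fawkes' contribution is choosing the perturbation deliberately so that the altered photo maps to a different identity in feature space while still looking unchanged to humans.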

Have a tip? Get in touch securely via WhatsApp | Signal at +447713 025 499, or over at Keybase: charlie0

