Facial recognition: Now algorithms can see through face masks

The US Department of Homeland Security has carried out trials to test whether facial recognition algorithms could correctly identify masked individuals.
Written by Daphne Leprince-Ringuet, Contributor

The US Department of Homeland Security (DHS) is piloting facial recognition technologies that can see through face masks with a "promising" level of accuracy, meaning that travelers could end up breezing through airports without the need to uncover their mouths and noses at border checks.  

The trials were held as part of a yearly biometric technology rally organized by the Science and Technology Directorate (S&T), the research and development unit within the DHS. Every year since 2018, the rally has brought together experts, technology vendors and volunteers to test top-notch biometric systems and make sure they are up to the challenges posed by the use of facial recognition technology in a variety of scenarios.  

This year, in response to the new imperatives brought by the COVID-19 pandemic, the rally has focused on evaluating the ability of AI systems to reliably collect and match images of individuals wearing an array of different face masks, with a view to eventually deploying the technology in international airports around the country.  

During a 10-day event, 60 facial recognition configurations were tested with the help of almost 600 volunteers from 60 different countries. The technologies trialed were a mix-and-match of six different image collection systems, paired with ten different matching algorithms. They were evaluated on criteria ranging from the ability to snap a picture when presented with a human face to the reliability of the identification process.  

Volunteers were asked to present themselves both masked and unmasked. On average, said the DHS, the different AI systems correctly identified 93% of unmasked individuals; for those wearing a mask, the identification rate reached an average of 77%. 

The results, however, varied greatly from one system to another: the best-performing technology correctly identified individuals 96% of the time, even when they were wearing a mask, while the worst-performing system tested during the rally identified only 4% of masked individuals.  

"This isn't a perfect 100% solution," said Arun Vemury, director of S&T's Biometric and Identity Technology Center, "but it may reduce risks for many travelers, as well as the frontline staff working in airports, who no longer have to ask all travelers to remove masks." 

Facial recognition is currently used in a select number of US airports as part of a program called Simplified Arrival, which is deployed by Customs and Border Protection (CBP). Under Simplified Arrival, the identity of international travelers entering and exiting the country can be verified at airport inspection points with the snap of a picture, rather than by presenting a travel document.  

A facial recognition algorithm compares the picture against a gallery of images that the traveler has previously provided to the government, such as passport and visa photos, to confirm the individual's identity. Passengers can opt out of the process if they wish, in which case a more traditional document inspection is carried out by CBP officials. 

According to the CBP, more than 55 million travelers have gone through the Simplified Arrival process to date. Since 2018, the system has stopped 300 "imposters" from illegally entering the country with travel documents that were issued to other people. 

Nevertheless, the technology behind Simplified Arrival has been a topic of heated discussion ever since it was first implemented. Much of the debate revolves around the identification rates of the AI system: last year, for example, the US Government Accountability Office (GAO) published a report showing that the CBP had fallen short on performance testing for its facial recognition technologies. 

GAO found that the matching accuracy of the system wasn't strong enough: 0.0092% of travelers were at risk of being matched to a gallery photo of a different person. Although the number seems small, it could translate into tens of thousands of individuals at a country-wide scale.  
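As a rough sanity check on that scale, the arithmetic can be sketched in a few lines of Python. The 0.0092% rate and the 55 million traveler count both come from this article; `expected_false_matches` is an illustrative helper, not a CBP or GAO tool, and the point is simply that a tiny per-traveler rate grows proportionally with volume:

```python
# Back-of-the-envelope: how a tiny false-match rate scales with volume.
def expected_false_matches(rate_percent: float, travelers: int) -> float:
    """Expected number of travelers matched to someone else's photo."""
    return travelers * rate_percent / 100.0

# Applying the GAO's 0.0092% figure to the 55 million travelers the CBP
# says have used Simplified Arrival to date:
print(round(expected_false_matches(0.0092, 55_000_000)))  # → 5060
```

At total annual border-crossing volumes several times larger than the Simplified Arrival count, the same rate would indeed reach into the tens of thousands.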

What's more, GAO noted that the CBP had no way of testing the algorithm to find out whether matching accuracy was influenced by factors such as race or ethnicity, meaning that the technology might be perpetuating discriminatory practices against minority groups during border checks. 

The American Civil Liberties Union (ACLU) has long condemned the CBP's use of faulty facial recognition systems. Writing in a blog post last year, the organization's senior staff attorney Ashley Gorski said: "Over the last couple years, it's become increasingly clear that facial recognition technology doesn't work well, and would be a civil liberties and privacy nightmare even if it did." 

The DHS, on the other hand, said that previous rallies had shown that biometric systems can "excel" at rapidly processing high volumes of travelers using face recognition, and that the next step will be to adapt the technology to mask wearers. The final test results from this year's event are expected to come in the next few weeks. 
