Amazon has reportedly extended a ban on US law enforcement using Rekognition until further notice.
On Tuesday, Amazon said that the one-year ban on US police use of the facial recognition technology would remain in place, as reported by The Washington Post.
The previous one-year moratorium, announced in June 2020, was designed to give Congress time to debate and pass "appropriate rules" for the ethical use of facial recognition technology by law enforcement agencies.
At the time, Amazon said:
"We've advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge."
However, despite a handful of federal-level proposals being put on the table, none have been passed.
Amazon's moratorium will now remain in place "indefinitely," until lawmakers address the issues raised by the use of Rekognition to identify potential suspects in criminal cases.
Rekognition is image and video analysis software that leverages deep learning. Amazon describes the facial recognition aspect of the software as "highly accurate facial analysis and facial search capabilities that you can use to detect, analyze, and compare faces for a wide variety of user verification, people counting, and public safety use cases."
For example, law enforcement departments could submit an image of a suspect and search for a match with databases containing mugshots or other identification records.
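A search of that kind maps onto Rekognition's SearchFacesByImage operation, which compares the largest face in a submitted image against a pre-indexed face collection. The sketch below, using the boto3 SDK, illustrates the shape of such a call; the collection ID, image path, and threshold values are illustrative assumptions, not details from any actual deployment.

```python
def build_search_request(image_bytes: bytes, collection_id: str,
                         threshold: float = 90.0, max_faces: int = 5) -> dict:
    """Assemble the parameters for a Rekognition SearchFacesByImage call.

    FaceMatchThreshold is the minimum similarity (0-100) for a match to be
    returned; MaxFaces caps how many candidate matches come back.
    """
    return {
        "CollectionId": collection_id,
        "Image": {"Bytes": image_bytes},
        "FaceMatchThreshold": threshold,
        "MaxFaces": max_faces,
    }


def search_collection(image_path: str, collection_id: str) -> list[tuple[str, float]]:
    """Search a face collection for matches to the face in a local image.

    Requires boto3 and configured AWS credentials; the collection must
    already have been populated with IndexFaces.
    """
    import boto3  # imported here so the pure helper above has no AWS dependency

    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        params = build_search_request(f.read(), collection_id)
    response = client.search_faces_by_image(**params)
    # Each match carries a similarity score and the ID of the indexed face.
    return [(m["Face"]["FaceId"], m["Similarity"])
            for m in response["FaceMatches"]]
```

In this model, the accuracy concerns discussed below hinge largely on where the similarity threshold is set: a low `FaceMatchThreshold` returns more candidate matches, including weaker ones.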
Previously, access to Rekognition was sold to law enforcement agencies. However, facial recognition technologies have raised concerns over privacy, ethical use, accuracy, and racial discrimination, as well as the potential for the technology to contribute to false convictions and miscarriages of justice.
In 2018, the American Civil Liberties Union (ACLU) published a report revealing that Rekognition incorrectly matched 28 members of Congress with individuals who had previously been arrested. Amazon disputed the report.
There is also concern that facial recognition technology could be inherently racially biased. Following on from the ACLU's research, studies conducted by organizations including The University of Texas at Dallas, MIT, and Harvard have also questioned the accuracy of the algorithms facial recognition software uses to identify some groups -- including people of color, women, and particular age brackets -- and these misclassifications could have real-world ramifications in criminal cases.
Independently, a number of US cities and states -- including San Diego and San Francisco -- have implemented their own rules to curtail the use of facial recognition by the police.
Debates are underway in approximately 20 states, and in recent weeks, Virginia imposed the toughest laws against its use to date -- law enforcement agencies are now required to obtain permission from the state legislature before purchasing or using facial recognition technologies.
Amazon is not the only provider of such solutions that has tried to distance itself from law enforcement clientele. IBM exited the business over worries that its technology could be abused, and Microsoft says it will not sell facial recognition technology to police departments until appropriate federal laws have been passed.
Update 21.28 BST: Added further clarification and Amazon's response to ACLU's research.
Have a tip? Get in touch securely via WhatsApp | Signal at +447713 025 499, or over at Keybase: charlie0