IBM has called for the US Department of Commerce to limit the export of facial recognition systems, particularly to countries that could potentially use them for mass surveillance, racial profiling, or other human rights violations.
In a letter [PDF] to the Commerce Department, IBM highlighted the need for tighter export controls on facial recognition technologies that employ what it referred to as "1-to-many" matching.
The suggested measures include export controls on both the high-resolution cameras used to collect data and the software algorithms used to analyse and match that data against a database of images, as well as restrictions on access to the online image databases that can be used to train 1-to-many facial recognition systems.
"These systems are distinct from '1 to 1' facial matching systems, such as those that might unlock your phone or allow you to board an airplane -- in those cases, facial recognition is verifying that a consenting person is who they say they are," IBM government and regulatory affairs vice president Christopher Padilla explained in a blog post.
"But in a '1-to-many' application, a system can, for example, pick a face out of a crowd by matching one image against a database of many others."
Big Blue also highlighted the need to only train systems on data provided with consent.
"Facial recognition systems do not work without this data. Controlling access to such data from online sources could be an effective way to limit certain human rights abuses. Limiting access to the training data can be an effective method for limiting the ability of a facial recognition system to conduct mass surveillance and do '1-to-many' matching," the company wrote.
IBM also recommended that a multilateral agreement, such as the Wassenaar Arrangement or similar, be introduced to increase global cooperation and prevent "repressive regimes" from accessing these technologies.
The call aligns with Big Blue's decision in June to exit its facial recognition business over fears its technology could be used to promote racial discrimination and injustice.
"IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and Principles of Trust and Transparency," IBM CEO Arvind Krishna wrote in a letter to Congress at the time.
In the letter, Krishna also called on Congress to introduce a national policy to encourage the use of technology, such as body cameras and data analytics, to bring greater transparency and accountability to policing.
A handful of US cities have already banned the use of facial recognition technology, including San Francisco, Oakland, San Diego, and most recently Portland, citing the technology's limitations, the lack of standards around its use, and its potential to entrench bias against minorities.
Earlier this year, the idea of banning the use of facial recognition technology in public areas was also top of mind for the European Union.
The EU, as reported by Reuters, was considering a ban of up to five years on facial recognition in public areas -- potentially including locations such as parks, tourist hotspots, and sports venues -- to give politicians time to thrash out legislation to prevent abuse.