The US Army has announced the development of software designed to prevent the compromise of facial recognition technology in military applications.
A team from Duke University, led by electrical and computer engineering faculty members Dr. Helen Li and Dr. Yiran Chen, has created a system which, it is hoped, will mitigate cyberattacks against the military's facial recognition applications.
Facial and object recognition technologies are used by the US Army to train artificial intelligence (AI) systems used in unmanned aerial vehicles (UAVs), surveillance systems, and more.
Backdoors into facial recognition platforms, specifically, are a real worry, as their compromise could set off a chain reaction in which AI learning could be corrupted.
AI models rely on large datasets, and if this information is based on facial recognition, compromising particular features in images at the source -- such as clothing, ears, or eye color -- could throw off entire AI models and prompt incorrect labeling.
This type of hacking could have serious consequences for surveillance programs, the researchers said, as such an attack could result in a targeted person being misidentified and thus escaping detection.
"Triggers" for an attack are difficult to find, the military says, as they may look innocent to the naked eye, and neural networks behave normally when processing clean images. However, malicious code or visual cues embedded in images could wreak havoc when fed into AI setups.
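To illustrate the kind of attack described above, the sketch below shows a minimal, hypothetical data-poisoning step in the style of published "backdoor" research: a small, visually innocuous patch (the trigger) is stamped into a training image and its label is silently flipped, so a model trained on the data learns to associate the trigger with the attacker's chosen class. The function name, patch placement, and toy dataset are illustrative assumptions, not the Army's or Duke's actual method.

```python
import numpy as np

def poison_image(image, target_label, trigger_size=4):
    """Illustrative backdoor poisoning: stamp a small white square (the
    'trigger') into the corner of an image and return it with a flipped
    label. Real triggers can be far subtler than this."""
    poisoned = image.copy()
    # The trigger: a tiny patch a human reviewer is likely to overlook
    poisoned[-trigger_size:, -trigger_size:] = 255
    return poisoned, target_label

# Toy 'dataset': five 8x8 grayscale images with one label each
rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(5, 8, 8), dtype=np.uint8)
labels = [0, 1, 2, 3, 4]

# Poison one sample so a trained model would learn:
# trigger present -> predict attacker-chosen class 9
bad_img, bad_label = poison_image(images[0], target_label=9)
print(bad_label)              # 9 -- label silently flipped
print(int(bad_img[-1, -1]))   # 255 -- trigger pixel embedded
```

Defenses like the one reported here face the inverse problem: given only the dataset, they must work out which images carry such a trigger, where in the image it sits, and what it looks like.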
The US Army launched a competition in which rival research teams were given a dataset of images of 1,284 people. In total, 10 images contained a backdoor trigger that each team had to identify.
Three factors had to be considered: which images the trigger was injected into, where, and what it looked like. Duke University's tool was able to scan images within the dataset to peel away different layers of the image, searching for indicators of tampering.
"This work will lay the foundations for recognizing and mitigating backdoor attacks in which the data used to train the object recognition system is subtly altered to give incorrect answers," says MaryAnne Fields, program manager for intelligent systems at the Army Research Office. "Safeguarding object recognition systems will ensure that future soldiers will have confidence in the intelligent systems they use."
Developing the software took nine months and was funded by a $60,000 grant from ARO, part of the US Army Combat Capabilities Development Command's (CCDC) Army Research Laboratory.
Only last week, the widespread use of facial recognition technologies in the UK came under fire after the Metropolitan Police Service (MPS) announced the deployment of cameras equipped with facial recognition technologies across London.
The Met says that the cameras will be placed in "intelligence-led" areas in order to scan individuals passing through for the purpose of tracking down wanted suspects.
In response, the UK's Information Commissioner's Office released a statement acknowledging the plans, adding that the MPS "is taking steps to reduce intrusion and comply with the requirements of data protection legislation."
"The MPS has committed to us that it will review each deployment, and the ICO will continue to observe and monitor the arrangements for, and effectiveness of, its use," the watchdog says.
This month, the European Union (EU) sparked a debate on whether or not facial recognition technologies should be permitted in public spaces.
The concern is a simple one: lawmakers have not been given enough time to craft legislation governing facial recognition applications and their impact on citizens' basic privacy rights. To give them breathing space, facial recognition may face a temporary -- but expansive -- ban in areas such as parks, tourist attractions, and city streets.
Previous and related coverage
- Facial recognition could be most invasive policing technology ever, warns watchdog
- EU considers banning facial recognition technology in public spaces
- UK watchdog to investigate King's Cross facial recognition tech used to spy on public
Have a tip? Get in touch securely via WhatsApp | Signal at +447713 025 499, or over at Keybase: charlie0