Vulnerabilities in biometric systems are too dangerous for the general public to go looking for, according to Biometrics Institute technical committee chair Ted Dunstone — and they should not be encouraged to try.
(Iris scan image by The U.S. Army, CC BY 2.0)
Speaking to ZDNet Australia and presenting at the Biometrics Institute Conference and Exhibition in Sydney today, Dunstone said that because a large part of any biometric system involves hardware, such systems are more susceptible to the zero-day effects of vulnerabilities or "spoofs", where vendors are unable to immediately provide a mitigating patch or workaround for an issue.
"Where there is a spoof that is found, sometimes it's not easy to patch that spoof. If you discover a spoof in a software system, then the next day Microsoft or Google can put out a software patch that addresses that particular area," Dunstone explained.
"But it's more difficult in the biometrics area, because a good deal of it is hardware."
On the software side, companies such as Google have invited the general public to point out flaws in their systems, even providing incentives to do so. Crucially, a common requirement across these reporting schemes is that bug reporters not publicly disclose a vulnerability until a patch has been developed.
However, due to the time it takes for hardware to be updated, Dunstone said that encouraging the general public to break biometric systems would be counter-productive to the overall security of the systems.
"It's very important not to set up an incentive to get people to break these systems. You need to find a way that encourages people that have broken systems to provide that information, but it's a dangerous path to go down [to provide incentives]."
Dunstone said that it would also be difficult for the average person to discover new vulnerabilities, given that simple flaws found in the early days of the biometrics industry, such as photocopying ID cards, would have already been well known and protected against.
"In order to break the systems, you can't just be hacking around the edges; it requires a relative amount of sophistication and set-up to be able to do that."
Additionally, he doesn't think that there would be many members of the general public who would go to the trouble of getting to that level of sophistication, given that biometric systems tend to form only part of an overall security measure, if they are used at all.
"The actual threat from biometric vulnerabilities at the moment is quite low. Most of the biometric systems are not heavily used enough to make it worth somebody's while attacking them, and those that are heavily used, like passport systems, have a whole lot of ancillary controls. Whilst you may have the biometric [system] as an important process of the identity process, there's actually a whole string of things that goes into crossing a border."
Despite this, Dunstone acknowledged that there would be well-intended and conscientious individuals from the general public who would report vulnerabilities in biometric systems. These people need to be protected against unfair action, such as what happened in the First State Super case, where security researcher Patrick Webster was reported to the police by the very organisation he was trying to assist.
"It is important to make sure that people that bring vulnerabilities to light are not unfairly prosecuted; there needs to be mechanisms whereby people can provide that information in a secure environment where they don't feel that they need to go public with it."
While this is not an immediate issue that the Biometrics Institute is working on, Dunstone said it would be something for the organisation to look into in the future.
Although Dunstone thinks it is too dangerous to encourage the general public to hunt for bugs in biometric systems, he stressed that just because people aren't looking for bugs, those closest to and most responsible for biometric systems — industry and government — shouldn't get a free pass to practise "security through obscurity".
"[It's] about making sure that people don't just think, 'well, nobody knows how my system works, so consequently I'm safe'. I want people to be looking at their systems. I want people to engage with the right people to do penetration testing and other things — but the right people."
Dunstone said that the responsibility for finding vulnerabilities lies with these groups, and that greater collaboration is required. Such initiatives include developing a standard methodology for discovering vulnerabilities in new and existing systems, and establishing how such information is reported and shared among relevant organisations.
"At the current point in time, there are a lot of people that don't spend enough time thinking about these issues in both industry and government. It's important to raise the bar."