
DFAT backs automation of awful manual process for facial verification

A computer could not do a worse job than the process the Australian Department of Foreign Affairs and Trade already has in place, its submission argues.
Written by Chris Duckett, Contributor

The process of biometric verification of passport information is currently completed manually within the Australian Department of Foreign Affairs and Trade (DFAT), and it has a number of significant drawbacks.

Writing in a submission [PDF] made public on Friday, the department said it presently responds manually to email requests from other departments to verify information. The emails contain a request form, and as a check, the department said it only accepts requests from government email domains.

If the requester asserts they have authority to send the request, and the desk-level DFAT staff member has no reason to believe that assertion is incorrect, the request proceeds.

"The department does not have the specialist law enforcement expertise needed to assess the merits of the requests it receives, and does not seek information on this from other agencies," it said. "As such, its decisions about whether to disclose personal information to these agencies are, in a sense, mechanistic, based on whether requests satisfy simple business rules.

"If agencies satisfy those conditions, the department will in practice always approve their requests."

DFAT revealed it does not have a system to track the processing of requests, and therefore has no logs available to audit, nor statistics to produce.

The submission was made to the Parliamentary Joint Committee on Intelligence and Security review of a pair of Bills, introduced in February, that would create an identity-matching system between government agencies in Australia. DFAT said that once the system is active, it expects to see thousands of requests each day, a volume the present process could not cope with.

Under the Bills, the Peter Dutton-led Home Affairs department has been tasked with operating a central hub for communication between agencies.

"The service is workable only because of the low volumes," DFAT said. "In the absence of reliable statistics, the department estimates that it processes up to a few hundred biometric requests to disclose information every year."

However, DFAT said that once the system was enabled, it might extend its use beyond the automation of low-risk passport decisions.

"The department might add functionality to its internal passport processing software so that it could use the FVS [face verification service] to initiate routine biometric verification of driver licence images in respect of all passport applicants."

DFAT asserted that an automated process would be better than the human-based one it currently runs, and that the system would not be able to make an adverse decision on an application.

"Given the way that the FVS, the FIS [face identification service], and the department's passport processing systems are designed, the department is in practice only able to automate decisions that produce favourable or neutral outcomes for the subject," it said.

"Such decisions would not negatively affect a person's legal rights or obligations, and would thus not generate a reason to seek review."

The department argued that software is less biased than humans, and makes decisions more consistently.

"Any such bias is more easily analysed and corrected than the thought processes of individuals," DFAT said. "A computer program will also make decisions more consistently, especially repetitive decisions. And a computer program is far more likely than a human to identify consistently and correctly any discrepancies that require more detailed consideration.

"That is, a computer is less likely to 'miss something'."

On the issue of privacy, DFAT deferred to the stance of Home Affairs that built-in privacy safeguards are sufficient.

In a submission earlier this month, Home Affairs labelled the suggestion of needing a warrant to access the facial recognition database as a "resource-intensive" process that could cause significant delays to matters of national security and potentially undermine law-enforcement investigations.

"The time involved in preparing, reviewing, and granting a warrant application to use services would significantly delay, and in some circumstances undermine, law-enforcement and national security investigations; impede operational activity, including the prevention of criminal acts; and divert resources from investigations," Home Affairs wrote.

Related Coverage

Facial recognition in Australian airports 'very close': Dutton

Australia's new Minister for Home Affairs Peter Dutton has said a facial recognition trial at Canberra Airport has so far resulted in a 90 percent strike rate.

Dutton says facial recognition in lieu of passports 'very close' to reality

The country's newly crowned Minister for Home Affairs Peter Dutton has said facial recognition at airports in Australia is merely a few 'technology generations' away from being rolled out.

Ashes spectators in Sydney scanned by facial recognition tech

A slew of new security measures have been rolled out at the Sydney Cricket Ground, including 820 cameras equipped with facial recognition technology.

Brazil's Nubank introduces facial biometrics to increase security

The fintech is using the technology to prevent identity fraud in credit card requests.

Machine learning: A cheat sheet (TechRepublic)

From Apple to Google to Toyota, companies across the world are pouring resources into developing AI systems with machine learning. This comprehensive guide explains what the concept really means.
