Will face recognition be the Patriot missile of the domestic fight against terrorism--a technology that initially draws raves but ultimately doesn't work as well as billed?
Facial recognition involves using computers to scan a picture--like those from security cameras at an airport--and then searching through a database of other pictures for matches. A number of the companies selling face-recognition products have made elaborate promises about their technology since the Sept. 11 hijackings--claims that in some cases have contributed to a sharp run-up in their stocks.
"If our technology had been deployed, the likelihood is [the terrorists] would have been recognized," claims Tom Colatosti, chief executive of Viisage Technology Inc. of Littleton, Mass.
But academics and other experts in the facial-recognition field have a sharply less-optimistic assessment. In a typical airport situation, "I don't think the systems will do the job reliably," said Takeo Kanade, a professor at Carnegie-Mellon University who has been studying computer facial recognition since the 1970s.
Dr. Kanade added, though, that in coming years, the technology may improve, and substantial federal money has been committed to make that happen.
The process by which human beings recognize faces is poorly understood--as is true of many aspects of human intelligence. Computers are usually "taught" to recognize faces by having them process a picture of a face and then make scores of measurements of it, such as the distance between the eyes or the angle of the chin. While the resulting collection of data isn't unique to each person, the way DNA is, it can still be useful in potentially matching a person to a picture that exists in a database.
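The matching process described above can be sketched in a few lines of code. This is a purely illustrative toy, not any vendor's algorithm: the measurement names, values, and threshold are invented, and real systems use scores of measurements rather than three.

```python
import math

# Hypothetical database: each face reduced to a short vector of
# geometric measurements (e.g., eye spacing, chin angle), normalized
# to [0, 1]. Names and numbers are invented for illustration.
DATABASE = {
    "suspect_a": [0.42, 0.31, 0.77],
    "suspect_b": [0.55, 0.62, 0.20],
}

def distance(a, b):
    """Euclidean distance between two measurement vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_match(probe, database, threshold=0.15):
    """Return the closest database entry, or None if nothing falls
    within the threshold. Unlike DNA, these measurements are not
    unique to a person, so a 'match' is only a candidate, not proof."""
    name, dist = min(((n, distance(probe, v)) for n, v in database.items()),
                     key=lambda item: item[1])
    return name if dist <= threshold else None

print(best_match([0.43, 0.30, 0.75], DATABASE))  # close to suspect_a
print(best_match([0.10, 0.90, 0.50], DATABASE))  # near no one in the database
```

The threshold is the crux: set it loose and the system flags innocent people (false positives); set it tight and small changes in lighting or pose push a real match outside the cutoff.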
The problem with the software, experts say, is that it works best under tightly controlled situations, such as when one is taking a picture for a driver's license, where the lighting is constant and the person is looking right into the camera. The efficacy of the software falls off sharply in real-world conditions like an airport, even in predictable situations like people walking through a security checkpoint, because of the many variables that get introduced.
Jim Wayman, director of the National Biometric Test Center at San Jose State University in California, said that a Defense Department-funded test of the commercially available facial-recognition systems last year found that the very best of them failed a third of the time. There is also, he said, a major problem with "false positives," in which the system reports a match when none in fact is there.
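Why false positives matter so much at an airport comes down to base rates: almost everyone screened is innocent, so even a tiny per-person error rate produces a flood of false alarms. A back-of-the-envelope sketch, using the test's roughly one-in-three miss rate and otherwise assumed numbers (the 100,000 daily travelers and 0.1% false-positive rate are illustrative, not from the article):

```python
def screening_outcomes(travelers, suspects, miss_rate, false_positive_rate):
    """Expected true hits and false alarms for a day of screening."""
    true_hits = suspects * (1 - miss_rate)
    false_alarms = (travelers - suspects) * false_positive_rate
    return true_hits, false_alarms

hits, alarms = screening_outcomes(
    travelers=100_000,          # daily passengers at a big airport (assumed)
    suspects=1,                 # people actually in the watch database
    miss_rate=1 / 3,            # roughly the best system's miss rate in the test
    false_positive_rate=0.001,  # illustrative 0.1%
)
print(f"expected true hits: {hits:.2f}")     # well under 1
print(f"expected false alarms: {alarms:.0f}")  # about 100 per day
```

Under these assumptions, guards would chase on the order of a hundred false alarms for a less-than-even chance of catching the one person actually on the list.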
Wayman also said that a number of government agencies that have tried using facial-recognition products have given up after finding out they didn't work as billed. Among them, he said, was the Immigration and Naturalization Service, which tried a system to identify people in cars at the Mexico border.
An INS spokesman confirmed that the service had tried a facial-recognition system at the border but stopped working with it, saying the budget for the umbrella project that covered the facial-recognition effort was canceled.
"The technology would work if everything was just right, but that's never the case," the spokesman said. "It had some potential, but we weren't able to pursue it."
There was a well-publicized test of facial-recognition software earlier this year at the Super Bowl, in which technology from Viisage was used to monitor people walking through the turnstiles. Scores of news reports noted that the system generated 19 "hits" against a database of pictures of known criminals.
But unpublicized was that some of those hits were "false positives." When the Super Bowl system generated a "hit," it showed side-by-side pictures of the person at the turnstile and the person in the database. "I looked at some of those side-by-side pictures, and they weren't of the same person," said Bill Todd, a detective with the Tampa Police Department in Florida who worked on the test. Det. Todd said he couldn't estimate how many of the 19 hits were mistaken.
Facial-recognition systems are used in several real-world situations, but again in roles very different from the one they would be called on to play at airports. Driver's-license bureaus and welfare offices sometimes use them to check for various sorts of fraud, but those agencies have consistent images to work with: full-frontal head shots taken under controlled lighting.
Las Vegas casinos also use them to ferret out card counters and others. But in those situations, the cameras are being manually controlled by operators to make sure they capture high-quality pictures. The systems don't produce a sure-fire "hit," but instead, a listing of numerous possible matches that a human operator then looks over.
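The casino setup, where the software narrows the field and a person makes the call, amounts to a ranked candidate list rather than a yes/no match. A minimal sketch, again with invented names and two-element measurement vectors:

```python
import math

# Hypothetical gallery of previously photographed patrons.
GALLERY = {
    "patron_1": [0.40, 0.30],
    "patron_2": [0.45, 0.35],
    "patron_3": [0.90, 0.10],
}

def candidates(probe, gallery, k=2):
    """Rank gallery entries by distance to the probe image's
    measurements and return the k closest for a human operator
    to review -- the system itself renders no verdict."""
    ranked = sorted(gallery, key=lambda n: math.dist(probe, gallery[n]))
    return ranked[:k]

print(candidates([0.41, 0.31], GALLERY))  # two nearest patrons, closest first
```

Keeping a human in the loop this way converts false positives from automatic accusations into a short list someone eyeballs, which is workable at a casino pit but far harder at the scale of an airport checkpoint.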
Anil Jain, a professor who researches computer vision at Michigan State University in East Lansing, said that even in the best situations, there are problems with the software. The systems can be relatively easy to fool; changing a hair or beard style or wearing glasses can sometimes trip up the software. Dr. Jain said the systems also have trouble with the effects of aging; if more than a year or two has passed since the original database picture was taken, the software sometimes won't recognize the person.
Executives of facial-recognition technology companies concede that their systems are new and that they don't yet have data proving the systems work at acceptable rates in rough-and-tumble real-world settings like airports. Joseph Atick, CEO of Visionics Corp. of Jersey City, N.J., conceded that most of his company's relevant statistics came from the lab.
Executives at facial-recognition vendors, such as Colatosti of Viisage, say their software can play a role as one component in a larger set of terrorist-detection technologies. "Every biometric technique has its pluses and minuses," he said.