Future computerized lie detectors

Summary: The U.S. Department of Homeland Security is investing $3.5 million to develop future computer-based lie detectors.

According to vnunet.com, the U.S. Department of Homeland Security is investing $3.5 million to develop future computer-based lie detectors. Traditionally, law enforcement personnel have relied on simple fact checking or on polygraphs to detect hostile intent or plain lies. But these approaches are not always effective. Now, computer scientists from Rutgers University are trying a new approach. They want to capture subtle body movements, such as slight changes in facial expression, with video cameras, and leave the task of interpreting human intentions to computers. If these efforts are successful, the immigration officer who has to decide whether you can enter the U.S. might receive some help from a computer -- one that can refuse you entry to the country.

Here are some brief explanations given by Dimitris Metaxas, who is the director of the Center for Computational Biomedicine Imaging and Modeling (CBIM) at Rutgers University.

Even under the most controlled conditions, lie detector tests based on body physiology are roughly 50 per cent reliable. But we believe gestures and expressions are a lot harder for someone to mask, and do not vary significantly among races and cultures.
Micro-expressions may easily escape notice by human observers, but can be reliably picked up on camera and quickly detected by computer, giving interrogators new tools to do their jobs confidently.
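The micro-expression idea quoted above -- a change too brief for a human observer but easy for a computer to flag -- can be illustrated with a toy frame-differencing detector. This is only a sketch of the general principle, not the Rutgers team's method; the threshold and burst-length values are illustrative assumptions.

```python
import numpy as np

def spot_micro_expressions(frames, diff_thresh=10.0, max_len=3):
    """Flag brief bursts of motion in a grayscale frame sequence.

    A micro-expression shows up as a short spike in inter-frame
    difference: a large change lasting only a few frames.
    `diff_thresh` and `max_len` are illustrative values, not taken
    from the Rutgers work.
    """
    frames = np.asarray(frames, dtype=float)
    # Mean absolute pixel difference between consecutive frames.
    diffs = np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))
    active = diffs > diff_thresh

    events, start = [], None
    for i, is_active in enumerate(active):
        if is_active and start is None:
            start = i
        elif not is_active and start is not None:
            if i - start <= max_len:   # keep only brief bursts
                events.append((start, i))
            start = None
    if start is not None and len(active) - start <= max_len:
        events.append((start, len(active)))
    return events
```

A sustained expression produces a long run of high-difference frames and is filtered out by `max_len`; only short spikes -- the putative micro-expressions -- are reported as `(start, end)` index pairs.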

Metaxas and the other researchers have already published several papers on this subject. In this one (PDF format, 10 pages, 441 KB), titled "An Approach for Intent Identification by Building on Deception Detection," they look in particular at how fallible humans are at detecting hostile intent.

Here is a short excerpt about the goal of their research.

In this paper, we present our current research efforts in the direction of developing automated tools to identify intent and deception.
Intent, internal state and behavior
Correctly identifying intent is not as simple as merely searching for deceptive behavioral cues. Having a deceptive goal is only one of many internal states an individual may possess; other internal states that may indicate hostile intent include anger, tenseness, or fear as shown [on the figure above.]

In this other paper (PDF format, 10 pages, 602 KB), titled "Blob Analysis of the Head and Hands: A Method for Deception Detection," the researchers focus on how computers can assist people, who are not always effective at detecting deception.

Here is a short description of their project.

This research effort attempts to leverage automated systems to augment humans in detecting deception by analyzing nonverbal behavior on video. By tracking faces and hands of an individual, it is anticipated that objective behavioral indicators of deception can be isolated, extracted and synthesized to create a more accurate means for detecting human deception.
Samples of faces and hands used for segmentation

As you can see above in these samples, they've designed a feature classifier to perform fine segmentation of individuals, which aims to find reliable hand and face areas.
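To give a flavor of what "blob analysis" of the head and hands involves, here is a minimal sketch: threshold pixels with a simple skin-color rule, then group connected skin pixels into blobs and return their bounding boxes. The color thresholds are a common illustrative heuristic, not the classifier described in the Rutgers paper.

```python
import numpy as np

def find_skin_blobs(image_rgb, min_area=50):
    """Segment candidate face/hand regions by skin color, then group
    the pixels into 4-connected blobs via an iterative flood fill.

    Returns a list of bounding boxes (top, left, bottom, right).
    The RGB rule below is an illustrative assumption, not the
    feature classifier from the paper.
    """
    r = image_rgb[..., 0].astype(int)
    g = image_rgb[..., 1].astype(int)
    b = image_rgb[..., 2].astype(int)
    # Simple skin rule: red-dominant and reasonably bright.
    mask = (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b)

    seen = np.zeros_like(mask, dtype=bool)
    blobs = []
    h, w = mask.shape
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                # Flood fill this blob without recursion.
                stack, pixels = [(y, x)], []
                seen[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                if len(pixels) >= min_area:   # drop speckle noise
                    ys, xs = zip(*pixels)
                    blobs.append((min(ys), min(xs), max(ys), max(xs)))
    return blobs
```

Once face and hand blobs are isolated this way, their positions can be tracked frame to frame, which is the kind of nonverbal-behavior signal the researchers feed to their deception analysis.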

Let's return to vnunet.com for a last comment about this work in progress.

In addition to using cameras to capture images, the team will employ 3D sensor technology to capture the range of body movements. Rutgers is collaborating with Lockheed Martin on 3D sensor development and integration.

For more information, please read the two papers mentioned above. And after reading them, tell me whether you think a computer can be better than a human at detecting your intentions.

Sources: Robert Jaques, vnunet.com, September 1, 2005; and various web sites
