Can people be defined by a high-tech algorithm?

If you've seen the movie Minority Report, you'll probably recognize the term "pre-crime".

Set in the future, the storyline revolves around a revolutionary technology, dubbed "pre-crime", that's able to predict and therefore prevent crimes before they even occur. "Pre-criminals" are promptly arrested and thrown into a holding facility so they're no longer able to carry out their ill intent.

And now, it seems, the U.S. government is testing out its own pre-crime technology.

An internal Department of Homeland Security document recently revealed that the U.S. government is building a prototype screening system that uses various signals such as ethnicity, gender and heart rate to "detect cues indicative of mal-intent". Dubbed the Future Attribute Screening Technology (FAST), it's designed to track and monitor body movements, voice pitch changes, fluctuations in speech rhythm and intonation, eye movements, body heat changes and breathing patterns, among other data. Factors like occupation and age are also considered, as are eye blink rate and pupil variation.
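To see just how crude fusing such signals into a single verdict can be, consider this purely hypothetical sketch. DHS has not published FAST's actual model; every baseline, weight and threshold below is invented for illustration, and the point is precisely how easily a naive deviation-from-baseline score misfires.

```python
# Hypothetical sketch only -- FAST's real algorithm is not public.
# A naive "risk score" sums how far each physiological signal deviates
# from an assumed baseline. All numbers here are made up.

BASELINES = {"heart_rate": 70.0, "blink_rate": 17.0, "speech_jitter": 0.05}
WEIGHTS = {"heart_rate": 0.02, "blink_rate": 0.05, "speech_jitter": 10.0}
THRESHOLD = 1.0  # flag anyone whose weighted deviation exceeds this

def mal_intent_score(readings):
    """Weighted sum of absolute deviations from baseline; higher = 'riskier'."""
    return sum(WEIGHTS[k] * abs(readings[k] - BASELINES[k]) for k in BASELINES)

def flag(readings):
    """True if the traveller would be pulled aside by this toy system."""
    return mal_intent_score(readings) > THRESHOLD

# A calm traveller versus a nervous (but entirely innocent) one:
calm = {"heart_rate": 72, "blink_rate": 18, "speech_jitter": 0.05}
nervous = {"heart_rate": 110, "blink_rate": 35, "speech_jitter": 0.12}

print(flag(calm))     # not flagged
print(flag(nervous))  # flagged, despite no ill intent
```

The nervous traveller trips the threshold on anxiety alone, which is exactly the over-simplification this column objects to.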

The U.S. government has suggested that the system may be used at airport checkpoints--the Department of Homeland Security oversees the U.S. Transportation Security Administration--and is already conducting field tests at one undisclosed location, which isn't an airport. FAST could potentially also be deployed at other security checkpoints, including border crossings and large public gatherings such as trade conventions and sporting events.

After reading the report, my thoughts went immediately to a handful of friends who, I was sure, would never make it past a FAST checkpoint. These friends of mine, whether from a lack of confidence or because they're painfully shy by nature, often have shifty eyes because they can never maintain proper eye contact. Put them in an environment that is highly tense and brimming with unfriendly government and law enforcement officers--like any U.S. airport--and they'll likely have heart palpitations, sweaty palms and eyes that look like they're about to fall right out of their sockets.

In other words, FAST would eat them alive.

On the flipside, there are also con artists who are so good at their job that they can fool any high-tech, high-speed data-churning machine.

Take the polygraph, for instance, more commonly referred to as the lie detector. It measures a person's physiological responses, including blood pressure, pulse and respiration, while he's subjected to a Q&A session. The polygraph is built on the assumption that when a person is telling a fib, his physiological indicators will show marked differences from when he's telling the truth.

Some agencies, including the FBI and CIA in the U.S., use it as an interrogation tool to suss out criminal suspects and screen new employees. You've probably seen it used often enough in U.S. TV shows.

Polygraphy, however, has gained little credibility among scientists, who question the accuracy of its measurements and point to the lack of evidence that it is reliable. And unlike fingerprint and DNA tests, which are grounded in hard fact, polygraph examiners can only offer an opinion on whether the subject was likely telling the truth.

In a 2003 report, the U.S. National Academy of Sciences concluded that the accuracy of polygraph tests was insufficient to justify reliance on their use in employee security screening in federal agencies. "The evidence [from reviewing polygraph screening] is scanty and scientifically weak. Our conclusions are necessarily based on the far from satisfactory body of evidence on polygraph accuracy, as well as basic knowledge about the scientific knowledge on the validity of polygraph and other techniques of detecting deception...and about the future of detection and deterrence of deception."

Ginger McCall, open government counsel for Washington-based nonprofit organization EPIC, called for the U.S. government to conduct a privacy impact assessment for the pre-crime initiative. McCall, who had pursued the U.S. government for the internal documents about FAST, added that depending on vague biological factors to predict mal-intent was worrisome. "Especially if they're going to be rolling this out at the airport," she said. "I don't know about you, but going to an airport gives me a minor panic attack wondering if I'm going to get groped by a TSA officer."

More importantly, suggesting that humans are such simple organisms that we can be defined by a high-tech algorithm is, quite simply put, insulting.

If FAST is allowed to see daylight and is indeed implemented at U.S. checkpoints, it will be amusing to watch airport officials scurrying in multiple directions each time the alarm goes off.

Minority Report was a cool sci-fi film that depicted what a high-tech system could do for a country's crime rate. But it should stay just that: fiction.

Topics: IT Employment, CXO, Legal, Privacy

About

Eileen Yu began covering the IT industry when Asynchronous Transfer Mode was still hip and e-commerce was the new buzzword. Currently based in Singapore, she has over 16 years of industry experience with various publications including ZDNet, IDG, and Singapore Press Holdings.

