'Pre-crime' human profiling system inaccurate, infringes personal privacy

Summary: The U.S. is testing algorithms that profile visitors through background and biological factors, but psychiatrists and industry observers warn of inaccuracy and invasion of personal privacy if such a system is implemented in Asia.

The U.S. government's experimental system, which uses algorithms to profile people and predict whether they are likely to commit a crime, cannot accurately determine future criminal behavior, psychiatrists warn, noting that it also intrudes on personal privacy and will be difficult to implement in Asia-Pacific.

Reports last month revealed that the U.S. Department of Homeland Security was testing a prototype screening system that used algorithms to profile people and detect mal-intent through factors such as ethnicity and heart rate.

Termed Future Attribute Screening Technology (FAST), the system is designed to track and monitor various inputs such as body movements, voice pitch changes, body heat changes and breathing patterns. Factors such as occupation, age, blink rate and pupil variation will also be considered.
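The actual FAST algorithm has not been disclosed, but the kind of multi-signal screening the article describes can be sketched as a toy weighted-sum scorer. Every feature name, weight and threshold below is invented for illustration; none comes from DHS.

```python
# Purely illustrative: a toy screener over the kinds of signals the
# article lists. All names, weights and thresholds are hypothetical.

SIGNALS = {                    # hypothetical normalized readings, 0.0-1.0
    "heart_rate": 0.6,
    "voice_pitch_change": 0.4,
    "body_heat_change": 0.3,
    "breathing_irregularity": 0.5,
    "blink_rate": 0.7,
}

WEIGHTS = {                    # hypothetical weights; not from DHS
    "heart_rate": 0.3,
    "voice_pitch_change": 0.2,
    "body_heat_change": 0.1,
    "breathing_irregularity": 0.2,
    "blink_rate": 0.2,
}

def mal_intent_score(signals: dict, weights: dict) -> float:
    """Weighted sum of physiological readings."""
    return sum(weights[k] * v for k, v in signals.items())

score = mal_intent_score(SIGNALS, WEIGHTS)
flagged = score > 0.5          # hypothetical alert threshold
```

A design like this makes the critics' point concrete: the score moves with anything that elevates heart rate or blink rate, whether the cause is mal-intent, nerves, or a delayed flight.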

The Department of Homeland Security also suggested that FAST could be used at security checkpoints such as border crossings or large public events, and said it had been tested at an undisclosed location in the northeastern U.S. The agency said such pilots were "entirely voluntary" and that no personally identifiable information from participants would be stored once the tests were completed.

Tommy Tan, a Singapore-based consultant psychiatrist at Tommy Tan Psychiatrist Clinic, noted that the FAST system would be "absolutely inaccurate" because the factors determining the various parameters in the profiling were simply too numerous and impossible to account for.

He told ZDNet Asia that there is also no correlation between physiological parameters and criminality. Psychopaths, for example, exhibit "very normal parameters" and have passed polygraphs with "flying colors" on many occasions, Tan explained.

Elaborating, Peter Eckersley, technology projects director at the Electronic Frontier Foundation (EFF), said measurements of biological variables such as heart rate, galvanic skin response and eye movements would only be able to detect whether people were stressed, angry or nervous.

Studying the relationship between these variables and crime was interesting from a purely academic perspective, but it would be "deeply alarming" if it were applied as a guide for Homeland Security's operational activities, Eckersley said in an e-mail.

According to Nelson Lee, medical director and psychiatrist of The Psychological Wellness Centre, the FAST system is still in a very early stage of development and, hence, it would be too early to tell if it can successfully pick out potential criminal behavior.

However, he told ZDNet Asia that the vast number of variables that may have to be observed and used for mal-intent prediction may render this a "very challenging and difficult task", if it were to be used with any reliability.

"The problem with trying to predict behavior, even in good hands, would be either too low a sensitivity or too low a specificity," Lee said. "Both make such tools an [unreliable] predictor of human behavior."
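Lee's sensitivity-and-specificity point is sharpest when the behavior being predicted is rare. A quick Bayes' rule calculation, using hypothetical figures of 90% sensitivity, 90% specificity and a one-in-10,000 base rate of actual mal-intent among screened travelers, shows why:

```python
# Why a rare-event screener drowns in false alarms: even at 90%
# sensitivity and 90% specificity (hypothetical figures), almost
# everyone flagged is innocent when true positives are rare.

def positive_predictive_value(sensitivity: float,
                              specificity: float,
                              base_rate: float) -> float:
    """P(actual mal-intent | screener flags) via Bayes' rule."""
    true_pos = sensitivity * base_rate
    false_pos = (1 - specificity) * (1 - base_rate)
    return true_pos / (true_pos + false_pos)

# Assume 1 in 10,000 people screened actually harbors mal-intent.
ppv = positive_predictive_value(0.90, 0.90, 1 / 10_000)
# ppv comes out below 0.001: fewer than 1 in 1,000 flagged
# travelers would be a true positive under these assumptions.
```

Under these assumed numbers, more than 999 of every 1,000 alerts would be false accusations, which is the "unacceptable level of false accusation" Davies describes below.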

Privacy of individuals breached
In addition, FAST swerves "dangerously close" to eugenics, or social movements that advocate practices aimed at improving the genetic composition of a population, Simon Davies, director-general of Privacy International, noted in an e-mail interview. He added that civilized countries with a respect for human rights would generally refuse to implement such programs.

Most of the work conducted in pre-crime analysis "raises more questions than answers", Davies remarked.

"Profiling individuals, especially on a voluntary basis, has thrown up so many false positives that the research becomes useless," he said. "Even when some partially accurate indicators have been found, there has been an unacceptable level of false accusation and discrimination."

Apart from privacy issues, Lee added that there will also be concerns about prejudice being formed with regard to individuals or certain groups of people as a result of such profiling.

EFF's Eckersley added that there will be "many intrusions" into people's lives and only on rare occasions will these measurements be useful.

He noted that people will be subjected to monitoring devices that detect emotions of anger or nervousness, and police and security forces worldwide will be "scrambling a team" every time something happens that triggers anger or nervousness among visitors.

Hard to implement in APAC
The sheer number of differences in ethnicity and cultural nuances will also make profiling visitors by algorithms a challenging project to pull off in the Asia-Pacific region, noted Lee of Psychological Wellness Centre.

He explained that, culturally, even people of the same ethnicity who have "naturalized" to a new environment may still be very different from people native to that place. This can make algorithmic profiling even less likely to hold any true predictive value, Lee said.

Still, Davies of Privacy International warned there was a possibility Asian countries might be "seduced by false claims" from U.S. researchers into implementing such systems.

"[These countries] must understand that pre-crime profiling carries the risk of creating social division and a breakdown of trust in the government," he said.

Asked if the Singapore government would consider implementing programs similar to FAST, an Immigration and Checkpoints Authority (ICA) spokesperson said only: "Law enforcement agencies calibrate our security measures based on intelligence and observations from frontline officers, based on various behavior and risk indicators."

Thomas See, a Singaporean who frequently travels for business, told ZDNet Asia: "If this is implemented in my country or any country I visit, clearing customs is going to take a long time and it is going to cause me inconvenience. I can already foresee customs officers catching people who get nervous or upset easily. I myself might even be stopped for no reason."

Jacintha Lee, a Singapore-based student, said: "This is absolutely ridiculous. How can people judge us based on my biological makeup and my background? Besides, there have been so many criminals out there who came from clean backgrounds and seemed like normal people."

Ellyne Phneah

