
Could face-crime be cause for arrest?

The NYPD is testing Google Glass for practical purposes, but the technology also makes it possible to recognize faces, gauge emotional states, and even detect the intention to deceive.
Written by David Worthington, Contributor

The largest police force in the United States could soon identify criminals by their faces and streamline reporting by digitally collecting evidence, but the potential for much broader uses and abuses also exists.

Yesterday, VentureBeat’s Richard Byrne Reilly reported that the NYPD is participating in Google’s Glass Explorer program, a public beta for the Android-powered devices. The department has only a few pairs so far, but with its 34,500-strong force it could become a big customer for Google.

Reilly speculates that the NYPD could deploy facial recognition software on Glass. Doing so would likely require a cloud server and a speedy Internet connection, but the U.S. Department of Homeland Security has already built such a system, and Google has opened Glass to third-party software developers. The NYPD could also replace handwritten reports with digital video and voice dictation, which could save time and money.
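
To make that architecture concrete, here is a minimal sketch of how a Glass-style client might hand a captured frame off to a remote recognition service. The endpoint URL, request fields, and response format are hypothetical placeholders for illustration only, not any system the NYPD or DHS is known to use.

    import requests  # assumes the third-party "requests" library is installed

    # Hypothetical recognition endpoint; a real deployment would point at a
    # department-controlled cloud service.
    RECOGNITION_URL = "https://recognition.example.gov/api/v1/match"

    def identify_face(frame_path, api_key):
        """Upload one captured frame and return any candidate matches."""
        with open(frame_path, "rb") as frame:
            response = requests.post(
                RECOGNITION_URL,
                headers={"Authorization": "Bearer " + api_key},
                files={"image": frame},
                timeout=5,  # a head-mounted device needs a fast answer or none at all
            )
        response.raise_for_status()
        # Hypothetical response shape: {"matches": [{"name": "...", "confidence": 0.92}]}
        return response.json()

    if __name__ == "__main__":
        print(identify_face("frame.jpg", api_key="demo-key"))

The point of the sketch is the split it implies: the headset only captures and uploads, while the matching, and the watchlist it runs against, lives on a server the wearer never sees.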

Another beneficial use case could be translation. New York City is linguistically diverse; as many as 800 languages may be spoken there. The ability to understand people speaking other languages without waiting for a translator would help officers in the field and could potentially save the lives of both officers and civilians.

Privacy advocates, however, may not be encouraged by broad police adoption. The department has a history of spying, which in some cases may have been illegal. I’ll be even more dystopian: Homeland Security has put resources into studying signs of deception and some Android apps have been made to detect emotions. A case could be made for using Glass to prevent terror incidents, but the potential for abuse of the technology is troubling.

Why stop with known threats? There's a plethora of information about would-be suspects on social media and in commercial databases. There isn't much about us that isn't public.

It's worth asking whether our perceived intent could become cause for a “stop and frisk” or a mass arrest; the NYPD has been sued over both practices in recent years. Could the next “stop and frisk” be for “face-crime”? Privacy policies should be preventative, not reactive to abuses of power committed under the guise of public safety.

(image credit: officer.com)


This post was originally published on Smartplanet.com
