Google Glass gets 'fashion fingerprinting' with InSight app

Summary: Researchers are finding that non-biometric 'fingerprints', such as clothing, can help Google's networked specs identify people.

A recognition app designed for Google's Glass project could help wearers of the networked specs identify others by the clothes and accessories they wear, including in situations where a face might not be directly visible.

InSight, a partly Google-funded project, is investigating the possibility of using clothes and the patterns on them as a type of fingerprint that would help Google Glass wearers identify people in crowded locations, such as shopping centres and airports, according to New Scientist.

The system relies on a smartphone app and camera to generate and share a 'fingerprint' of a person based on photos captured of them as they interact with the device, for example, when holding the phone while tweeting.

Image: Srihari Nelakuditi, University of South Carolina

The app works by creating a file that includes a "spatiogram", which works out the colours of a person's clothes, and analyses "wavelets" to discover the patterns on them, such as stripes. Both are then used to build the individual's 'fingerprint', which is announced to other smartphones in the vicinity. A Google Glass device can then cross-reference the people it 'sees' against the fingerprints received by the wearer's phone, placing an arrow over a fingerprinted person when they enter the wearer's field of vision.
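InSight's actual spatiogram and wavelet code is not public here, but the colour half of the idea can be sketched in a few lines: reduce an image to a coarse colour histogram and compare two such 'fingerprints' with cosine similarity. Everything below (the function names, the bin count, the toy pixel lists) is illustrative, not the researchers' implementation.

```python
from collections import Counter
import math

def colour_fingerprint(pixels, bins=4):
    """Quantise each RGB channel into `bins` levels and count occurrences."""
    step = 256 // bins
    counts = Counter((r // step, g // step, b // step) for (r, g, b) in pixels)
    total = sum(counts.values())
    # Normalise so fingerprints from images of different sizes compare fairly.
    return {key: n / total for key, n in counts.items()}

def similarity(fp_a, fp_b):
    """Cosine similarity between two sparse colour histograms (0..1)."""
    dot = sum(fp_a[k] * fp_b.get(k, 0.0) for k in fp_a)
    norm_a = math.sqrt(sum(v * v for v in fp_a.values()))
    norm_b = math.sqrt(sum(v * v for v in fp_b.values()))
    return dot / (norm_a * norm_b)

# Two toy "shirts": mostly red vs. mostly blue, each with some dark pixels.
red_shirt = [(200, 30, 30)] * 90 + [(20, 20, 20)] * 10
blue_shirt = [(30, 30, 200)] * 90 + [(20, 20, 20)] * 10

print(similarity(colour_fingerprint(red_shirt),
                 colour_fingerprint(red_shirt)))   # identical clothing: ~1.0
print(similarity(colour_fingerprint(red_shirt),
                 colour_fingerprint(blue_shirt)))  # different clothing: low score
```

A real system would add the pattern (wavelet) analysis the article describes, since colour alone, as the results below suggest, is the weaker signal.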

Srihari Nelakuditi, the University of South Carolina researcher behind InSight, notes that a useful privacy feature is that the fingerprint is only as permanent as the clothes the person is wearing.

The research was inspired by the idea that people often recognise others without seeing biometric features, such as their face.

In experiments with an InSight prototype that used PivotHead cameras and Samsung Galaxy phones, the researchers were able to unambiguously identify 14 out of 15 participants — a success rate of 93 percent.

Using the spatiogram's colour data alone produced a recognition rate of 73 percent; the higher rate was achieved when the colour and pattern analyses were combined.




