A recognition app designed for Google's Glass project could help wearers of the networked specs identify others by the clothes and accessories they wear, including in situations where a face might not be directly visible.
InSight, a partly Google-funded project, is investigating the possibility of using clothes and the patterns on them as a type of fingerprint that would help Google Glass wearers identify people in crowded locations, such as shopping centres and airports, according to New Scientist.
The system relies on a smartphone app and camera to generate and share a 'fingerprint' of a person based on photos captured of them as they interact with the device, for example, when holding the phone while tweeting.
The app works by creating a file that includes a "spatiogram", which captures the colours of a person's clothes, and analyses "wavelets" to detect the patterns, such as stripes, on them. Both are then used to build the individual's 'fingerprint', which is announced to other smartphones in the vicinity. A Google Glass device can cross-reference the people it 'sees' against the fingerprints received by the wearer's phone, and place an arrow over a fingerprinted person when they enter the wearer's field of vision.
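The article does not publish InSight's actual algorithms, but the general idea can be sketched: a spatiogram-like descriptor records which colours appear in each region of a clothing patch, while a wavelet-style measure picks up high-frequency texture such as stripes. The following is a simplified, hypothetical stand-in (per-band colour histograms plus a 1-D Haar detail energy), not the researchers' implementation:

```python
# Illustrative sketch only: InSight's real spatiogram and wavelet features
# are not described in detail here. A clothing patch is modelled as a 2-D
# grid of colour codes (integers).

from collections import Counter

def band_histograms(patch, bands=3):
    """Spatiogram-like feature: a normalised colour histogram
    for each horizontal band of the patch."""
    h = len(patch)
    feats = []
    for b in range(bands):
        rows = patch[b * h // bands:(b + 1) * h // bands]
        counts = Counter(px for row in rows for px in row)
        total = sum(counts.values())
        feats.append({c: n / total for c, n in counts.items()})
    return feats

def haar_row_energy(patch):
    """Pattern feature: mean absolute Haar wavelet detail along rows.
    High values indicate high-frequency texture such as stripes."""
    energy, n = 0.0, 0
    for row in patch:
        for a, b in zip(row[::2], row[1::2]):
            energy += abs(a - b) / 2
            n += 1
    return energy / n if n else 0.0

def fingerprint(patch):
    """Combine colour and pattern features into one descriptor."""
    return (band_histograms(patch), haar_row_energy(patch))

def similarity(f1, f2):
    """Histogram intersection averaged over bands, penalised by the
    difference in pattern energy; 1.0 means an identical descriptor."""
    inter = sum(
        sum(min(h1.get(c, 0), h2.get(c, 0)) for c in set(h1) | set(h2))
        for h1, h2 in zip(f1[0], f2[0])
    ) / len(f1[0])
    return inter - abs(f1[1] - f2[1])
```

In this toy model, a striped patch (alternating colour codes) and a plain one produce distinct fingerprints: their band histograms share no colours and their Haar energies differ, so the similarity score separates them cleanly.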
Srihari Nelakuditi, the University of South Carolina researcher behind InSight, notes that a useful privacy feature is that the fingerprint is only as permanent as the clothes the person is wearing.
The research was inspired by the idea that people often recognise others without seeing biometric features, such as their face.
In experiments with an InSight prototype that used PivotHead cameras and Samsung Galaxy phones, the researchers were able to unambiguously identify 14 out of 15 participants, a success rate of 93 percent.
Using the spatiogram, colour data alone produced a recognition rate of 73 percent; the higher rate was achieved when the colour and pattern analyses were combined.