
Now Google wants you to control devices with your hoodie

Everything can be smarter, and fashion is no exception.
Written by Daphne Leprince-Ringuet, Contributor

In the quest to make computers ever more pervasive and invisible, Google's AI research team has unveiled a new way to weave technology directly into our garments. The so-called "e-textile" concept could let users control electronic devices with a flick or a twist of their hoodie strings.

The team focused specifically on cords, because drawstrings are both a popular fashion staple and an intuitive way to control consumer devices. The smart cord developed by the researchers can recognize six types of gesture: twisting, flicking, sliding, pinching, grabbing and patting. As a bonus, because some of these gestures, such as flicking, can be performed at different speeds and in different directions, the technology can respond to an even greater variety of actions.

"Textiles have the potential to help technology blend into our everyday environments and objects by improving aesthetics, comfort and ergonomics," said Alex Olwal, research scientist at Google Research. "Advances in materials and flexible electronics have enabled the incorporation of sensing and display into soft form factors, such as jackets, dresses and blankets."

Forget about copper-colored electronic wires: Google's new technology is fully embedded within textile fabrics, and is even built on traditional textile-braiding techniques. Eight sensor threads are woven inside the smart cord, each generating its own electric field. The sensors can then detect objects, such as a user's hand, when they disrupt that field.
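
To give a rough sense of how multi-channel field sensing of this kind can work (this is an illustrative sketch, not Google's actual implementation), the snippet below polls eight hypothetical capacitive channels and flags any thread whose reading drifts from its resting baseline; the read_channels() function, the threshold value and the simulated readings are all assumptions.

```python
# Minimal sketch of multi-channel capacitive sensing, assuming a hypothetical
# read_channels() function in place of real sensor I/O.
import random
import time

NUM_THREADS = 8          # eight sensing threads braided into the cord
TOUCH_THRESHOLD = 40.0   # illustrative deviation that counts as a disruption

def read_channels():
    """Stand-in for hardware access: returns one simulated reading per thread."""
    return [random.gauss(100.0, 2.0) for _ in range(NUM_THREADS)]

def calibrate(samples=50):
    """Average a few idle readings per thread to establish a resting baseline."""
    sums = [0.0] * NUM_THREADS
    for _ in range(samples):
        for i, value in enumerate(read_channels()):
            sums[i] += value
        time.sleep(0.01)
    return [s / samples for s in sums]

def disturbed_threads(baseline):
    """Return the indices of threads whose field is currently being disrupted."""
    return [i for i, (value, base) in enumerate(zip(read_channels(), baseline))
            if abs(value - base) > TOUCH_THRESHOLD]

baseline = calibrate()
print(disturbed_threads(baseline))   # e.g. [] while nothing is touching the cord
```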


Different interactions with the textile produce different sensor responses. The cord can sense not just proximity but also contact area, contact time, roll and pressure, so the technology can differentiate between a finger and a whole hand grabbing the braid, or between a pinch and a grab.
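
As a concrete illustration of what such quantities might look like when derived from the eight channel readings, here is a hedged sketch; the feature definitions below are guesses for illustration, not the published method, and assume baseline-subtracted readings grouped into frames.

```python
# Illustrative feature extraction from the eight channel readings (a guess at
# what such features could look like, not the published method). "frames" is a
# list of 8-value readings captured over one gesture, already baseline-subtracted.
def extract_features(frames, threshold=40.0):
    proximity = max(max(frame) for frame in frames)                              # strongest field disturbance seen
    contact_area = max(sum(v > threshold for v in frame) for frame in frames)    # most threads touched at once
    contact_time = sum(any(v > threshold for v in frame) for frame in frames)    # frames with any contact
    peaks = [frame.index(max(frame)) for frame in frames]
    roll = peaks[-1] - peaks[0]                                                  # how the peak moves around the braid
    pressure = sum(sum(frame) for frame in frames) / len(frames)                 # average overall signal energy
    return [proximity, contact_area, contact_time, roll, pressure]
```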

The smart cord developed by the researchers can recognize six types of gesture: twisting, flicking, sliding, pinching, grabbing and patting.

Image: Alex Olwal

To let the cord recognize this larger palette of actions, Google's researchers trained a gesture-recognition model using machine learning. A group of 12 participants produced 864 gesture samples to feed the algorithm.

The group was given no direction on how to carry out the actions, to make sure the AI could identify specific gestures even when individual styles varied from person to person, whether in hand size or in a preference for clockwise or anti-clockwise motions. According to the team, the algorithm can now recognize gestures with an accuracy of 94%.
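
The article doesn't describe the model itself, but a minimal sketch of what such a training step could look like follows, using a scikit-learn SVM as an assumed stand-in for whatever model Google actually trained, and randomly generated placeholder data in place of the 864 real samples.

```python
# Hedged sketch of training a gesture classifier -- an SVM from scikit-learn is
# used here as an assumed stand-in for whatever model Google actually trained,
# and the data is synthetic rather than the 864 real samples.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

GESTURES = ["twist", "flick", "slide", "pinch", "grab", "pat"]

rng = np.random.default_rng(0)
X = rng.normal(size=(864, 5))                  # 864 samples x 5 features (see the sketch above)
y = rng.integers(0, len(GESTURES), size=864)   # placeholder labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)

# On real gesture data the research reports roughly 94% accuracy; with random
# placeholder data this number is meaningless and only illustrates the workflow.
print(f"held-out accuracy: {clf.score(X_test, y_test):.0%}")
```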

The gestures are associated with intuitive outcomes: in a video released alongside the news, the team showed how sliding a cord linked to a speaker could change the song, twisting it clockwise could increase the volume, and pinching it could start and stop the audio. In another example, the cord was used to scroll through a web page, with the user twisting the thread clockwise to scroll down and anti-clockwise to scroll back up.
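
The idea of mapping recognized gestures to media controls can be captured in a simple dispatch table; the gesture names and handler functions below are placeholders chosen to mirror the demo described above, not an actual Google API.

```python
# Sketch of mapping recognized gestures to media actions, mirroring the demo
# described above; the handler functions and gesture names are placeholders.
def next_track():      print("skipping to the next song")
def volume_up():       print("raising the volume")
def volume_down():     print("lowering the volume")
def toggle_playback(): print("starting/stopping the audio")

ACTIONS = {
    ("slide", None):            next_track,
    ("twist", "clockwise"):     volume_up,
    ("twist", "anticlockwise"): volume_down,
    ("pinch", None):            toggle_playback,
}

def handle(gesture, direction=None):
    action = ACTIONS.get((gesture, direction))
    if action is not None:
        action()

handle("twist", "clockwise")   # -> raising the volume
```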

Google's new smart cord also includes fiber optics, so that every gesture comes with visual feedback showing the user that it has been registered as an action. According to the research, these visual signals make the technology more intuitive to use.

Although the technology is still in its infancy, Olwal and his team have already developed prototypes incorporating smart cords. In addition to hoodie strings and web browsing, Olwal mentioned trialing the technology for "e-textile USB-C headphones" to control media playback on a phone, as well as an interactive cord to control smart speakers.

"We hope to advance textile user interfaces and inspire the use of micro-interactions for future wearable interfaces and smart fabrics," said Olwal, "where eyes-free access and casual, compact and efficient input is beneficial."


The search giant's AI team also ran a study comparing how users engage with smart cords against scrolling on a trackpad and controlling headphones through in-line remote buttons. The results showed that e-textile twisting is faster than existing headphone button controls and comparable in speed to a touch surface, and that users tend to prefer smart cords to headphone controls. Conventional buttons require users to find a specific location, so there is a high cost to pressing the wrong one, whereas an e-textile thread can be manipulated anywhere along the cord.

Google's research team has only mentioned building prototypes so far, but the tech giant may look at incorporating the technology into items of fashion in the future. The company has shown an interest in e-textiles since 2015, when it launched Project Jacquard to explore weaving touch-sensitive panels into clothing using conductive threads.
