The next computer revolution: No more keyboards!

Two new technologies show how we may someday control computers and appliances simply by gesturing with our hands.
Written by Laura Shin, Contributor

Kinect is a device originally built to let Xbox gamers play video games without a controller, using only their bodies and voice commands. Kinect has already found real-world applications, such as helping to spot autism in children.

The same "Look, Ma, no hands!" revolution is coming to computers and appliances.

New Scientist reports that two new technologies, Humantenna and SoundWave, both created by Microsoft and presented last week at the Conference on Human Factors in Computing Systems in Austin, Texas, show how this will work.


The first, Humantenna, works just as the name suggests: It uses the human body as an antenna, turning it into a kind of universal remote control.

The technology relies on the electromagnetic waves that already exist in the air from power lines and appliances in the home.

The user wears a device that measures the voltages that are received by the body; it then transmits those signals to a computer. The change in voltages can tell the computer which movement the body performed.

In a demonstration video, the user performs motions such as raising and lowering the arms, stepping to the side, or doing a kick and a one-two punch, and Humantenna detects them all. So far, the researchers have found that the technology detects 12 gestures accurately more than 90% of the time.
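Microsoft has not published the classifier's details in this article, but the basic idea of matching a newly measured voltage trace against trained gesture templates can be sketched as follows. The gesture names and voltage values here are made-up illustration values, and nearest-neighbor matching is only one plausible approach:

```python
# Hypothetical sketch: classifying a gesture from a body-voltage trace by
# finding the nearest trained template. All numbers below are invented.
import math

def distance(a, b):
    """Euclidean distance between two equal-length voltage traces."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(trace, templates):
    """Return the label of the trained template closest to the new trace."""
    return min(templates, key=lambda label: distance(trace, templates[label]))

# Toy "trained" templates: one averaged voltage trace per gesture.
templates = {
    "raise_arms": [0.1, 0.4, 0.9, 0.9, 0.4],
    "side_step":  [0.2, 0.2, 0.5, 0.2, 0.2],
    "kick":       [0.0, 0.8, 0.3, 0.8, 0.0],
}

print(classify([0.1, 0.5, 0.8, 0.9, 0.3], templates))  # prints "raise_arms"
```

Training the bag-worn sensor, in this picture, amounts to recording and averaging example traces for each gesture the user wants recognized.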

In the version of Humantenna presented in Austin last week, the user wore a sensor inside a small bag. That sensor has to be trained to recognize specific gestures.

Another version (to be described in a paper under review) uses a wristwatch-sized sensor, a step forward not only because of its size but also because it does not need to be trained to recognize gestures.

The team found patterns in low-frequency signals that let the system detect the same gesture regardless of where it is performed or what the electromagnetic fields are at that location.

With Humantenna, people could point or swipe in the air to control lights, appliances and computers. It could also be used in fitness monitoring: Beyond pedometers and other devices that merely pick up our steps, it could track whole body movements.

The drawbacks: changes in local electromagnetic fields, such as those caused by devices turning on and off, can confuse Humantenna. It also cannot yet differentiate between similar gestures or detect small movements such as the wiggle of a finger. Finally, it requires users to wear a sensor.


Microsoft also presented a similar technology in Austin: SoundWave, which uses an inaudible tone produced by a laptop's speaker.

Moving a hand in front of the laptop shifts the frequency of the reflected sound (the Doppler effect), which the computer's microphone picks up. SoundWave matches these frequency shifts to specific hand movements, allowing it to detect several gestures with more than 90% accuracy, even in noisy environments.
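The Doppler principle at work here can be illustrated with a short synthetic example: a hand moving toward the microphone compresses the reflected tone to a slightly higher frequency, and the shift can be recovered from the audio with a Fourier transform. The tone frequency and hand speed below are made-up illustration values, not SoundWave's actual parameters:

```python
# Hypothetical sketch of the Doppler effect behind SoundWave. We synthesize
# the echo of an inaudible pilot tone reflecting off a moving hand, then
# recover the frequency shift with an FFT.
import numpy as np

RATE = 44100            # samples per second
F0 = 20000.0            # pilot tone in Hz (above most people's hearing)
SPEED_OF_SOUND = 343.0  # m/s in air

def doppler_frequency(f0, hand_speed):
    """Frequency of a tone after reflecting off a hand moving toward the mic."""
    return f0 * (SPEED_OF_SOUND + hand_speed) / (SPEED_OF_SOUND - hand_speed)

def dominant_frequency(signal, rate):
    """Frequency bin holding the most energy in the signal's spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    return np.fft.rfftfreq(len(signal), 1.0 / rate)[np.argmax(spectrum)]

t = np.arange(RATE) / RATE                   # one second of audio
f_echo = doppler_frequency(F0, 1.0)          # hand moving 1 m/s toward the mic
echo = np.sin(2 * np.pi * f_echo * t)

shift = dominant_frequency(echo, RATE) - F0
print(f"measured shift: {shift:.0f} Hz")     # an upward shift of roughly 117 Hz
```

A hand moving away would produce a downward shift instead; mapping the direction and size of these shifts to gestures such as swipes is the matching step described above.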

Like Humantenna, SoundWave has limited ability to detect small gestures. The tone also bounces off other nearby objects, causing interference. Still, the technology can already detect swipes and use them to, for instance, turn on a computer, scroll through photos or rotate an image.

Unlike Humantenna, it does not need any extra technology other than the speakers and microphones that already exist in computers.

The teams behind both technologies plan to fine-tune them so that they respond to more than just large gestures.

via: New Scientist, Scientific American

This post was originally published on Smartplanet.com
