
Gesture control: Touching the future of computing

Tapping the body's natural controls...
Written by Nick Heath, Contributor

Electromyography measures electrical activity in the muscles using electrodes, which are seen here attached to the wrist. Photo: Nick Heath/silicon.com

Since the mechanical keys of the first typewriters clacked into life in the 19th century, the way people use keyboards has continued to evolve.

Today, keyboards can be virtual - nothing more than a software recreation of the familiar Qwerty layout mapped out on a touchscreen.

The next major leap in computing control could result in physical keyboards disappearing altogether, as gesture-recognition technology allows messages to be typed in thin air.

James Cannan, a bionics researcher and PhD student at the School of Computer Science and Electronic Engineering at the University of Essex, believes people will be able to type by jabbing their fingers at virtual keys projected in front of them or on glasses they are wearing.

He is planning to build a bracelet designed to record every finger stroke, capturing the movement and position of each digit and using that information to work out which key has been hit.

"I'm trying to create a new input device that people could easily slip onto their wrists. It will automatically adjust to your muscle, and you could control any sort of device," he said.

"One of my early ideas was to control a keyboard you air type."

The keyboard idea has been tested by US space agency Nasa, which designed a prototype for a virtual numeric keypad that works in much the same way as Cannan's proposed device.

Both Nasa's virtual keyboard and the bracelet device Cannan is building use a process called electromyography (EMG).

EMG involves monitoring the electrical pulses that are generated in people's muscles to cause them to contract.

The spike in electrical activity shown on the left-hand screen is associated with the clenching of the fist. Photo: Nick Heath/silicon.com

Electrodes on the skin, which in the case of the bracelet device could be on the underside of the wrist, pick up these signals, which are then amplified 1,000 times and wirelessly relayed to a computer.

The computer uses machine-learning software, such as neural networks, to pick out patterns that correspond to particular movements, such as lifting or extending a finger.
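
As a rough illustration of that pipeline, the sketch below reduces each short window of EMG samples to a root-mean-square amplitude per electrode channel and trains a small neural network on labelled windows. The feature choice, the four-channel layout and the use of scikit-learn's MLPClassifier are assumptions for the example, not details of Cannan's system.

import numpy as np
from sklearn.neural_network import MLPClassifier

def rms_features(window):
    # Root-mean-square amplitude per electrode channel for one
    # short slice of the amplified EMG signal (samples x channels).
    return np.sqrt(np.mean(window ** 2, axis=0))

# Stand-in training data: 200 labelled windows from four electrodes,
# 256 samples each, with labels such as 0 = rest, 1 = fist clench.
rng = np.random.default_rng(0)
windows = rng.normal(size=(200, 256, 4))
labels = rng.integers(0, 2, size=200)
X = np.array([rms_features(w) for w in windows])

# A small neural network learns which amplitude patterns
# correspond to which gesture.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000)
clf.fit(X, labels)

# Classify a newly captured window.
new_window = rng.normal(size=(256, 4))
print(clf.predict([rms_features(new_window)]))

In practice such a classifier would be trained per user, in keeping with Cannan's remark that the bracelet "will automatically adjust to your muscle".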

Cannan also plans to add accelerometers and gyroscopes to the bracelet to enable the system to track the position and orientation of the user's arm in 3D space.
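
To give a flavour of how those inertial readings can be fused into an orientation estimate, here is a minimal complementary-filter sketch for a single axis. The sample rate, blending weight and sensor values are illustrative assumptions, not details of Cannan's design.

import math

def complementary_filter(pitch, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    # Blend the gyroscope's fast-but-drifting angle estimate with the
    # accelerometer's noisy-but-drift-free, gravity-based one.
    gyro_pitch = pitch + gyro_rate * dt          # integrate rotation rate
    accel_pitch = math.atan2(accel_x, accel_z)   # angle from gravity
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# Example: one update at 100 Hz as the forearm tilts slowly.
pitch = 0.0
pitch = complementary_filter(pitch, gyro_rate=0.1,
                             accel_x=0.05, accel_z=0.99, dt=0.01)
print(pitch)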

At present, the only gestures Cannan has managed to map with 90 per cent-plus accuracy using EMG are simple ones, such as opening or closing a hand. In the picture above you can see the spike in electrical activity when Cannan closes his fist, causing the 3D-rendered arm to perform the same action.

Software registers the electrical activity linked to a fist clenching and triggers the same action on screen. Photo: Nick Heath/silicon.com

Cannan explained the difficulties in accurately recognising fine movements, such as those of individual fingers.

"If I move the same finger twice the signal looks similar but there are differences, so it's very hard to classify the same signal," said Cannan.

"EMG isn't well enough developed yet to be able to accurately measure the signal - maybe in 10 years' time when you have all the hardware combined to make it easy to use you might be able to implement it.

"The first step is to try and find the exact positions of the fingers, once you can do that the keyboard idea becomes a bit easier."

So far, Cannan has achieved about 70 per cent accuracy in mapping finger movements using EMG, although he hopes to substantially improve the system's hit rate.

Microsoft's Skinput device lets users control computers by touching buttons projected onto their arm. Photo: Chris Harrison/Carnegie Mellon University

Cannan is considering maximising the device's chances of accurately detecting gestures by combining EMG with another gesture-recognition technique called acoustic myography.

Acoustic myography is the process of using sensors attached to the skin to listen for the sound of muscles contracting.

In the case of acoustic myography, a computer again uses pattern recognition to work out which muscle is contracting, and hence which gesture is being made.
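
That pattern-recognition step could work on features as simple as the signal's energy in the low-frequency band where muscle sound sits, roughly tens of hertz. The band limits and the synthetic signal below are assumptions for illustration.

import numpy as np

def muscle_sound_energy(signal, sample_rate, band=(5.0, 100.0)):
    # Energy of the muscle-sound signal inside a low-frequency band,
    # one simple feature a classifier could use to tell contractions apart.
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].sum()

# Synthetic example: a faint 25 Hz "contraction" tone buried in noise.
rng = np.random.default_rng(1)
t = np.arange(0, 1.0, 1.0 / 1000.0)            # one second at 1 kHz
quiet = rng.normal(scale=0.1, size=t.size)
contracting = quiet + 0.5 * np.sin(2 * np.pi * 25 * t)
print(muscle_sound_energy(quiet, 1000))        # low energy
print(muscle_sound_energy(contracting, 1000))  # much higher energy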

A similar technique was used recently by researchers at Microsoft and Carnegie Mellon University, who came up with a device called Skinput, pictured above. Skinput allows users to control computers by interacting with buttons and menus projected onto their arm and hand.

The Skinput device works by using sensors to pick up the vibrations produced when a person taps their arm or hand, then analysing those vibrations to work out where the tap landed and which button or menu to select.
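
One simple way to turn those vibrations into a button press, sketched below with assumed features rather than the researchers' actual classifier, is to compare each tap's vibration signature against signatures recorded during calibration for each projected button.

import numpy as np

def tap_signature(vibration):
    # Reduce a tap's vibration waveform to a small feature vector:
    # total energy plus the energy in four coarse frequency bands.
    # Taps at different spots excite bone and tissue differently,
    # so these signatures vary by location.
    spectrum = np.abs(np.fft.rfft(vibration)) ** 2
    bands = np.array_split(spectrum, 4)
    return np.array([spectrum.sum()] + [b.sum() for b in bands])

def locate_tap(vibration, templates):
    # Pick the stored location whose calibration signature is closest
    # to the new tap's signature (a nearest-template match).
    sig = tap_signature(vibration)
    return min(templates, key=lambda loc: np.linalg.norm(sig - templates[loc]))

# Hypothetical calibration taps for two projected buttons; a real
# system would average several taps per button.
rng = np.random.default_rng(2)
templates = {
    "forearm_button": tap_signature(rng.normal(scale=1.0, size=512)),
    "wrist_button": tap_signature(rng.normal(scale=2.0, size=512)),
}
# Match a new tap against the stored signatures.
print(locate_tap(rng.normal(scale=1.9, size=512), templates))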

The robot hand Cannan has been controlling with his gesture-recognition software. Photo: Nick Heath/silicon.com

Cannan believes acoustic myography may be better suited to tracking the movements of the thumb, as most of the muscles that drive the thumb are in the hand, where their electrical signals are more difficult for EMG electrodes on the wrist to detect.

"In the next couple of months I will buy in the hardware for acoustic myography and start doing experiments with it," he said.

Of course, EMG and acoustic myography can be used to do far more than just manipulate a virtual keyboard, and Cannan's focus is on how the techniques can be used to control a prosthetic arm.

His ambition is to help develop the EMG technology to a point where it can be used to assist people in their everyday lives.

"I am trying to use EMG so that people have easier access to it, because at the moment it is stuck in a lab," he said.
