
An EEG-controlled robotic arm

Many researchers around the world have tried to build robotic devices able to help people with paralysis. Now, European researchers have developed a robot control system based on electroencephalography (EEG). Patients using the Brain2Robot system might regain some of their lost autonomy by controlling the robotic arm with their thoughts. To control the arm, the Brain-Computer Interface (BCI) developed at a Fraunhofer Institute in Germany is combined with an eye tracker, and the signals are sent to a computer that performs the main learning task. According to the researchers, the robotic arm could become commercially available in a few years.
Written by Roland Piquepaille


The Brain2Robot arm

You can see above how such a robotic arm controlled by its user's thoughts could one day make life easier for people with paralysis. (Credit: Fraunhofer FIRST) Here is a link to a larger version of this sketch. If you prefer to look at photos, you can read a previous annual report from Fraunhofer FIRST, "What does humankind need?" (PDF format, 78 pages, 3.34 MB), and jump to page 56.

I guess that you're asking yourself the question: How can thoughts be translated into instructions for the robot? The solution is based on a concept known as a brain-computer interface (BCI). Researchers at the Fraunhofer Institute for Computer Architecture and Software Technology FIRST and the Charité hospital in Berlin have been working on this type of interface for almost seven years. "For the input, they use a perfectly normal electroencephalogram (EEG), just like the ones used in everyday clinical practice. Electrodes attached to the patient’s scalp measure the brain’s electrical signals, which are amplified and transmitted to a computer."
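The raw EEG signal is a mixture of many rhythms and artifacts, so a BCI pipeline typically starts by isolating the frequency band of interest. As a minimal sketch (not Fraunhofer's actual code), here is how one might band-pass filter an EEG channel to the mu band (8-12 Hz), the rhythm over the motor cortex that changes when a movement is imagined. The sampling rate, band edges, and synthetic test signal are all illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_mu(eeg, fs=250.0, low=8.0, high=12.0, order=4):
    """Band-pass filter one EEG channel to the mu band (8-12 Hz),
    the rhythm that attenuates during imagined hand movement."""
    b, a = butter(order, [low, high], btype="bandpass", fs=fs)
    # filtfilt runs the filter forward and backward for zero phase shift
    return filtfilt(b, a, eeg)

# Synthetic 2-second channel: a 10 Hz "mu" component plus 50 Hz mains noise.
fs = 250.0
t = np.arange(0, 2.0, 1.0 / fs)
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)
clean = bandpass_mu(raw, fs)

# Spectrum of the filtered signal: the mu peak should dominate the
# residual mains component.
spec = np.abs(np.fft.rfft(clean))
freqs = np.fft.rfftfreq(len(clean), 1.0 / fs)
mu_peak = spec[np.argmin(np.abs(freqs - 10.0))]
mains_residue = spec[np.argmin(np.abs(freqs - 50.0))]
```

After this step, the amplified and filtered signal is what the classification software actually works with.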

The computer scientists have also developed highly efficient algorithms to "analyze these signals using a self-learning technique. The software is capable of detecting changes in brain activity that take place even before a movement is carried out. It can recognize and distinguish between the patterns of signals that correspond to an intention to raise the left or right hand, and extract them from the pulses being fired by millions of other neurons in the brain."
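To make the left/right distinction concrete: a common approach in this kind of BCI (not necessarily the exact method used by Fraunhofer FIRST) is to compute band-power features over the motor-cortex electrodes and train a linear classifier on them. The sketch below uses synthetic features and scikit-learn's linear discriminant analysis; the channel names, feature values, and class separation are all assumed for illustration.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical training data: log band-power over two motor-cortex
# channels (C3, C4). Imagining a left-hand movement suppresses the mu
# rhythm over the opposite hemisphere, and vice versa, so the two
# classes occupy different regions of this 2-D feature space.
n = 200
left_trials  = rng.normal(loc=[1.0, 0.2], scale=0.3, size=(n, 2))  # label 0
right_trials = rng.normal(loc=[0.2, 1.0], scale=0.3, size=(n, 2))  # label 1
X = np.vstack([left_trials, right_trials])
y = np.array([0] * n + [1] * n)

# "Self-learning": the classifier is fitted to the user's own data.
clf = LinearDiscriminantAnalysis().fit(X, y)

# A new trial with strong C3 power should be recognized as "left hand".
pred = clf.predict([[1.1, 0.1]])
```

In a real system the features would of course come from the filtered EEG rather than a random generator, and the classifier would be retrained for each user.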

Now, let's take a look at the Brain2Robot project page for more details. In particular, how do you start the system? "[A]n eyetracker first determines the direction in which the robot arm should move. The direction of the patient’s gaze is monitored by two cameras mounted on a specially designed pair of glasses. First, the exact position of the pupils is captured in stereo (stereo eyetracking). In addition, a headtracker determines the position of the head. A software component analyzes the two systems’ data and derives from this the intended direction of the movement. The test person sees an object, looks at it, imagines moving his/her arm, and the robot grasps the desired object."
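The fusion step described above boils down to geometry: the eyetracker gives a gaze direction relative to the head, and the headtracker gives the head's orientation in the room, so the intended direction is the gaze vector rotated by the head pose. The project's actual geometry is not published; this is only a minimal sketch of that idea, with an assumed coordinate convention (x forward, y left, z up).

```python
import numpy as np

def gaze_to_world(head_rotation, gaze_in_head):
    """Rotate an eye-in-head gaze direction into world (room)
    coordinates using the head tracker's orientation matrix, and
    return it as a unit vector."""
    d = head_rotation @ gaze_in_head
    return d / np.linalg.norm(d)

# Example: the head is yawed 90 degrees to the left while the eyes
# look straight ahead relative to the head.
yaw = np.pi / 2
R_head = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0,          0.0,         1.0]])
direction = gaze_to_world(R_head, np.array([1.0, 0.0, 0.0]))
```

The resulting world-frame vector is what would be handed to the robot arm as the target direction.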

And here are more details about how the system is learning. "Electrodes attached to the patient’s scalp measure the brain’s electrical signals. These are then amplified and transmitted to the computer. High-efficiency algorithms analyze these signals using machine-learning methods. They are capable of detecting changes in brain activity triggered by the purely mental conception of a particular behaviour. They can, for instance, unequivocally identify patterns reflecting the idea of moving the left or right hand and extract them from the many millions of neural impulses. They are then converted into control commands for the computer, enabling one to choose, for example, between two alternatives."
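The last step in that passage, turning the detected pattern into a choice "between two alternatives", suggests a simple decision rule: only issue a command when the classifier is confident enough, and otherwise do nothing rather than risk a false trigger. This thresholding scheme is my own illustration, not a documented part of Brain2Robot.

```python
def to_command(p_right, threshold=0.7):
    """Map the classifier's probability of a 'right hand' intention to
    one of two control commands, or None when the evidence is too weak
    (hypothetical confidence threshold)."""
    if p_right >= threshold:
        return "right"
    if p_right <= 1.0 - threshold:
        return "left"
    return None  # ambiguous: safer to wait for a clearer signal

cmd = to_command(0.85)  # confident "right hand" pattern -> "right"
```

For a grasping task, "left"/"right" here could just as well be any other pair of alternatives, such as confirming or cancelling the target the eyetracker has selected.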

For your information, the Brain2Robot project has been awarded 1.3 million euros in research funding under the European Union's Sixth Framework Programme (FP6), under the name "A Robotic-Arm Orthosis Controlled by Electroencephalography and Gaze for Locked-In Paralytics." Here is the fact sheet of this project, which started on January 1, 2005 and is scheduled to end on December 31, 2008.

Sources: Fraunhofer-Gesellschaft news release, November 12, 2007; and various websites

