Bedbound patients move robots using just thoughts

This new brain-machine interface puts a laptop running Skype over a wireless connection on a wheeled robot, which the user steers with a cap of EEG electrodes.
Written by Janet Fang, Contributor

Researchers have invented a new noninvasive way to steer robots with brain activity.

This technology could give locked-in patients who can't communicate with the outside world a way to interact with friends and family, ScienceNOW reports.

Brain-machine interfaces make it possible to control robots, computer cursors, and prosthetics with conscious thought alone, but these systems often demand effort and intense concentration, and some require electrodes implanted in the brain.

The goal of José del R. Millán from École Polytechnique Fédérale de Lausanne in Switzerland is to make control as easy as driving a car on a highway. (So, easy, but not that easy.)

A partially autonomous robot lets the user stop concentrating on tasks that can normally be handled subconsciously, like not running into walls. But if an unexpected event requires a split-second decision, the user's thoughts can override the robot's AI.
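The shared-control idea can be sketched in a few lines. This is a hypothetical illustration, not the researchers' actual software: the function name `shared_control` and the command strings are invented for the example, which simply shows the priority rule described above (user intent first, autonomous obstacle avoidance second).

```python
def shared_control(user_command, obstacle_ahead):
    """Pick the motion command for one control cycle.

    user_command: decoded EEG intent ('left', 'right', 'forward') or None
    obstacle_ahead: True if the infrared sensors detect an obstacle
    """
    if user_command is not None:
        return user_command      # the user's thought overrides the robot's AI
    if obstacle_ahead:
        return "turn_to_avoid"   # autonomous obstacle avoidance takes over
    return "forward"             # default: keep driving
```

When no intent is decoded, the robot drives itself and steers around obstacles on its own; a decoded command always wins, which is what lets the user make those split-second decisions.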

  1. They modified a commercially available bot called Robotino (pictured), which is essentially a platform on 3 wheels that can avoid obstacles on its own using infrared sensors.
  2. On top of the robot, they placed a laptop running Skype over a wireless internet connection, which lets the human controller see where the robot is going. And since the laptop screen also shows video of the controller, other people can interact with the controller as though he or she were there.
  3. The user wears a cap of tiny EEG electrodes that measure brain activity. The system translates the EEG signals into navigation instructions and transmits them in real-time to the robot.
  4. Then the team recruited 2 patients whose lower bodies were paralyzed and who had been bedbound for 6 or 7 years.
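The steps above form a simple loop: read a window of EEG data, decode it into a navigation command, and transmit that command to the remote robot. The sketch below is purely illustrative; every function name (`read_eeg_window`, `classify`, `send_to_robot`) is a stand-in I've invented, and the placeholder bodies would be replaced by real signal acquisition, a trained decoder, and the wireless link.

```python
import random  # stand-in for a real EEG data stream

def read_eeg_window():
    # Placeholder: a real system would sample the electrode cap here.
    return [random.random() for _ in range(64)]

def classify(window):
    # Placeholder: a real decoder maps EEG features to a user intent.
    return "forward"

def send_to_robot(command):
    # Placeholder: a real system transmits over the wireless connection.
    return f"sent:{command}"

def control_cycle():
    """One pass of the pipeline: EEG window -> decoded command -> robot."""
    window = read_eeg_window()
    command = classify(window)
    return send_to_robot(command)
```

Running this loop continuously is what makes the control feel real-time: each cycle turns the latest brain activity into a fresh navigation instruction.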

After 6 weeks of hour-long training sessions, the patients (in the hospital) were able to control the robots (in the lab) from 100 kilometers (just over 62 miles) away. They drove the robot to various targets – furniture, people, objects – around the lab for 12 minutes.

In the future, Millán imagines modifying the shared control brain-machine interface so the user can control a prosthetic limb or a wheelchair. They may eventually add an arm to the current robot so it can grab objects.

The findings were reported this week at the IEEE Engineering in Medicine and Biology Society conference in Boston.

Via ScienceNOW.

Image: Festo Didactic

This post was originally published on Smartplanet.com
