Wearable robot arm could help paralyzed patients

Adding a wearable robotic arm to a brain-machine interface for paralyzed patients helps the patients move objects with their thoughts, according to new research.
Written by Christina Hernandez Sherwood, Contributing Writer

I spoke recently with Nicho Hatsopoulos, associate professor and chair of computational neuroscience at the University of Chicago, about this latest innovation in brain-machine technology.

First, Hatsopoulos gives some background:

In normal movement, the way we behave in the world depends on many modes of sensory feedback. We use our visual system to help us guide our hands and arms to different places, like to grab a cup of coffee. What's less obvious is the sense of kinesthesia -- the sense of our body motion. We can feel our limbs moving. We know where they are in space. These senses depend on specialized sensors that reside in the muscles, joints and tendons. We know from patients who have lost that sense that its loss seriously affects motor behavior. We know that both touch and kinesthesia are critically important for movement.

The current state of the art in brain-machine interfaces consists of systems that take signals from various parts of the brain and decode them to guide movement. I was involved in a clinical trial where we extracted signals from the brain, decoded them and allowed [spinal cord injury or ALS] patients to guide a cursor on a computer screen to check their email and surf the web. They relied solely on vision to guide this movement. That's the current state of the art.

Hatsopoulos explains his project:

A number of groups have been thinking about this question of adding other sorts of sensory feedback. We tried one approach to the problem, which is using a wearable, or exoskeletal, robot. We developed a brain-machine interface in which we recorded signals from the motor cortex of a monkey while he controlled a cursor on a screen. At the same time, we sent that control signal from the brain to this wearable robot, on which the monkey's arm was resting, so the robot moved the monkey's arm to follow the cursor. Because the monkey was completely normal, he had a complete sense of kinesthesia. He could feel the motion of the cursor through his arm. That helped him guide the movement of the cursor more effectively.

The most important measure was the time it took him to hit a target, which was about 40 percent shorter when we added this feedback. We also looked at how straight the paths of the cursor were from the initial position to the target; they were 40 percent straighter when his arm was being moved around as well. The cursor was positioned where the hand was.

How would this help human patients?

ALS patients suffer from a neurodegenerative disease that affects the neurons that activate the muscles. It looks like their sense of touch and kinesthesia is relatively intact. Most cases of spinal cord injury involve a crush or contusion of the spinal cord. A lot of the nerves that run down the spinal cord are damaged, but not all of them. There are some residual signals -- both motor signals from the brain to the muscles, although they're weakened, and sensory signals coming back up. If we could take advantage of these residual sensory signals by moving the patient's arm with this wearable robot, that could help them feel the motion of the robot they're controlling.

How exactly would paralyzed patients use the robot arm?

There are a number of groups around the world developing exoskeletal robots that control the arm as well as the hand. This added sense of where your limbs are in space and how they're moving would help patients control them better.

Talk more about how you tested this.

We began by implanting a chip, about the size of your pinky nail and composed of 100 electrodes, in the motor cortex. That's the part of the brain responsible for voluntary movement -- the part that activates when you move your arm to have a cup of coffee. In a normal person or animal, these signals travel down the spinal cord, control the muscles and move the arm. We implanted this chip into the motor cortex of two monkeys, and the electrodes pick up electrical signals from individual neurons.

The monkey was moving a cursor with his arm to hit targets. Then, we played it back to him visually. (We had shown in a previous publication that seeing this game being played back will evoke responses in the motor cortex. This is important because if we're going to do this in a human who has a spinal cord injury or is otherwise paralyzed, the only way we can evoke these responses is by asking them to imagine making movements and by presenting them with a display with a cursor that's moving to the targets. We hope to, through imagination, try to evoke these responses.)

The motion of the cursor evokes responses in these neurons. You build a decoder that relates the neural activity to the motion of the cursor. It's software: it figures out that one particular pattern of activity among the 30 neurons means move the cursor to the left, while a different pattern means move the cursor to the right. The software makes sense of those patterns of activity in terms of movement. Then, we begin the heart of the experiment.
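To make the idea of a decoder concrete, here is a minimal, illustrative sketch in Python. It is an assumption for illustration only -- the study's actual decoding method is not described in this interview, and real systems use more sophisticated algorithms -- but a simple least-squares fit on simulated firing rates shows how software can map a pattern of neural activity to an intended cursor velocity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated training data: firing rates of 30 neurons across 500 time bins,
# with each neuron linearly tuned to 2-D cursor velocity, plus noise.
# (All names and numbers here are hypothetical, for illustration.)
n_neurons, n_bins = 30, 500
tuning = rng.normal(size=(n_neurons, 2))     # each neuron's preferred direction
velocity = rng.normal(size=(n_bins, 2))      # known cursor velocities
rates = velocity @ tuning.T + 0.1 * rng.normal(size=(n_bins, n_neurons))

# "Build a decoder": least-squares weights mapping firing rates -> velocity.
weights, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decode a new pattern of activity into a cursor command.
intended = np.array([[1.0, -0.5]])           # hypothetical intended velocity
decoded = (intended @ tuning.T) @ weights    # recovers roughly [1.0, -0.5]
```

Once the weights are fitted, each new pattern of activity -- one row of firing rates -- is turned into a left/right/up/down cursor command by a single matrix multiplication, which is what lets the decoding run in real time.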

We have two conditions. In one condition, the monkey was trained not to move his arm, but to move the cursor by voluntarily activating his motor cortex. By activating his motor cortex he would causally move the cursor through this decoder. He's now moving the cursor just by thinking about moving it, but he's not moving his arm. The second condition is doing the same thing -- moving the cursor with his brain -- but now the exoskeletal robot is pushing his arm to follow the cursor. He's just sitting there and enjoying the ride. We can compare how well he can move the cursor to hit the targets in the condition in which his arm is being pushed around and the condition where his arm is still. That's how we made the comparison.

What's next for this research?

I have plans in the next year or two to try some of these ideas with humans. It will require a collaborative effort.

Photo: Nicho Hatsopoulos

This post was originally published on Smartplanet.com
