
In Argentina, controlling a robot with 'brain waves'

A five-person technology firm a few hours north of Argentina's capital plans to help quadriplegics--and video game addicts--control their worlds with their thoughts. Can "brain waves" really work?
Written by Ian Mount, Correspondent (Buenos Aires)

BUENOS AIRES--"Brain waves."

Even without scare quotes, those two words used as a scientific explanation do not inspire confidence. They're just too close to an Austin Powers catchphrase.

So when the chief technology officer at Interactive Dynamics, a startup based in Argentina's second city, Rosario, tells me that it's all brain waves in the TEDxRosario video--the one of him blinking and scowling as a toy robot skitters in front of him, his bald pate clamped by what looks like a metal spider from The Matrix--my first reaction, naturally, is skepticism. But after the skepticism comes curiosity: if he really is controlling a robot with "brain waves," well, that certainly is (to extend the Austin Powers metaphor) cooler than a laser-armed mackerel.

First, the company. Interactive Dynamics was founded in Rosario in early 2011 by three old friends and tech industry co-workers, Pablo Medina, Gastón Pereyra Puyó and Juan Pablo Manson. Their plan: to exploit recent developments in ways of controlling computers that don't involve a mouse or joystick, an approach known as the Natural User Interface (NUI). To put it more poetically, they wanted to integrate man and machine.

"In real life, we communicate amoung ourselves and move through the physical world without thinking about sequences of keys, clicks and complex commands. With traditional technology, there are still barriers to fluid and interactive communication, and that means the human experience of the digital world depends too much on one's [technology] skills," said Manson, the company's CTO. "What we aim to do with our applications is abstract the means that devices impose, so that the use of technology is an agreeable experience in which we only concentrate on the ends--on the 'what' that we wish to achieve, and not on the 'how.'"

On the ground, that means interacting with computers via multi-touch screens; face, voice and gesture recognition; intelligent environments; and augmented reality systems. These ideas aren't totally new, of course. Pinching and expanding on an iPhone is a function of a multi-touch screen, for example, while the Wii uses gesture recognition and the yellow "first down line" used in TV football broadcasts is an example of augmented reality.

But such applications are just the tip of what companies like Interactive Dynamics want to achieve, and Natural User Interfaces still have a long way to run in the marketing, education and entertainment worlds. Interactive advertisements that let shoppers browse through a catalog using gestures, windshields that show data about the passing landscape, and motion capture systems that allow students to enter into interactive history movies are just some examples of what can be done.

In the end, though, it's really all about controlling a robot with a head spider.

It turns out that "brain waves" isn't such a terrible explanation for what's being done. While controlling the robot (known as the BeeBot), Manson, 36, wears a kind of headset (the spider) called a Brain-Computer Interface, or BCI (in this case, a $500 model made by Emotiv). The BCI presses electrodes against specific spots on the wearer's skull (the Emotiv model has 14), and these electrodes measure the brain's biopotentials, or electrical activity (much like an EEG or, in the heart, an ECG). Distinct thoughts or facial muscle movements produce distinct biopotential patterns. As Manson explains:

"To command something with the brain, it's not that the computer reads our thoughts, but that it's trained to recognize certain patterns. What's done is certain commands are executed when specific patterns are detected, which allows us to perform certain actions by thinking specific things. These aren't very abstract thoughts, being that the technology couldn't understand them, but basic ones, like for example thinking of pushing or lifting an object. This technology can also read the changing signals associated with facial movements, which is relatively easy to detect."

Before using a BCI, the user must train it. He thinks simple thoughts, like 'up' or 'down,' or makes facial movements that are meant to symbolize 'up' or 'down.' The biopotential patterns from those thoughts or movements are sent via a wireless protocol to a computer, which stores them. There, in a management program, they are associated with the commands for 'up' or 'down,' and whenever the user repeats the thoughts or facial movements the system receives the patterns, interprets them, and transmits the corresponding order. And so Manson can control the skittering BeeBot by winking or raising his eyebrows (as he did, truth be told, in the loud confines of the TEDxRosario exhibition space), or by thinking 'left' or 'right,' as he does when he's in quieter locales.
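In code, that train-then-command loop might look something like the sketch below. To be clear, this is a guess at the general shape of such a system, not the company's software: the 'left'/'right' labels, the per-channel power feature, the nearest-prototype matching and the simulated headset data are all assumptions, and `print` stands in for whatever actually sends orders to the BeeBot.

```python
import numpy as np

# Hypothetical mapping of trained thought labels to robot orders; neither the
# labels nor the command strings come from the article.
COMMAND_FOR = {"left": "TURN_LEFT", "right": "TURN_RIGHT"}

def power_features(window: np.ndarray) -> np.ndarray:
    """Reduce one (channels, samples) window to per-channel mean power --
    a deliberately crude stand-in for real spectral features."""
    return (window ** 2).mean(axis=1)

def train(recordings: dict[str, list[np.ndarray]]) -> dict[str, np.ndarray]:
    """Training phase: average each label's recorded windows into a prototype."""
    return {label: np.mean([power_features(w) for w in windows], axis=0)
            for label, windows in recordings.items()}

def dispatch(window: np.ndarray, prototypes: dict[str, np.ndarray], send) -> None:
    """Match a live window against the trained prototypes and issue the
    corresponding robot command."""
    feats = power_features(window)
    label = min(prototypes, key=lambda l: np.linalg.norm(feats - prototypes[l]))
    send(COMMAND_FOR[label])

# Simulated headset data: 14 channels x 128 samples per window, with the two
# "thoughts" distinguished only by signal scale so the toy example separates.
rng = np.random.default_rng(0)
recordings = {
    "left": [rng.normal(scale=1.0, size=(14, 128)) for _ in range(5)],
    "right": [rng.normal(scale=2.0, size=(14, 128)) for _ in range(5)],
}
prototypes = train(recordings)
dispatch(rng.normal(scale=1.0, size=(14, 128)), prototypes, send=print)  # TURN_LEFT
```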

You see, using a BCI successfully requires relaxation and focus, Manson says. For obvious reasons, you have to be able to think.

Just like porn drove innovation in online streaming media, video games are the big driver in the BCI industry, and for clear reasons: the idea of playing a super hi-res zombie shoot-em-up via thoughts alone is intoxicating for players, not to mention big bucks for game makers. But while the real money is in video games, the true life-changing possibilities are in healthcare and rehabilitation. The use of BCIs is a natural solution for people with neurological damage, notes Juan Dalmasso in an article on Interactive Dynamics in the tech blog Opinno. The University of Pittsburgh Medical Center has shown how a brain-computer interface can allow paralyzed men to move robotic arms, and the Austrian company g.tec has enabled people to send tweets with their thoughts. And now, Manson notes, changes in the economics of the industry mean that little companies in developing nations, like Interactive Dynamics, can help lead that innovation.

"Five years ago this technology wasn't accessible except for neuro-rehabilitation and neurological study centers," he said. "But thanks in part to the world of video games, the headsets have plunged in cost."

According to Manson, helping people with neurological damage is where his demonstration of the skittering BeeBot eventually leads. This year, he says, Interactive Dynamics will release a product for people suffering neurological damage. Though he won't specify exactly what it is, he says it will help people paralyzed from the neck down control where their body goes. The BeeBot was a proof-of-concept on that point, so you can imagine what sort of product his team is designing. And that product, he says, is just one development in an oncoming avalanche from his company and others.

"This is the beginning of something that has no end. In 10 or 15 years, it will be natural to control things via thoughts," he said. "This is the beginning of a large technology change in which interfaces like mouse and keyboards will disappear."

And be replaced, it seems, by brain waves.

This post was originally published on SmartPlanet.com
