
Robots 'feel' objects with their whiskers

European researchers have developed new sensors modeled on mouse whiskers and equipped robots with them. These robots can 'feel' objects and their textures, which might have important consequences not only for future robots, but also for neuroscience and the study of human cognition.
Written by Roland Piquepaille

European researchers have developed new sensors modeled on mouse whiskers and equipped robots with them. According to IST Results, such robots can 'feel' objects and their textures. The researchers also discovered that their robots exhibited what is called 'emergent behavior,' meaning they developed a kind of instinct without having been programmed to do so. This might have important consequences not only for future robots, but also for neuroscience and the study of human cognition.

Here is the introduction from IST Results about this project, which started four years ago and cost about 2.5 million euros.

Robots that 'feel' objects and their texture could soon become a reality thanks to the innovative and interdisciplinary research of the AMouse, or artificial mouse, project.

Below is a picture of one AMouse robot, with an omnidirectional camera (A) and its artificial whisker sensors (B) which are mounted on a mobile Khepera platform (C) (Credit: Artificial Intelligence Lab, University of Zurich, Switzerland).

The AMouse robot

This picture comes from a scientific paper called "Emergence of Coherent Behaviors from Homogenous Sensorimotor Coupling" (PDF format, 7 pages, 2.46 MB).

In a previous paper called "Simulating Whisker Sensors - on the Role of Material Properties for Morphology, Behavior and Evolution" (PDF format, 8 pages, 1.15 MB), you can see a virtual environment for these robots. "The agent is shown with its left whiskers in contact with an object it has to avoid. The floating sphere is the target the agent is seeking."

The AMouse robot in a virtual environment

While robots that can 'feel' and 'recognize' objects in their environment are impressive in themselves, the real advance seems to lie in 'emergent behavior.'

Emergent behaviour is a primary characteristic of life. In biological systems the combination of various data, like touch and sight, reinforces specific neural pathways. These pathways come to dominate and can cause an entity to 'behave' in a specific way.
In one startling outcome an AMouse robot demonstrated what appeared to be emergent behaviour: it developed a homing instinct without any pre-programming of any kind.
"Essentially we put in the sensors and then wire them up through the robots 'brain', its CPU. We just switch it on without giving it instructions of any kind," says Simon Bovet, a Ph.D. student at the University of Zurich. When he threw the switch his robot started moving about the room but always returned to the spot where it began.

I must say that this looks almost magical. Is there a possibility that some initialization parameters were present in the CPU, such as the location of the robot when it was switched on?

In any case, such emergent behavior could have important consequences for neuroscience and robotics research.

"We can study neural pathways and neural coding in a machine, in a way that's currently impossible in humans. In a robot we can isolate a particular neural pathway to see what happens to other neurons when we trigger a specific one. In humans, if we stimulate one neuron it will influence changes a large number of other neurons, so it's impossible to track what's going on," [said Dr Andreas K. Engel, coordinator of the AMouse project.]

And don't hesitate to post your comments about this intriguing project. Thanks!

Sources: IST Results, November 15, 2005; and various web sites

