
The 'conscience' of the BabyBot

European researchers took an engineering approach to defining consciousness by designing BabyBot, a robot modelled on a two-year-old child. Their experiments, while promising, don't entirely solve the problem of defining consciousness, so they're now designing new robots such as the iCub.
Written by Roland Piquepaille

Defining consciousness is a task that scientists and philosophers have never really managed. Recently, European researchers took an engineering approach to the problem by designing BabyBot, a robot modelled on a two-year-old child. They gave it rudimentary senses of sight, hearing and touch, and compared its 'understanding' of its environment with how 6- to 18-month-old infants perceive theirs. Their experiments, while promising, don't entirely solve the problem of defining consciousness. So Europe is now funding another research project, RobotCub, which will use more refined robots to try to find a better answer.

Here is the introduction to the IST Results news release.

BabyBot, a robot modelled on the torso of a two year-old child, is helping researchers take the first, tottering steps towards understanding human perception, and could lead to the development of machines that can perceive and interact with their environment.

Here is a picture of BabyBot sitting in its chair (Credit: ADAPT project). You'll find more details and other images of BabyBot on this page.

The BabyBot robot

The researchers used what they called "synthetic methodology," which is essentially learning by building something through a process of trial and error. They also studied how babies interact with their environment, and then they were ready to test their robot.

[They gave their robot] a minimal set of instructions, just enough for BabyBot to act on the environment. For the senses, the team used sound, vision and touch, and focused on simple objects within the environment.
There were two experiments: one where BabyBot could touch an object, and a second where it could grasp it. This is more difficult than it sounds. When you look at a scene, you unconsciously segment it into separate elements.
This is a highly developed skill, but by simply interacting with the environment, BabyBot did its engineering parents proud when it demonstrated that it could learn to successfully separate objects from the background.
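To make the segmentation idea concrete, here is a minimal sketch, in Python, of how a robot can separate an object from the background by acting on it: compare camera frames captured before and during a poke, and keep the pixels that changed. This is only an illustration under simple assumptions (greyscale frames, a static camera, a hand-picked threshold), not the ADAPT project's actual algorithm.

# A minimal sketch (not the ADAPT project's code) of motion-based figure/ground
# segmentation of the kind BabyBot's poking experiments rely on: the robot
# compares camera frames captured before and during a poke, and pixels that
# changed are treated as the object, everything else as background.
import numpy as np

def segment_by_poking(frame_before: np.ndarray,
                      frame_during: np.ndarray,
                      threshold: float = 25.0) -> np.ndarray:
    """Return a boolean mask of pixels that changed between the two frames.

    Both frames are greyscale images of the same shape (H, W); `threshold`
    is an assumed intensity-difference cutoff, not a value from the project.
    """
    diff = np.abs(frame_before.astype(np.float32) - frame_during.astype(np.float32))
    return diff > threshold

# Toy usage: a 5x5 "scene" where only the centre pixel changes after the poke.
before = np.zeros((5, 5), dtype=np.uint8)
during = before.copy()
during[2, 2] = 200          # the poked object shifted into this pixel
mask = segment_by_poking(before, during)
print(mask.astype(int))     # 1 marks the segmented object, 0 the background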

But this is still not enough to clearly define what consciousness is, so the European Union decided to fund another effort, the RobotCub project.

Here is a picture of one of the robots that will be used in this project, the iCub, as rendered in CAD software (Credit: RobotCub project).

The iCub robot

And what will be the impact of such work? According to the scientists, their research could "have a huge range of applications, from virtual reality, robotics and AI, to psychology and the development of robots as tools for neuro-scientific research."

Sources: IST Results, May 2, 2006; and various web sites

