The London Science Museum has opened its doors on a four-day display of 20 robots from around Europe. Robotville 2011, which began on Thursday, is designed to show the latest research into how robots and humans interact, and includes machines that have never been put on public display before.
Some of the research focuses on how people can benefit from machines, while other projects look at how robots can learn from humans. For example, the i-Sur robot is designed to autonomously plan and carry out simple surgical operations, Nao can be programmed to do just about anything, and Dora the Explorer maps and remembers its environment.
Kaspar, pictured above, has been developed for research into the effect of facial expressions, gestures and imitation in human-robot interaction. Recently, he has been used to study whether robots can teach social interaction skills to children with autism, or be used in therapy with them.
The robot was created by the school of computer science at the University of Hertfordshire. He is part of the European Commission-funded RobotCub project, which provides an open-source platform for building robots to help researchers study child development and cognition.
"Our aim is to study what types of human-robot interactions a minimal set of expressive robot features can afford. The goal is not perfect realism, but optimal realism for rich interaction," the Adaptive Systems Research Group at the university said on the Kaspar project page.
Kaspar's face is a silicone rubber mask, stretched over an aluminium frame, with a mouth that can open and smile. He has eight degrees of freedom of movement in the head and neck, with six degrees in the arms and hands. Video cameras are used for the eyes, which have two degrees of freedom.
Like Kaspar, iCub (pictured) takes his cues from children. In this case, however, the idea is to look at techniques for learning in robots.
iCub starts life with basic pre-programmed knowledge, such as how to move, detect motion and recognise simple voice commands. He can also recognise unnamed 'blobs'. Building on this, researchers can teach iCub what certain items are — in the case of the picture above, an octopus and a block. They can then direct him to distinguish between items and to pick up specific objects. In doing this, he learns about his own abilities and how to interact with the world around him.
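The teaching loop described above — associating a spoken label with a previously unnamed visual 'blob' — can be sketched in a few lines of Python. This is a hypothetical illustration only; the identifiers and function names are invented and are not part of the iCub software.

```python
# Hypothetical sketch of iCub-style label learning: a researcher names
# an unnamed visual 'blob', after which the robot can recognise it.
# Blob IDs and labels are invented for illustration.

known_objects = {}  # blob signature -> taught label


def teach(blob_id, label):
    """Associate a human-supplied label with a detected blob."""
    known_objects[blob_id] = label


def recognise(blob_id):
    """Return the taught label, or flag the blob as still unnamed."""
    return known_objects.get(blob_id, "unknown blob")


teach("blob_17", "octopus")
teach("blob_42", "block")

print(recognise("blob_17"))  # octopus
print(recognise("blob_99"))  # unknown blob
```

Once an object has been named, a command such as "pick up the octopus" can be resolved back to the blob it refers to — the essence of building on pre-programmed perception with taught knowledge.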
As iCub is a main product of RobotCub, his software is open source and will run on either Windows or Linux systems. An emulator is available to download from the project home page, Lorenzo Natale, senior researcher at the Italian Institute of Technology, which developed this iCub, told ZDNet UK. More than 20 copies of the robot have been built to date, he noted.
Dora the Explorer (above) was developed by the University of Birmingham. According to Nick Hawes, a lecturer in intelligent robotics at the university's school of computer science, she is a "curious" robot.
Unlike other robots, Dora is designed to fill in gaps in her own knowledge through exploration of her surroundings. She does this in a logical way, based on weighing the probability that her actions will yield the correct answer for the least 'cost'.
While Dora does have some 'common sense' understanding mined from Google and manually programmed, she can only respond to prompts for specific objects. For example, she can understand and carry out the command "Bring me the Frosties" if she has been taught what Frosties are, Hawes said. However, she would not understand the command, "Bring me the cereal".
Also, if Dora is asked to retrieve the Frosties, she would know the most likely place to find the box is the kitchen and would look there first, rather than in the lounge or bedroom.
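Dora's trade-off between the likelihood of finding an object and the cost of looking can be sketched as a simple expected-value calculation. The probabilities and costs below are invented for illustration and do not come from the Birmingham project's code.

```python
# Hypothetical sketch of Dora-style search planning: pick the place
# with the best ratio of success probability to search cost.
# All values are invented for illustration.

def next_room(rooms):
    """Choose the room with the highest probability-to-cost ratio."""
    return max(rooms, key=lambda r: r["p_found"] / r["cost"])


rooms = [
    {"name": "kitchen", "p_found": 0.7, "cost": 5.0},  # likely location
    {"name": "lounge",  "p_found": 0.2, "cost": 4.0},  # unlikely
    {"name": "bedroom", "p_found": 0.1, "cost": 8.0},  # unlikely and far
]

print(next_room(rooms)["name"])  # kitchen is searched first
```

With these numbers the kitchen wins easily, matching the behaviour described above: Dora heads for the most promising room first rather than searching exhaustively.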
Hawes said the University of Birmingham is looking at ways for Dora to be able to mine Google automatically for information. One challenge in this, though, is that information on the internet is not formatted in a standard way, making it harder to be machine-readable, he said.
When put in a new location, Dora has no information about her surroundings. She uses lasers placed at shin height, plus an Xbox Kinect unit and dual cameras at head height, to draw a map of the environment. The method is known as simultaneous localisation and mapping (SLAM).
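The map-building half of this process can be illustrated with a toy occupancy grid, where range readings are converted into obstacle cells. This is a deliberately minimal sketch with invented readings — full SLAM also has to estimate the robot's own pose at the same time, which is the hard part and is not shown here.

```python
# Toy occupancy-grid update (not the Dora project's actual code).
# A robot at a *known* pose marks cells hit by range readings as
# obstacles; real SLAM must estimate the pose as well.
import math

GRID = 10
grid = [["." for _ in range(GRID)] for _ in range(GRID)]

robot = (5, 5)                # known (x, y) cell — an assumption
grid[robot[1]][robot[0]] = "R"

# Each reading: (bearing in radians, distance in cells) — invented values.
readings = [(0.0, 3), (math.pi / 2, 2), (math.pi, 4)]

for bearing, dist in readings:
    x = robot[0] + round(dist * math.cos(bearing))
    y = robot[1] + round(dist * math.sin(bearing))
    grid[y][x] = "#"          # mark the struck cell as an obstacle

for row in grid:
    print(" ".join(row))
```

Repeating this as the robot moves fills in walls and doorways, which is essentially what the blank-to-populated map described below shows at the museum.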
"For the show, we've had to programme invisible walls, otherwise Dora would just keep wandering off exploring her surroundings," Hawes said.
The image above shows what Dora has learned since arriving at the London Science Museum. Initially, the map was blank. As the robot has explored the area, it has marked out things such as walls and doors.
The red dots in the image are an outline of a person stood in front of Dora, while the circular pie-chart-like markers make it easier for Dora to navigate her way back to a specific point.
Ecce (Embodied Cognition in a Compliantly Engineered Robot) is a robot based around human anatomy, but rather than taking just the skeletal structure and building conventional robotic internals, it mirrors our anatomy closely. The ultimate aim is to be the first truly anthropomimetic robot and to find out more about how the human brain works.
The current version is the second in the project, which includes the University of Sussex among its collaborators, and an Ecce 3 is on the way. That is expected in early 2012.
Ecce 2 contains around 80 actuators, one for each 'muscle', and each actuator is made up of a screwdriver motor, gearbox, spindle, a piece of 'kiteline' used as a tendon and an elastic shock cord.
Nao, first developed by French company Aldebaran Robotics in 2004, rose to fame in 2007 when it replaced Sony's robot dog, Aibo, as the robot used in the Robot Soccer World Cup.
It is a programmable, 57cm-tall robot made up largely of actuators, electric motors and a host of sensors, including two cameras, four microphones, a sonar distance sensor, two infrared emitters and receivers, nine tactile sensors and eight pressure sensors. It also has dual CPUs, a voice synthesiser and speakers.
It is designed to be programmable to carry out many different tasks, some more serious than others. It was dancing to a Michael Jackson song when the picture above was taken.
Charly is another robot designed to study how humans react to robots with human-like appearance, and to see how people like robots to look.
If you stand in front of Charly for long enough, his face will gradually — and somewhat eerily — morph into that of the person stood in front of him.
One of the interesting things about the psychology of this phenomenon is that people tend to like people who look like them — their family, for example. However, there is a tipping point where the robot starts to look disturbingly uncanny, Mick Waltons from the robotics unit at the University of Hertfordshire, which created Charly, explained to ZDNet UK.
The i-Sur project, pictured, aims to give surgical robotics the cognitive abilities to perform surgical tasks autonomously, but under the supervision of a human surgeon.
i-Sur has been designed to execute every task of a minor procedure. This includes understanding the surgeon's intentions, planning the operation and performing the operation. It will also detect potential problems and suggest solutions to the human surgeon via a control module.
Many robots are seen as cold, unflinching machines, but Flash aims to change that.
Developed by the Wroclaw University of Technology in Poland, Flash has a head made of moving metal discs designed to be able to mimic human facial expressions.
Flash is being used to study the ways in which robots can better interact with humans if they are given the ability to physically express emotions.