National University of Singapore demonstrates artificial skin to help robots 'feel'

Researchers have touted the artificial skin as being able to detect touches more than 1,000 times faster than the human sensory nervous system.
Written by Asha Barbaschow, Contributor

Researchers from the National University of Singapore (NUS) on Wednesday announced work aimed at giving robots a sense of touch through artificial skin.

The two researchers, who are also members of the Intel Neuromorphic Research Community (INRC), presented research demonstrating the promise of event-based vision and touch sensing, combined with Intel's neuromorphic processing, for robotics.

The majority of today's robots operate solely on visual processing and lack the sense of touch that humans rely on.

The researchers hope to change this using their artificial skin, which NUS touts as being able to detect touches more than 1,000 times faster than the human sensory nervous system. The artificial skin, NUS said, can also identify the shape, texture, and hardness of objects "10 times faster than the blink of an eye".

Intel's Neuromorphic Computing Lab director Mike Davies said the research provides a glimpse of a future for robotics in which information is both sensed and processed in an event-driven manner.

"The work adds to a growing body of results showing that neuromorphic computing can deliver significant gains in latency and power consumption once the entire system is re-engineered in an event-based paradigm spanning sensors, data formats, algorithms, and hardware architecture," Davies said.


NUS said enabling a human-like sense of touch in robotics could significantly improve current functionality. It offered the example of robotic arms fitted with artificial skin that could easily adapt to changes in goods manufactured in a factory, using tactile sensing to identify and grip unfamiliar objects with the right amount of pressure to prevent slipping.

"The ability to feel and better perceive surroundings could also allow for closer and safer human-robotic interaction, such as in caregiving professions, or bring us closer to automating surgical tasks by giving surgical robots the sense of touch that they lack today," the university said.

Intel is supporting the researchers by providing a chip, deployed inside the robot, that draws accurate conclusions from the skin's sensory data in real time.

"Making an ultra-fast artificial skin sensor solves about half the puzzle of making robots smarter," assistant professor Benjamin Tee from the NUS Department of Materials Science and Engineering and NUS Institute for Health Innovation & Technology said.

"They also need an artificial brain that can ultimately achieve perception and learning as another critical piece in the puzzle. Our unique demonstration of an AI skin system with neuromorphic chips such as the Intel Loihi provides a major step forward towards power-efficiency and scalability."

In their initial experiment, the researchers used a robotic hand fitted with the artificial skin to read Braille, passing the tactile data via the cloud to Intel's Loihi neuromorphic research chip, which converted the micro bumps felt by the hand into a "semantic meaning".
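
The article doesn't describe how tactile events are decoded into letters, but a rate-coded toy version conveys the idea: each raised Braille dot drives frequent touch events, flat dots drive few, and the letter is recovered by matching accumulated event counts against per-letter templates. Everything below, from the six-taxel layout and spike rates to the three-letter alphabet, is a hypothetical sketch, not the NUS spiking-network implementation.

```python
import numpy as np

# Hypothetical setup: 6 taxels map to the 6 dots of a Braille cell.
# 1 = raised dot, 0 = flat (patterns follow standard Braille).
BRAILLE = {
    "a": np.array([1, 0, 0, 0, 0, 0]),  # dot 1
    "b": np.array([1, 1, 0, 0, 0, 0]),  # dots 1, 2
    "c": np.array([1, 0, 0, 1, 0, 0]),  # dots 1, 4
}

def spike_counts(pattern, duration=100, rate_high=0.8, rate_low=0.05):
    """Rate-code a dot pattern: raised dots spike often, flat dots
    rarely. Accumulated counts per taxel stand in for touch events."""
    rng = np.random.default_rng(0)
    rates = np.where(pattern == 1, rate_high, rate_low)
    return rng.binomial(1, rates, size=(duration, 6)).sum(axis=0)

def classify(counts):
    """Nearest-template decoding over expected spike counts per letter."""
    return min(BRAILLE,
               key=lambda k: np.linalg.norm(counts - spike_counts(BRAILLE[k])))

observed = spike_counts(BRAILLE["b"])
print(classify(observed))  # -> "b"
```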

Intel said Loihi achieved over 92% accuracy in classifying the Braille letters, while using 20 times less power than a standard von Neumann processor.

Building on this work, the NUS team further improved robotic perception by combining vision and touch data in a spiking neural network. They tasked a robot with classifying various opaque containers holding differing amounts of liquid, using sensory inputs from the artificial skin and an event-based camera.
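
The team's spiking network and Loihi pipeline aren't public, but the fusion step can be sketched in miniature: summarise each event stream (camera and skin) into a fixed-length feature vector, concatenate them, and classify the combined vector. The synthetic "full vs empty container" data, the feature binning, and the nearest-centroid classifier below are all illustrative assumptions standing in for the learned spiking model.

```python
import numpy as np

rng = np.random.default_rng(42)

def event_features(stream, n_bins=8):
    """Summarise an event stream (array of timestamps in [0, 1]) into a
    histogram of event counts per time bin -- a crude stand-in for the
    spiking features the NUS system would learn."""
    hist, _ = np.histogram(stream, bins=n_bins, range=(0.0, 1.0))
    return hist.astype(float)

def fuse(vision_events, touch_events):
    """Late fusion: concatenate per-modality features into one vector."""
    return np.concatenate([event_features(vision_events),
                           event_features(touch_events)])

def synth_trial(full):
    """Hypothetical data: a fuller container produces denser touch
    events when gripped; vision activity stays roughly constant."""
    touch_rate = 40 if full else 10
    vision = rng.uniform(0, 1, 20)
    touch = rng.uniform(0, 1, touch_rate)
    return fuse(vision, touch), int(full)

train = [synth_trial(full) for full in (True, False) for _ in range(20)]
X = np.stack([x for x, _ in train])
y = np.array([label for _, label in train])
centroids = {c: X[y == c].mean(axis=0) for c in (0, 1)}

def predict(features):
    """Nearest-centroid classification of the fused feature vector."""
    return min(centroids, key=lambda c: np.linalg.norm(features - centroids[c]))

x_test, y_test = synth_trial(True)
print(predict(x_test), "expected", y_test)
```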

"Using the same tactile and vision sensors, they also tested the ability of the perception system to identify rotational slip, which is important for stable grasping," NUS explained.

The captured sensory data was then sent to both a GPU and Loihi to compare processing capabilities. The researchers found that combining event-based vision and touch in a spiking neural network enabled 10% greater accuracy in object classification than a vision-only system.

"We're excited by these results. They show that a neuromorphic system is a promising piece of the puzzle for combining multiple sensors to improve robot perception. It's a step towards building power-efficient and trustworthy robots that can respond quickly and appropriately in unexpected situations," assistant professor Harold Soh from the Department of Computer Science at the NUS School of Computing added.
