
Bio-inspired intelligent machines

The goal of the EU-funded SenseMaker project is to create machines capable of sensing their environment, and it takes its inspiration from the human brain. This multi-sensorial approach began with modeling human perception and sensory fusion, then reproducing the models in silicon.
Written by Roland Piquepaille

Two days ago, I told you about how plant cells could control robots, and today, I want to introduce you to how biology has inspired perceptive machines. The goal of the EU-funded SenseMaker project is to create machines capable of sensing their environment, and it takes its inspiration from the human brain. This multi-sensorial approach began with modeling human perception and sensory fusion, then reproducing the models in silicon. So far, this project, which involved electronic engineers, computer scientists, neuroscientists, physicists, and biologists, has built hardware simulators, using FPGAs to emulate spiking neural networks. But the researchers admit there is a long way to go before building intelligent systems that work the way our brains do.

Here is the introduction of the IST Results article.

Teaching a machine to sense its environment is one of the most intractable problems of computer science, but one European project is looking to nature for help in cracking the conundrum. It combined streams of sensory data to produce an adaptive, composite impression of its surroundings in near real-time.

The SenseMaker project started in 2002 with a budget of 2.17 million euro. Its coordinator was Professor Martin McGinnity of the Intelligent Systems Engineering Laboratory (ISEL) at the University of Ulster.

Basically, the goal of this program was "to combine biological, physical and engineering technological approaches in the production of adaptable perception systems." In other words, it "took its inspiration from nature by trying to replicate aspects of the brain's neural processes, which capture sensory data from eyes, ears and touch, and then combines these senses to present a whole picture of the scene or its environment."

Below is a diagram showing how biological principles can be translated into a hardware representation (Credit: ISEL). And here is a link to a larger version.

SenseMaker

For more information about this project, please read this overview.

And here is how the multidisciplinary team used field-programmable gate arrays (FPGAs), which can be dynamically reconfigured, to implement arrays of spiking neural networks that emulate several components of our sensory system, vision in particular.

Spiking neurons are more biologically plausible than traditional neural network models, such as the McCulloch-Pitts threshold neuron, because the time between spikes and their cumulative effect determine when the neuron fires. By using an advanced FPGA computing platform, ISEL was able to implement large networks of spiking neurons and synapses and test biological approaches to sensory fusion. The FPGA approach allows for flexibility, both in terms of rapid prototyping and the ease with which different neuron models can be implemented and tested.
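To make that idea concrete, here is a minimal sketch of a leaky integrate-and-fire neuron in Python. It is a generic textbook model, not the neuron model actually implemented on the project's FPGAs, and the parameters (leak time constant, synaptic weight, threshold) are assumed values chosen only to show how spike timing and accumulation determine firing.

```python
# Minimal sketch of a leaky integrate-and-fire (LIF) spiking neuron.
# Each incoming spike adds to the membrane potential, the potential leaks
# away over time, and the neuron fires only when the accumulated potential
# crosses a threshold. All parameter values are illustrative assumptions,
# not taken from the SenseMaker hardware.

def lif_neuron(spike_times, t_max=100.0, dt=1.0,
               tau=20.0, weight=1.5, threshold=4.0):
    """Return the output spike times produced by one LIF neuron."""
    potential = 0.0
    inputs = set(spike_times)
    output_spikes = []
    t = 0.0
    while t < t_max:
        potential *= (1.0 - dt / tau)      # passive leak toward zero
        if round(t) in inputs:             # an input spike arrives now
            potential += weight            # cumulative effect of spikes
        if potential >= threshold:         # fire and reset
            output_spikes.append(t)
            potential = 0.0
        t += dt
    return output_spikes

# Closely spaced spikes add up before they leak away; spread-out spikes don't.
print(lif_neuron([10, 11, 12, 13]))   # fires around t=12
print(lif_neuron([10, 40, 70]))       # stays silent
```

The point of the toy example is that the same total number of input spikes produces a different result depending on their timing, which is exactly the sensitivity a simple threshold neuron cannot express.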

The team then built an application-specific integrated circuit (ASIC) as a prototype, with the idea of producing it at larger scale in the future.

These circuits process data in a manner similar to the biological brain, focusing resources on the most data-rich sensory stream. A user interface on a PC lets researchers interact with the system.
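As a loose illustration of what "focusing resources on the most data-rich sensory stream" can mean, the sketch below fuses two hypothetical sensory estimates, one visual and one tactile, by weighting each stream by its reliability. This is a standard inverse-variance fusion example, not the actual algorithm running in the SenseMaker hardware, and all the numbers are made up.

```python
# Rough sketch of reliability-weighted sensory fusion: two noisy estimates
# of the same quantity are combined with weights proportional to their
# inverse variance, so the cleaner, more data-rich stream dominates.
# Stream names and values are hypothetical, for illustration only.

def fuse_estimates(streams):
    """streams: list of (estimate, variance) pairs from different senses."""
    weights = [1.0 / var for _, var in streams]   # lower variance -> higher weight
    total = sum(weights)
    fused = sum(w * est for (est, _), w in zip(streams, weights)) / total
    return fused, [w / total for w in weights]

vision = (10.2, 0.5)   # precise visual estimate (low variance)
touch = (12.0, 4.0)    # vaguer tactile estimate (high variance)

value, weights = fuse_estimates([vision, touch])
print(value, weights)  # fused estimate sits much closer to the visual one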

So what's next?

"This type of research teaches us a lot about how biological systems work, and it could lead to new ways of treating people with sensory-related disabilities, though that kind of outcome will take a long time," says Professor McGinnity.
He says intelligent systems need to adapt to their environment without reprogramming. They must be able to react autonomously in a manner humans would describe as intelligent, and for that they need a perception system that makes them aware of their surroundings.

Several EU-funded projects will carry aspects of this work even further, leading to the possible creation of 'intelligent systems' able to feel, learn and self-organize. But this will be another story.

Sources: IST Results, February 9, 2006; and various websites

