
Photos: Inside a robotics testing lab

The flying bots and AI machines of the University of Essex...
By Nick Heath, Contributor
1 of 9 Nick Heath/silicon.com

Researchers at the University of Essex are building robots capable of understanding the world around them.

Inside the university's robot arena, researchers test out a variety of kit, from flying robots that track objects on the ground to mobile machines that mimic human behaviour.

silicon.com visited the university's robotics lab to look at how researchers are using visual-recognition and machine-learning technologies to bring new capabilities to robots.

This is a flying bot that has been designed to hunt and track objects on the ground.

PhD student John Oyekan said: "We foresee it being used in military scenarios such as tracking a rogue car or a criminal."

In these scenarios, the flier would use its camera to search for objects of interest between a series of waypoints set via GPS.
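
As a rough illustration of that search pattern, the Python sketch below steps through a list of GPS waypoints and runs a detection check at each one. The waypoint coordinates and both functions are hypothetical stand-ins, not the researchers' code.

```python
# Hypothetical search loop: fly between GPS waypoints, scanning for a
# target along the way. Both functions are illustrative stubs.
waypoints = [(51.8787, 0.9464), (51.8790, 0.9470), (51.8793, 0.9464)]

def goto(lat, lon):
    """Stub: command the flier to the given GPS coordinate."""
    print(f"flying to {lat:.4f}, {lon:.4f}")

def camera_detects_target():
    """Stub: run one visual-recognition pass on the current camera frame."""
    return False

def search_route():
    for lat, lon in waypoints:
        goto(lat, lon)
        if camera_detects_target():
            print("target found - switching to tracking")
            return True
    print("route complete - no target seen")
    return False

search_route()
```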

2 of 9 Nick Heath/silicon.com

During tests of the flier inside the robot arena, a computer tracks its position using a network of nine infrared cameras that are positioned around the arena's edge.

Infrared light is shone on the flier, bounces off reflective balls attached to its surface and is picked up by the IR cameras.

The computer calculates the flier's position in 3D space by triangulation: each camera records where the reflective markers appear in its 2D view, and combining these sightlines from several calibrated cameras pins down each marker in three dimensions.
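
For readers curious about the geometry, here is a minimal triangulation sketch in Python: given each camera's position and the direction of its sightline to a marker, it finds the 3D point closest to all of the sightlines in a least-squares sense. It illustrates the principle only; Vicon's commercial solver is far more sophisticated.

```python
# Minimal triangulation sketch (not Vicon's actual solver): find the 3D
# point closest, in a least-squares sense, to a set of camera sightlines.
import numpy as np

def triangulate(origins, directions):
    """origins: (n, 3) camera positions; directions: (n, 3) unit rays."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        P = np.eye(3) - np.outer(d, d)  # projects onto plane normal to ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Two cameras sighting a marker near (1, 1, 1) from different positions.
origins = np.array([[0.0, 0.0, 0.0], [4.0, 0.0, 0.0]])
dirs = np.array([[1.0, 1.0, 1.0], [-3.0, 1.0, 1.0]])
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
print(triangulate(origins, dirs))  # ~ [1. 1. 1.]
```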

In the lab, the robot flier uses the arena's IR-tracking system, known as Vicon, to track a remote-controlled car.

Just like the flier, the car has reflective balls stuck to its surface that bounce IR light back to the cameras situated around the arena.

This arrangement gives the computer the positions of both the flier and the car in the arena, allowing it to command the flier to follow the toy car.
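
A simple way to turn those two tracked positions into a "follow" command is a proportional controller on the offset between them. The sketch below illustrates the idea; the gain and hover offset are invented example values, not the lab's control code.

```python
# Illustrative P-controller: steer the flier toward the car's tracked
# position. The gain and hover height are made-up example values.
K_P = 0.8           # proportional gain (1/s)
HOVER_HEIGHT = 1.5  # metres to hold above the car (assumed)

def follow_command(flier_pos, car_pos):
    """Return a (vx, vy, vz) velocity command from two tracked positions."""
    ex = car_pos[0] - flier_pos[0]
    ey = car_pos[1] - flier_pos[1]
    ez = (car_pos[2] + HOVER_HEIGHT) - flier_pos[2]
    return (K_P * ex, K_P * ey, K_P * ez)

print(follow_command((0.0, 0.0, 1.5), (2.0, 1.0, 0.0)))  # -> (1.6, 0.8, 0.0)
```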

Computer processing for the Vicon tracking system is carried out on a 2.3GHz quad-core desktop PC, and the resulting data is sent wirelessly to the flying robot.

3 of 9 Nick Heath/silicon.com

In tests outside the robot arena, the flying bot would rely on its own camera and visual-recognition software written by the researchers to track objects on the ground.

Senior research officer Theo Theodoridis said: "These are computer-vision algorithms that use advanced mathematics and algebra to process, pixel by pixel, the colours and positions of the pixels in the image."

Theodoridis said the algorithms could be used to carry out a variety of visual tasks, such as facial recognition, lip reading and classifying objects.
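
As one concrete example of the kind of per-pixel colour processing Theodoridis describes, the sketch below finds the centroid of the largest red region in a camera frame using OpenCV. It illustrates the general technique, not the Essex software itself.

```python
# Illustrative colour-based object detection with OpenCV (assumes the
# opencv-python package). Thresholds are example values for a red target.
import cv2
import numpy as np

def find_target(frame_bgr):
    """Return the (x, y) centroid of the largest red region, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))  # red hue band
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    c = max(contours, key=cv2.contourArea)
    m = cv2.moments(c)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```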

A dual-core laptop PC carries out the visual-recognition processing and the result is sent back to the flier wirelessly.

At the moment, configuring the visual-recognition system to recognise new objects requires "at least a master's degree in computer science", but university researchers want to simplify the system to make it easier to use.

Researchers are also investigating how to get multiple bots to fly together as a swarm - where each flier communicates its position to other fliers, allowing them to co-ordinate their movements.
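
A classic illustration of such coordination is Reynolds-style flocking, where each flier combines cohesion, separation and alignment terms computed from its neighbours' broadcast positions. The toy update below shows the idea; it is not necessarily the scheme the Essex team is pursuing.

```python
# Toy flocking update: each flier combines cohesion (move toward the
# group), separation (avoid crowding) and alignment (match velocities).
# Weights and timestep are arbitrary example values.
import numpy as np

def flock_step(pos, vel, dt=0.1, w_coh=0.5, w_sep=1.0, w_ali=0.3):
    """pos, vel: (n, 2) arrays of flier positions and velocities."""
    n = len(pos)
    new_vel = vel.copy()
    for i in range(n):
        others = [j for j in range(n) if j != i]
        coh = pos[others].mean(axis=0) - pos[i]     # toward the group centre
        sep = sum((pos[i] - pos[j]) /
                  (np.linalg.norm(pos[i] - pos[j]) ** 2 + 1e-6)
                  for j in others)                  # away from near neighbours
        ali = vel[others].mean(axis=0) - vel[i]     # match group velocity
        new_vel[i] += dt * (w_coh * coh + w_sep * sep + w_ali * ali)
    return pos + dt * new_vel, new_vel
```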

4 of 9 Nick Heath/silicon.com

Theodoridis is using the Vicon IR-tracking system, one of whose cameras is seen here, to develop machine-learning software that can recognise aggressive behaviour, such as punching or pushing, when it is caught on camera.

Theodoridis said: "It could be used in security robots or surveillance systems."
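
The article does not describe how the recogniser works internally, but a plausible sketch is to extract simple motion features, such as speed and acceleration, from the tracked positions and train a standard classifier on labelled clips. Everything below, including the synthetic training data, is hypothetical.

```python
# Hypothetical sketch: classify short motion clips as aggressive or calm
# from simple speed/acceleration features, using scikit-learn's SVM.
import numpy as np
from sklearn.svm import SVC

def motion_features(track, dt=0.01):
    """track: (t, 3) marker positions -> [mean speed, max speed, max accel]."""
    v = np.diff(track, axis=0) / dt
    a = np.diff(v, axis=0) / dt
    speed = np.linalg.norm(v, axis=1)
    accel = np.linalg.norm(a, axis=1)
    return [speed.mean(), speed.max(), accel.max()]

# Synthetic training data: fast, jerky clips are labelled aggressive (1).
rng = np.random.default_rng(0)
calm = [np.cumsum(rng.normal(0, 0.001, (50, 3)), axis=0) for _ in range(20)]
rough = [np.cumsum(rng.normal(0, 0.01, (50, 3)), axis=0) for _ in range(20)]
X = [motion_features(t) for t in calm + rough]
y = [0] * 20 + [1] * 20
clf = SVC().fit(X, y)
print(clf.predict([motion_features(rough[0])]))  # -> [1]
```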

5 of 9 Nick Heath/silicon.com

This is Rex - a robot that displays elements of lifelike behaviour, using an artificial-intelligence routine written by Theodoridis.

The robot has been programmed to feel happy, bored, annoyed or normal, depending on his environment.

6 of 9 Nick Heath/silicon.com

Rex becomes bored when no one is interacting with him and happy when he is interacting with someone. Rex is able to recognise when someone is interacting with him by using a camera and a built-in facial-recognition system to detect faces, facial gestures and head movements.

Rex will behave differently depending on what mood he is in. For example, if he is bored, he will wander around the room looking for people with whom to interact.

He is also capable of "speaking" 100 recorded phrases that reflect what he is doing or his mood, such as introducing himself or warning people nearby to watch their step.
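
Behaviour like this is naturally expressed as a small state machine: the mood is a state, sensor events move the robot between states, and each state selects an action. The Python toy below illustrates the pattern; Rex's actual C++ routines are not shown in the article.

```python
# Toy mood state machine in the spirit of Rex's behaviour. The states,
# events and actions are illustrative, not the actual implementation.
TRANSITIONS = {
    ("normal",  "face_seen"):  "happy",
    ("bored",   "face_seen"):  "happy",
    ("happy",   "face_lost"):  "normal",
    ("normal",  "timeout"):    "bored",
    ("normal",  "obstructed"): "annoyed",
    ("annoyed", "path_clear"): "normal",
}

ACTIONS = {
    "happy":   lambda: print("say: 'Hello, I am Rex!'"),
    "bored":   lambda: print("wander the room looking for people"),
    "annoyed": lambda: print("say: 'Please watch your step.'"),
    "normal":  lambda: print("idle"),
}

mood = "normal"
for event in ["timeout", "face_seen", "face_lost", "obstructed", "path_clear"]:
    mood = TRANSITIONS.get((mood, event), mood)  # stay put on unknown events
    ACTIONS[mood]()
```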

Sonar sensors in Rex's base allow the robot to judge where objects are and navigate his way around the environment.

The sensors work by sending out an ultrasound wave and then timing how long it takes for the wave to bounce off a solid object.
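
The arithmetic is simply the speed of sound multiplied by half the round-trip time:

```python
# Echo ranging: distance = speed_of_sound * round_trip_time / 2.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def sonar_distance(round_trip_s):
    """Convert an echo's round-trip time into a range in metres."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

print(sonar_distance(0.01))  # a 10 ms echo -> about 1.7 m away
```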

Drawing on his experience in designing robotics software, Theodoridis built Rex in about three months.

"Rex is very popular as in many different cases it looks like being alive - people really love it, especially kids," Theodoridis said.

Rex's routines run on Fedora Linux on a dual-core processor, with his visual-recognition and AI software written in C++.

7 of 9 Nick Heath/silicon.com

This Pioneer robot is used to run a variety of programs that enable it to perform tasks, such as navigating a room or finding certain objects.

A program running on the robot's onboard computer lets the bot carry out visual recognition to detect and navigate to objects of a certain colour in its environment, such as the orange ball in the picture.
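
Closing the loop from detection to motion is typically done by turning in proportion to how far the detected blob sits from the centre of the image, as in this illustrative snippet (the gain and resolution are invented values):

```python
# Illustrative visual servoing: turn toward a detected colour blob by
# steering in proportion to its horizontal offset in the image.
IMAGE_WIDTH = 640   # pixels (example camera resolution)
K_TURN = 0.005      # rad/s per pixel of offset (made-up gain)

def steer_to_blob(centroid_x):
    """Map the blob's x-coordinate to a turn-rate command (rad/s)."""
    error = centroid_x - IMAGE_WIDTH / 2
    return -K_TURN * error  # blob right of centre -> turn right (negative)

print(steer_to_blob(480))  # blob on the right -> -0.8 rad/s
```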

8 of 9 Nick Heath/silicon.com

A navigation program enables the robot to build a 3D map of the room, seen above, using a laser rangefinder and ultrasound detectors.

The 3D map shows objects and walls as peaks, and clear space as flat areas where the robot can move around.
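
A standard way to build such a map, consistent with the description here though not confirmed as the exact method, is occupancy-grid mapping: each range reading at a known bearing marks a grid cell as occupied. A minimal 2D sketch:

```python
# Minimal occupancy-grid sketch: convert laser range readings taken at
# known bearings into occupied cells on a 2D grid. (The article's map
# renders occupancy as height peaks; here it is just a 0/1 cell value.)
import math

GRID_SIZE = 100  # cells per side
CELL_M = 0.1     # each cell covers 10 cm

def build_grid(robot_xy, scan):
    """scan: list of (bearing_rad, range_m) laser readings."""
    grid = [[0] * GRID_SIZE for _ in range(GRID_SIZE)]
    rx, ry = robot_xy
    for bearing, rng in scan:
        hx = rx + rng * math.cos(bearing)  # hit point in metres
        hy = ry + rng * math.sin(bearing)
        i, j = round(hx / CELL_M), round(hy / CELL_M)
        if 0 <= i < GRID_SIZE and 0 <= j < GRID_SIZE:
            grid[j][i] = 1                 # mark the cell as occupied
    return grid

grid = build_grid((5.0, 5.0), [(0.0, 2.0), (math.pi / 2, 1.5)])
print(grid[50][70], grid[65][50])  # both 1: hits at (7, 5) and (5, 6.5)
```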

9 of 9 Nick Heath/silicon.com

This is the RoboChair - a wheelchair that can be controlled via head or hand gestures, voice or brainwaves.

A camera captures the user's head or hand gestures, voice commands are given via a USB microphone headset, and an EEG cap worn on the head detects brainwaves.

A computer powered by an Intel Atom processor sits in the base of the wheelchair and drives the various voice-, gesture- and brainwave-recognition systems.

The RoboChair is designed to be controlled in various ways to make it suitable for use by people with a wide range of disabilities.
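
Supporting several input channels at once usually requires an arbitration layer that picks one command per control cycle. The article does not describe RoboChair's internals, so the priority scheme below is purely hypothetical:

```python
# Hypothetical command arbitration for a multi-modal wheelchair: each
# input channel may or may not produce a command this cycle; the
# highest-priority channel that spoke wins. Priorities are example choices.
PRIORITY = ["voice", "gesture", "eeg"]  # assumed ordering, for illustration

def arbitrate(commands):
    """commands: dict mapping channel name -> command string or None."""
    for channel in PRIORITY:
        cmd = commands.get(channel)
        if cmd is not None:
            return channel, cmd
    return None, "stop"  # no input this cycle: fail safe

print(arbitrate({"voice": None, "gesture": "turn_left", "eeg": "forward"}))
# -> ('gesture', 'turn_left')
```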
