
Bio-inspired retina allows drones to almost see in the dark with no motion blur

Developed for drones, these new cameras could fundamentally change where and when UAVs operate
Written by Greg Nichols, Contributing Writer

A group of researchers from the University of Zurich and NCCR Robotics is giving drones a new way to see.

Their innovation is an eye-inspired camera that can easily cope with high-speed motion and even see in near-dark conditions--crucial functionality as drones become more autonomous and applications for drones more widespread.

Autonomous and semi-autonomous drones need to know their precise position and orientation in space at all times to fly safely. Commercial drones use GPS, but that can be unreliable in cities.

Conventional cameras can help drones fix their location, but they need ample light to function effectively. Autonomous drones that rely on computer vision are also restricted to flying below speeds that cause motion blur, which renders vision algorithms useless.

To overcome these limitations, most professional drone platforms currently have a number of bulky, expensive sensors, such as laser scanners.

To solve these problems, the Swiss team turned to a so-called event-based camera. Event cameras are bio-inspired vision sensors that output pixel-level brightness changes instead of standard intensity frames. Crucially, because each pixel responds independently to changes in brightness, the sensor doesn't need full illumination to generate a usable image.


"This research is the first of its kind in the fields of artificial intelligence and robotics, and will soon enable drones to fly autonomously and faster than ever, including in low-light environments," says Prof. Davide Scaramuzza, Director of the Robotics and Perception Group at UZH.

To conceptualize how these cameras work, it's helpful to understand when they're useful. Traditional video can be broken down into a series of frames containing rich information at the pixel level about brightness and color. Event cameras, by contrast, only compare brightness at each pixel from one moment to the next.

That means that when standing still, an event camera yields very little usable information. Put it on a drone whipping through the sky, however, and the stream of brightness-change events gives a computer enough data to reconstruct the drone's surroundings.
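The per-pixel comparison described above can be sketched in a few lines. This is a toy illustration, not the UZH team's implementation: real event cameras fire asynchronously in hardware, and the contrast threshold and frame values below are illustrative assumptions.

```python
import math

# Illustrative contrast threshold: the log-brightness change a pixel must
# see before it fires an event (value chosen for this sketch, not from the
# research).
CONTRAST_THRESHOLD = 0.2

def events_between(prev_frame, next_frame, threshold=CONTRAST_THRESHOLD):
    """Compare two grayscale frames pixel by pixel and emit (x, y, polarity)
    events where the log-brightness change exceeds the threshold.
    Polarity is +1 for brightening, -1 for darkening."""
    events = []
    for y, (prev_row, next_row) in enumerate(zip(prev_frame, next_frame)):
        for x, (p, n) in enumerate(zip(prev_row, next_row)):
            # Log scale mimics the sensor's contrast (relative) sensitivity.
            delta = math.log(n + 1e-6) - math.log(p + 1e-6)
            if abs(delta) >= threshold:
                events.append((x, y, 1 if delta > 0 else -1))
    return events

# A static scene produces no events; a bright spot appearing produces one.
static = [[0.5, 0.5], [0.5, 0.5]]
moved  = [[0.5, 0.9], [0.5, 0.5]]
print(events_between(static, static))  # -> []
print(events_between(static, moved))   # -> [(1, 0, 1)]
```

This also shows why a stationary event camera sees almost nothing: with no brightness change at any pixel, no events fire at all, while motion lights up exactly the pixels where the scene changes.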

The UZH researchers designed software to efficiently process the output from these cameras. So far they've successfully used the software to enable autonomous flight at higher speeds and in lower light than currently possible with commercial drones--at least in limited tests.

A spokesman from NCCR said that drones equipped with such a system could assist search and rescue teams in scenarios where conventional drones would be of no use--for example on missions at dusk or dawn or when there is too little light for normal cameras to work. They would also be able to fly faster in disaster areas, where time is critical to saving survivors.

Because they don't require laser scanners, drones using these cameras could soon be relatively cheap, opening up new possibilities for commercial and professional drone use.

But there's no solid timeline for that to happen.

"There is still a lot of work to be done before these drones can be deployed in the real world since the event camera used for our research is an early prototype," says PhD Student Henri Rebecq. Professor Scaramuzza adds: "We think this is achievable, however, and our recent work has already demonstrated that combining a standard camera with an event-based camera improves the accuracy and reliability of the system."
