News Release

Drones can almost see in the dark

Peer-Reviewed Publication

University of Zurich

Video (Flying Robots): In the experiment, the drones fly autonomously and faster than ever, including in low-light environments. Credit: UZH

To fly safely, drones need to know their precise position and orientation in space at all times. While commercial drones solve this problem with GPS, this only works outdoors and is not very reliable, especially in urban environments. Furthermore, the conventional cameras mounted on drones work only when there is plenty of light, and the drone's speed has to be limited; otherwise the resulting images are motion-blurred and cannot be used by computer vision algorithms. To get around these limitations, professional drones use sensors that are elaborate, expensive, and bulky, such as laser scanners.

First combination of artificial intelligence and robotics

A group of researchers from the University of Zurich and the Swiss research consortium NCCR Robotics has now developed an innovative alternative approach, enabling drones to fly in a wide range of conditions using an eye-inspired camera that can easily cope with high-speed motion. It can even see in the dark much more effectively than the conventional cameras currently used by all commercial drones. "This research is the first of its kind in the fields of artificial intelligence and robotics, and will soon enable drones to fly autonomously and faster than ever, including in low-light environments," says Prof. Davide Scaramuzza, Director of the Robotics and Perception Group at UZH. He and his team have already taught drones to use their onboard cameras to infer their position and orientation in space.

Camera captures light changes for each pixel

Event cameras, which were invented at UZH together with ETH Zurich, do not need to capture full light on the entire bio-inspired retina in order to have a clear picture. Unlike their conventional counterparts, they only report changes in brightness for each pixel, ensuring perfectly sharp vision even during fast motion or in low-light environments. The UZH researchers have also designed new software able to efficiently process the output from such cameras, harnessing this to enable autonomous flight at higher speeds and in lower light than currently possible with commercial drones.
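The following is a minimal Python sketch, not the sensor's circuitry or the UZH software: it approximates the per-pixel behaviour described above by thresholding the log-intensity change between two grayscale frames. The function name events_from_frames and the contrast threshold value are illustrative assumptions; a real event camera works asynchronously at each pixel rather than by comparing whole frames.

```python
import numpy as np

CONTRAST_THRESHOLD = 0.15  # assumed value; real sensors are configurable

def events_from_frames(prev_frame: np.ndarray, next_frame: np.ndarray):
    """Return (row, col, polarity) tuples where brightness changed enough.

    polarity is +1 for a brightness increase and -1 for a decrease; pixels
    whose brightness stays (nearly) constant produce no output at all, which
    is why the data stream remains sparse even during fast motion.
    """
    eps = 1e-6  # avoid log(0)
    delta_log = np.log(next_frame + eps) - np.log(prev_frame + eps)

    rows, cols = np.where(np.abs(delta_log) >= CONTRAST_THRESHOLD)
    polarities = np.sign(delta_log[rows, cols]).astype(int)
    return list(zip(rows.tolist(), cols.tolist(), polarities.tolist()))

if __name__ == "__main__":
    # Synthetic 4x4 frames with pixel values in [0, 1]
    prev = np.full((4, 4), 0.5)
    nxt = prev.copy()
    nxt[1, 2] = 0.9   # one pixel got brighter
    nxt[3, 0] = 0.1   # one pixel got darker
    print(events_from_frames(prev, nxt))
    # -> [(1, 2, 1), (3, 0, -1)]  only the changed pixels generate events
```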

Drones equipped with an event camera and the software designed by the Swiss researchers could assist search and rescue teams in scenarios where conventional drones would be of no use -- for example on missions at dusk or dawn or when there is too little light for normal cameras to work. They would also be able to fly faster in disaster areas, where time is critical in saving survivors.

Prototype ready for the future

"There is still a lot of work to be done before these drones can be deployed in the real world since the event camera used for our research is an early prototype. We have yet to prove that our soft-ware also works reliably outdoors," says PhD Student Henri Rebecq. And Professor Scaramuzza adds: "We think this is achievable, however, and our recent work has already demonstrated that combining a standard camera with an event-based camera improves the accuracy and reliability of the system."

###

Literature:

Antoni Rosinol Vidal, Henri Rebecq, Timo Horstschaefer, Davide Scaramuzza. Hybrid, Frame and Event-based Visual Inertial Odometry for Robust, Autonomous Navigation of Quadrotors. IEEE Robotics and Automation Letters, September 19, 2017.

