Dynamic vision sensor provides eyes for agile UAV

Oct. 8, 2014

Researchers from the Robotics and Perception Group at the University of Zurich have used a newly developed image sensor called the Dynamic Vision Sensor (DVS), which transmits event-based, per-pixel data, to provide vision for a fast-maneuvering unmanned aerial vehicle (UAV).

The team fitted a modified AR UAV with the DVS, an image sensor developed by the Institute of Neuroinformatics, also located at the University of Zurich. The DVS is a 128 x 128 pixel vision sensor that, unlike conventional image sensors, which clock image data out of the device synchronously, works asynchronously: it outputs an event only when the brightness at a pixel changes. This not only reduces the power the device requires but also lowers the data bandwidth.
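
To illustrate the event-based output model, here is a minimal Python sketch of a contrast-threshold pixel; the event fields, the log-brightness model, and the threshold value are illustrative assumptions, not the DVS specification:

```python
from dataclasses import dataclass

@dataclass
class DVSEvent:
    x: int             # pixel column (0..127 on the 128 x 128 sensor)
    y: int             # pixel row
    timestamp_us: int  # microsecond timestamp assigned by the sensor
    polarity: int      # +1 for a brightness increase, -1 for a decrease

def emit_events(prev_log, curr_log, t_us, threshold=0.15):
    """Toy contrast-threshold model (illustrative): a pixel produces an
    event only when its log-brightness change crosses the threshold.
    Static pixels stay silent, which is why both power draw and output
    bandwidth are low compared with a frame-based sensor."""
    events = []
    for y, row in enumerate(curr_log):
        for x, value in enumerate(row):
            delta = value - prev_log[y][x]
            if abs(delta) >= threshold:
                events.append(DVSEvent(x, y, t_us, 1 if delta > 0 else -1))
    return events
```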

In this UAV project, the team presented the first onboard perception system to use the DVS for six-degrees-of-freedom (6-DOF) localization during high-speed maneuvers. Much like the human eye, the DVS transmits only pixel-level brightness changes, as they occur, with microsecond resolution, making it possible to build a perception pipeline whose latency is negligible compared to the dynamics of the robot, according to the university.
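
Continuing the sketch above, the latency argument can be made concrete; the stream formats and function names here are assumptions for illustration, not the team's code:

```python
def frame_pipeline(frames, process):
    # Frame-based: perception must wait for the next complete image, so
    # latency is at least one frame interval (e.g., ~33 ms at 30 fps).
    for t_ms, image in frames:
        yield t_ms, process(image)

def event_pipeline(events, update, state=None):
    # Event-based: the state estimate is refined the instant each event
    # arrives, so latency is just the per-event processing time, which
    # can be on the order of microseconds for an incremental update.
    for ev in events:
        state = update(state, ev)
        yield ev.timestamp_us, state
```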

The team conducted experiments to determine whether the DVS is a viable vision system for high-speed maneuvering of the UAV. Its event-based pose estimation algorithm first integrates events until a known pattern is detected; it then tracks the line segments that define the pattern's borders, updating both the lines and the pose at microsecond resolution as each new event arrives.
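
As a rough illustration of such a per-event tracking loop (a sketch under an assumed line parameterization and update rule, not the authors' implementation), each event is assigned to the nearest tracked border line and immediately refines it:

```python
import math

def line_distance(px, py, rho, theta):
    """Distance from a point to a line in normal form:
    x*cos(theta) + y*sin(theta) = rho."""
    return abs(px * math.cos(theta) + py * math.sin(theta) - rho)

def update_line(rho, theta, ev, gain=0.05):
    """Crude incremental fit: nudge the line's offset toward the event.
    The paper's actual update rule is likely more sophisticated."""
    measured_rho = ev.x * math.cos(theta) + ev.y * math.sin(theta)
    return rho + gain * (measured_rho - rho), theta

def track_borders(events, lines, max_dist=3.0):
    """Per-event loop: assign each incoming event to its nearest border
    line and refine that line immediately, so the estimate is updated
    at microsecond resolution rather than at a fixed frame rate."""
    for ev in events:
        dists = [line_distance(ev.x, ev.y, rho, theta) for rho, theta in lines]
        i = min(range(len(lines)), key=lambda k: dists[k])
        if dists[i] <= max_dist:
            lines[i] = update_line(*lines[i], ev)
        # a 6-DOF pose would be re-estimated from the updated lines here
    return lines
```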

In the tests, the team found that the rotation of the UAV could be estimated with surprising accuracy: during the experimental flights, the algorithm tracked the DVS trajectory for 24 of 25 flips (96%). Position was harder to obtain, however, and could only be estimated very noisily, because very small apparent motion produces few events. The team expects that the results would improve significantly with a higher-resolution DVS.

View the research paper.

Share your vision-related news by contacting James Carroll, Senior Web Editor, Vision Systems Design

About the Author

James Carroll

Former VSD Editor James Carroll joined the team in 2013. Carroll covered machine vision and imaging from numerous angles, including application stories, industry news, market updates, and new products. In addition to writing and editing articles, Carroll managed the Innovators Awards program and webcasts.
