Vision-guided robot learning to fly real airplanes
Researchers from the Korea Advanced Institute of Science and Technology have modified a small humanoid robot to monitor and control a simulated aircraft cockpit.
The vision-guided robot can identify and operate all of the buttons and controls within the cockpit of a normal light aircraft designed for a human pilot, according to IEEE Spectrum. While most of the inputs come from the simulator itself (roll, pitch, yaw, airspeed, GPS location), the robot uses its vision system for tasks such as identifying the runway using edge detection. To do this, the robot features a FireFly MV camera from Point Grey.
This USB 2.0 camera features an Aptina MT9V022 0.3 MPixel CMOS image sensor with a 6 µm x 6 µm pixel size. The camera achieves a frame rate of 60 fps at 752 x 480 pixels and also features 8- and 16-bit digital data output, pixel binning and region of interest modes, as well as gamma, lookup table, hue, saturation, and sharpness image processing functionalities.
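The researchers have not published their runway-detection code, but the gradient-based edge detection the article describes can be sketched in a few lines. The following is a minimal, illustrative pure-Python Sobel edge detector operating on a small synthetic grayscale frame; the function name, threshold, and toy image are assumptions for demonstration only.

```python
# Illustrative sketch only -- PIBOT's actual vision pipeline is not public.
# A Sobel operator estimates horizontal and vertical intensity gradients;
# pixels whose gradient magnitude exceeds a threshold are marked as edges.

def sobel_edges(img, threshold=2.0):
    """Return a binary edge map for a 2D grayscale image (list of lists)."""
    h, w = len(img), len(img[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Horizontal gradient (responds to vertical edges, e.g. runway sides)
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            # Vertical gradient (responds to horizontal edges, e.g. the threshold line)
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            if (gx * gx + gy * gy) ** 0.5 > threshold:
                edges[y][x] = 1
    return edges

# Toy frame: a bright vertical stripe (e.g. a runway edge) on a dark background.
frame = [[0, 0, 9, 9] for _ in range(4)]
edge_map = sobel_edges(frame)
```

In a real system the edge map would feed a line-fitting stage (for example a Hough transform) to recover the runway centerline from the camera image.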
With the simulator and its vision system, according to the researchers involved in the project, the PIBOT satisfies the various requirements specified in the Airplane Flying Handbook published by the Federal Aviation Administration. A demonstration of the PIBOT system performing a simulated takeoff and landing was showcased at the International Conference on Intelligent Robots and Systems in Chicago last week.
The researchers have already used the PIBOT system to autonomously fly a small-scale model biplane, and this work is expected to be presented at a forthcoming conference. The team is still working through some perception and landing challenges, but expects to resolve them soon, according to IEEE Spectrum.
Via IEEE Spectrum.
Share your vision-related news by contacting James Carroll, Senior Web Editor, Vision Systems Design