Page 2: Vision-guided humanoid robot and UAV work in tandem to fight fires
Editor's note: This article is continued from page one.
In addition to the humanoid robot, a small quadrotor developed by researchers at Carnegie Mellon University's Robotics Institute and spin-off company Sensible Machines was also tested aboard the USS Shadwell. In the demonstration, the UAV flew through confined spaces inside the ship to quickly gather situational awareness and guide firefighting and rescue efforts. As part of the DC-21 concept, information gathered by the UAV would be relayed to the SAFFiR robot, which would work alongside human firefighters to suppress fires and evacuate casualties.
"Flying autonomously through narrow doorways in darkness and smoke poses a number of technical challenges for these small drones," said Sebastian Scherer, systems scientist at CMU's Robotics Institute. "But this capability, known as 'fast lightweight autonomy,' will have numerous applications beyond shipboard fires, such as investigation of building fires and inspection of hazardous chemical tanks and power plant cooling towers."
The UAV uses an RGB-D camera, similar to the Microsoft Kinect 3D vision sensor, to build a map of fire areas.
"It actually works better in the dark," Scherer noted, because there's less ambient light to interfere with the infrared light the camera projects. "We flipped it around, using mainly the depth camera to build our maps."
In addition to the depth sensor, the UAV carries a forward-looking infrared camera to detect fires and a downward-facing optical-flow camera to track its own motion as it navigates.
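To illustrate the basic idea behind depth-camera mapping, the sketch below converts a single horizontal row of depth readings into a simple 2D occupancy grid. This is not the CMU team's actual code; the function name, grid dimensions, field of view, and cell size are all assumptions chosen for the example.

```python
import math

def depth_scan_to_occupancy(depths, fov_deg=60.0, cell_m=0.1, size=40):
    """Mark grid cells as occupied where a horizontal row of depth
    readings (metres) hits an obstacle. The camera sits at the grid
    centre, facing along +x; readings span the field of view fov_deg."""
    grid = [[0] * size for _ in range(size)]
    n = len(depths)
    half = math.radians(fov_deg) / 2.0
    for i, d in enumerate(depths):
        if d is None or d <= 0:
            continue  # no return (e.g. smoke, out of sensor range)
        # Angle of this ray, swept evenly from -half to +half.
        angle = -half + (i / max(n - 1, 1)) * 2.0 * half
        x = d * math.cos(angle)
        y = d * math.sin(angle)
        cx = int(size / 2 + x / cell_m)
        cy = int(size / 2 + y / cell_m)
        if 0 <= cx < size and 0 <= cy < size:
            grid[cy][cx] = 1  # obstacle observed in this cell
    return grid

# A wall 1 m straight ahead across the whole field of view:
g = depth_scan_to_occupancy([1.0] * 9)
```

A real system would fuse many such scans over time, using the optical-flow and inertial estimates of the UAV's motion to register each scan into a common map frame.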
Share your vision-related news by contacting James Carroll, Senior Web Editor, Vision Systems Design