Students develop 3D thermal mapping system for firefighting robots
A team of students from the University of California San Diego developed a 3D thermal RGB mapping system for firefighting robots, a project that won them the top prize in DRS Technologies' second annual Student Infrared Imaging Competition.
The top prize for “Best Overall Project” went to Will Warren, Daniel Yang, and Yuncong Chen, students at the University of California San Diego, who collaborated on “3D Thermal RGB Mapping for Firefighting Robots,” a project aimed at developing semiautonomous robots to assist in firefighting, search and rescue operations, and environmental monitoring.
Using stereo and infrared imagery, the team was able to map urban environments and detect survivors. The robot employs a Tamarisk 320 infrared imager measuring 28 x 24 x 35 mm, with a 40° x 30° field of view. The firefighting robot (FFR) is designed to drive in a Segway-like manner, with an actuated center leg that raises the body, allowing the robot to climb stairs and overcome obstacles, according to the project’s accompanying YouTube demonstration.
The robot’s infrared and RGB cameras capture a series of images, and the robot’s on-board VisualSFM software maps the thermal data into a 3D point cloud. Each 3D point is then projected onto the set of thermal image planes from which it is visible, according to the narrator.
“The pixel intensity values of the corresponding coordinates are averaged and assigned to each point in the thermal 3D cloud,” he said.
With the thermal 3D point cloud, the robots are able to provide 3D maps as well as temperature data of a search and rescue location.
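The fusion step the narrator describes, projecting each 3D point into every thermal view that sees it and averaging the corresponding pixel intensities, can be sketched roughly as follows. This is a minimal NumPy illustration, not the team’s actual code: the pinhole camera model, function names, and visibility test (in front of the camera and inside the image bounds) are all assumptions.

```python
import numpy as np

def project_points(points_3d, K, R, t):
    """Project Nx3 world points into pixel coordinates with an assumed
    pinhole model. K: 3x3 intrinsics; R, t: world-to-camera pose.
    Returns Nx2 pixel coordinates and per-point depths."""
    cam = (R @ points_3d.T).T + t           # world frame -> camera frame
    depths = cam[:, 2]
    pix = (K @ cam.T).T
    pix = pix[:, :2] / pix[:, 2:3]          # perspective divide
    return pix, depths

def fuse_thermal(points_3d, cameras, thermal_images):
    """Assign each 3D point the mean thermal intensity over all views
    in which it is visible (positive depth, inside the image)."""
    n = len(points_3d)
    sums = np.zeros(n)
    counts = np.zeros(n)
    for (K, R, t), img in zip(cameras, thermal_images):
        h, w = img.shape
        pix, depth = project_points(points_3d, K, R, t)
        u = np.round(pix[:, 0]).astype(int)  # column index
        v = np.round(pix[:, 1]).astype(int)  # row index
        visible = (depth > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
        sums[visible] += img[v[visible], u[visible]]
        counts[visible] += 1
    # Points seen in no view keep NaN rather than a fabricated value.
    intensities = np.full(n, np.nan)
    seen = counts > 0
    intensities[seen] = sums[seen] / counts[seen]
    return intensities
```

A real pipeline would take the camera poses from the structure-from-motion reconstruction and would also need occlusion handling, which this sketch omits.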
For more information on the top prize winner, and to read about the second- and third-place winners, view the University of California San Diego press release.
Share your vision-related news by contacting James Carroll, Senior Web Editor, Vision Systems Design