Vision-guided underwater robot to seek and destroy harmful starfish in the Great Barrier Reef

Sept. 24, 2015
Designed to seek out and eliminate the Great Barrier Reef's crown-of-thorns starfish (COTS), the COTSbot vision-guided underwater robot from the Queensland University of Technology (QUT) recently completed its first sea trials.

Crown-of-thorns starfish are responsible for an estimated 40% of the reef's total decline in coral cover, and while human divers have been effective at controlling them at targeted sites, further action is needed.

"Human divers are doing an incredible job of eradicating this starfish from targeted sites but there just aren't enough divers to cover all the COTS hotspots across the Great Barrier Reef," said Dr. Matthew Dunbabin from QUT's Institute for Future Environments. "We see the COTSbot as a first responder for ongoing eradication programs - deployed to eliminate the bulk of COTS in any area, with divers following a few days later to hit the remaining COTS.”

Enter the COTSbot, designed by QUT researchers, which can search the reef for up to eight hours at a time, delivering more than 200 lethal shots. The vehicle navigates autonomously via a vision system based on a stereo camera setup looking downward and a single camera looking forward. The downward cameras are used to detect the starfish, while the front-facing camera is used for navigation. The cameras used were CM3-U3-13S2C-CS Chameleon3 color cameras from Point Grey, which are 44 mm x 35 mm x 19.5 mm enclosed USB 3.0 cameras that feature Sony's ICX445 CCD image sensor. The ICX445 is a 1/3" 1.3 MPixel CCD sensor with a 3.75 µm pixel size that can achieve frame rates of 30 fps.
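The downward-looking detection step can be illustrated with a minimal sketch. This is a toy stand-in, not QUT's actual detection pipeline (which is not detailed in the article); it assumes a simple intensity-threshold approach on a single channel and returns the blob centroid that a controller could then target.

```python
def detect_cots(frame, threshold=200):
    """Toy detector: frame is a 2D grid of single-channel pixel
    intensities. Returns the centroid (x, y) of above-threshold
    pixels, or None if nothing exceeds the threshold."""
    hits = [(x, y) for y, row in enumerate(frame)
                   for x, v in enumerate(row) if v > threshold]
    if not hits:
        return None
    n = len(hits)
    return (sum(x for x, _ in hits) // n, sum(y for _, y in hits) // n)

# Synthetic 100x100 frame with a bright patch at x 40..49, y 60..69
frame = [[0] * 100 for _ in range(100)]
for y in range(60, 70):
    for x in range(40, 50):
        frame[y][x] = 255

print(detect_cots(frame))  # -> (44, 64)
```

In the real system the detected position would be fed to the vehicle and manipulator controllers; here the function simply returns image coordinates.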

All image processing is done on-board the robot, and this is performed on a low-power (less than 20 W) GPU. The software, according to Dunbabin, is built around the Robot Operating System (ROS) and is optimized to exploit the GPU.

"Using this system we can process the images for COTS at greater than 7 Hz, which is sufficient for real-time detection of COTS and feeding back the detected positions for controlling the underwater robot and the manipulator," he said.


About the Author

James Carroll

Former VSD Editor James Carroll joined the team in 2013. Carroll covered machine vision and imaging from numerous angles, including application stories, industry news, market updates, and new products. In addition to writing and editing articles, Carroll managed the Innovators Awards program and webcasts.
