Vision system guides robotic starfish assassin

Oct. 1, 2018
Featuring a 3D camera and a pair of machine vision cameras, the RangerBot underwater autonomous robot is designed to locate coral-harming crown-of-thorns starfish and deliver fatal injections to protect the Great Barrier Reef.

RangerBot, an autonomous underwater vision-guided robot developed by the Queensland University of Technology (QUT; Brisbane, Queensland, Australia; www.qut.edu) in conjunction with Google (Mountain View, CA, USA; www.google.com) and the Great Barrier Reef Foundation (www.barrierreef.org), will soon be deployed into the waters near the Great Barrier Reef to monitor and protect the reef's health.

Initially developed as “COTSbot” by QUT, the RangerBot autonomous robot won the 2016 Google Impact Challenge People’s Choice prize of AUD$750,000, enabling QUT roboticists to develop the technology into a cost-effective alternative to manual reef protection by divers. RangerBot uses a vision system and detection software, which was specified and developed by Professor Matthew Dunbabin and Dr. Feras Dayoub of QUT, to identify and help control the harmful crown-of-thorns starfish, which destroys coral in the Great Barrier Reef.

Figure 1: Queensland University of Technology’s RangerBot is designed to help monitor and protect the Great Barrier Reef by locating and eliminating harmful crown-of-thorns starfish.

The robot features a vision system based on a twin-camera stereo setup looking downward and a single camera looking forward. The down-facing cameras are used for odometry and science tasks (e.g., detecting the starfish), while the front-facing camera is used for general obstacle avoidance.

The downward cameras are CM3-U3-13S2C-CS Chameleon3 color cameras from FLIR (Richmond, BC, Canada; www.ptgrey.com), which are 44 mm x 35 mm x 19.5 mm enclosed USB 3.0 cameras that feature Sony's (Tokyo, Japan; www.sony.com) ICX445 CCD image sensors. The ICX445 is a 1/3" 1.3 MPixel CCD sensor with a 3.75 µm pixel size that can achieve frame rates of 30 fps. The forward camera is a Zed 3D camera from Stereolabs (San Francisco, CA, USA; www.stereolabs.com). The Zed stereo camera features two 1/3" backside-illuminated 4 MPixel image sensors with a 2 µm pixel size that can capture HD video at 30 fps.
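The article does not describe the team's stereo software, but the sketch below illustrates, in broad strokes, how a downward-looking stereo pair can be read and turned into a disparity map for odometry-style tasks. It uses OpenCV for simplicity; the device indices, matcher parameters, and omitted calibration/rectification are assumptions (in practice the Chameleon3 cameras would typically be accessed through FLIR's own SDK rather than a generic video-capture interface).

```python
# Illustrative only: a minimal OpenCV sketch of reading a downward-looking
# stereo pair and computing a disparity map. Device indices and matcher
# parameters are placeholders, not details published by the RangerBot team.
import cv2

LEFT_DEV, RIGHT_DEV = 0, 1  # hypothetical device indices for the two downward cameras

left_cap = cv2.VideoCapture(LEFT_DEV)
right_cap = cv2.VideoCapture(RIGHT_DEV)

# Semi-global block matching; parameter values are untuned placeholders.
stereo = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)

while True:
    ok_l, left = left_cap.read()
    ok_r, right = right_cap.read()
    if not (ok_l and ok_r):
        break
    gray_l = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)
    # 16x fixed-point disparity; downstream code would convert this to depth
    # and feed the odometry and detection pipelines.
    disparity = stereo.compute(gray_l, gray_r)

left_cap.release()
right_cap.release()
```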

“We’re thrilled to see RangerBot come to fruition because this project is about giving those looking after our coral reefs the tools they need to protect them,” Great Barrier Reef Foundation Managing Director Anna Marsden said. “Combining the expertise of innovators like Google and QUT, this project is a great example of harnessing technology to benefit the Reef.”

Figure 2: Professor Matthew Dunbabin deploys the vision-guided robot into the waters near the Great Barrier Reef.

Dunbabin noted, however, that the team is experimenting with a number of different camera types and that the focus is more on developing algorithms that can process images from these cameras reliably in reef environments.

“RangerBot is the world’s first underwater robotic system designed specifically for coral reef environments, using only robot-vision for real-time navigation, obstacle avoidance and complex science missions,” said Dunbabin.

He continued, “RangerBot can stay under water almost three times longer than a human diver, gather more data, and operate in all conditions and at all times of the day or night, including where it may not be safe for a human diver. The robot is fitted with computer vision to ‘see’ where it’s going and avoid obstacles as well as multiple thrusters, so it can move in every direction.”

All image processing is performed on board the robot on a low-power (less than 20 W) GPU. The software, according to Dunbabin, is built around the Robot Operating System (ROS; www.ros.org) and is optimized to exploit the GPU. According to Dayoub, who designed the detection software, it will continue to learn from its experiences in the field.
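As a rough illustration of how an ROS-based image pipeline of this kind is typically structured, the sketch below shows a minimal Python node that subscribes to a camera topic and hands each frame to a detector. The node name, topic name, and `run_detector` function are hypothetical stand-ins, not RangerBot's actual software.

```python
#!/usr/bin/env python
# Illustrative ROS (rospy) node: subscribe to a camera topic and pass frames
# to a detector. Names and topics are assumptions made for this sketch.
import rospy
from sensor_msgs.msg import Image
from cv_bridge import CvBridge


def run_detector(frame):
    # Placeholder: the real system runs a learned detector on the on-board GPU.
    return []


class StarfishDetectorNode(object):
    def __init__(self):
        self.bridge = CvBridge()
        # Topic name is an assumption for illustration.
        self.sub = rospy.Subscriber("/camera/down/image_raw", Image,
                                    self.on_image, queue_size=1)

    def on_image(self, msg):
        # Convert the ROS image message to an OpenCV BGR array.
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
        for det in run_detector(frame):
            rospy.loginfo("COTS candidate at %s, score %.2f",
                          det["bbox"], det["score"])


if __name__ == "__main__":
    rospy.init_node("starfish_detector")
    StarfishDetectorNode()
    rospy.spin()
```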

“We’ve ‘trained’ RangerBot to detect crown-of-thorns starfish – and only these coral-destroying starfish – in much the same way as people learn to differentiate between various forms of sea life. Using real time computer vision processed on board the robot, RangerBot can identify these deadly starfish with 99.4% accuracy. Once the identification is confirmed, RangerBot can instigate an injection which is fatal for the crown-of-thorns starfish, but doesn’t affect anything else on the reef,” he said.
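The quote describes an identify-confirm-inject sequence. The following minimal sketch shows one way such a gating step could be expressed; the confidence threshold and the requirement for several consecutive positive frames are illustrative assumptions, not QUT's published logic (the 99.4% figure is a reported detection accuracy, not a decision threshold).

```python
# Illustrative "identify, confirm, then inject" gate. Threshold and frame
# count are hypothetical values chosen for this sketch.
CONFIDENCE_THRESHOLD = 0.99
CONFIRM_FRAMES = 5


def should_inject(recent_scores):
    """Return True only after several consecutive high-confidence detections."""
    if len(recent_scores) < CONFIRM_FRAMES:
        return False
    return all(s >= CONFIDENCE_THRESHOLD for s in recent_scores[-CONFIRM_FRAMES:])


# Example: detector scores from the last few frames.
scores = [0.995, 0.997, 0.992, 0.996, 0.998]
if should_inject(scores):
    print("Confirmed crown-of-thorns starfish: trigger injection")
```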

He continued, “We believe this represents a significant technology leap in both marine robotics and reef protection – the only autonomous, affordable, multi-function solution for effectively detecting and addressing threats to coral reefs.”

“It’s an impressive piece of technology, but RangerBot is also deliberately low cost, to allow production to be scaled up once the next level of operational testing is completed and all the necessary approvals are in place.”

RangerBot’s capabilities have been tested extensively both in the lab and on the Reef, according to QUT. Next steps involve further collaboration with the Great Barrier Reef Marine Park Authority, the Australian Institute of Marine Science, and others on the specific testing, review, and approvals necessary to ensure RangerBot is ready to take on Reef duty.

About the Author

James Carroll

Former VSD Editor James Carroll joined the team in 2013. Carroll covered machine vision and imaging from numerous angles, including application stories, industry news, market updates, and new products. In addition to writing and editing articles, Carroll managed the Innovators Awards program and webcasts.
