An autonomous mobile robot developed under the European Commission’s Horizon 2020 (H2020) framework that uses a 3D camera and object identification and task planning algorithms to harvest sweet peppers has been officially introduced after years of research and development.
Known as SWEEPER, the robot was developed collaboratively by a consortium including Ben-Gurion University of the Negev (BGU; Beersheba, Israel; http://in.bgu.ac.il/en), Wageningen University & Research (Wageningen, Netherlands; https://www.wur.nl/en.htm), pepper grower De Tuindershoek BV (IJsselmuiden, Netherlands; https://www.tuindershoek.nl), Umeå University (Umeå, Sweden; https://www.umu.se/en), the Research Station for Vegetable Production (PKSW; Sint-Katelijne-Waver, Belgium; https://www.proefstation.be/), and Bogaerts Greenhouse Logistics (Hoogstraten, Belgium; http://www.bogaertsgl.com).
In modern greenhouses, there is high demand for the automation of labor, according to the SWEEPER team. The availability of a skilled workforce willing to perform repetitive tasks in the harsh climate of a greenhouse is decreasing rapidly. To automate the harvesting of sweet peppers in a real-world environment, SWEEPER combines a number of technologies.
Developed in a consortium including BGU researchers, the SWEEPER vision-guided robot is designed to operate in a single-stem-row cropping system, with non-clustered fruits and little leaf occlusion. Credit: Research Station for Vegetable Production at Sint-Katelijne-Waver
The SWEEPER platform is based on a FANUC (Oshino, Japan; www.fanuc.com) LR Mate 200iD robot manipulator and a custom-built gripper and catching mechanism for sweet pepper harvesting. The LR Mate 200iD is a six-axis robot with approximately the size and reach of a human arm, and features IP67 protection, a reach of 717 mm (28.22 in.), and a load capacity of 7 kg (15.43 lbs.). The robot uses a Fotonic F80 3D time-of-flight camera, which is now discontinued following the acquisition of Fotonic by automotive safety system company Autoliv (Stockholm, Sweden; www.autoliv.com). The F80 offers VGA (640 x 480) resolution for color images and QVGA (320 x 240) resolution for depth images. The camera, which provides matched color and 3D data, also has a GigE interface, a 20 fps frame rate, and an optional ARM Cortex-A9 processor. Infrared lights around the camera provide the illumination needed to acquire a depth image, while rows of LED lights around the edge of the camera provide uniform flash lighting for the acquisition of color images.
According to the academic paper “Research and development in agricultural robotics: A perspective of digital farming” (http://bit.ly/VSD-SWEEP), the camera and sensor setup is completely independent of the surrounding light conditions and provides color images and distance maps that are used for fruit detection, localization, and maturity classification. SWEEPER is also trained to detect obstacles such as leaves and plant stems in the images. This training process was accelerated using simulated artificial pepper plant models and deep learning algorithms.
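The idea of combining a color image with a registered depth map can be sketched in a few lines. The thresholds, function name, and red-dominance ripeness cue below are illustrative assumptions, not the SWEEPER project's actual detection algorithm, which uses trained deep learning models.

```python
import numpy as np

def detect_ripe_fruit(rgb, depth, max_range_m=1.0):
    """Toy fruit-detection pass: flag pixels that look predominantly red
    (a crude ripeness cue) and whose depth puts them within reach.
    Illustrative sketch only; thresholds are hypothetical."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    # Ripeness cue: strong red response relative to green.
    ripe = (r > 120) & (r > 1.5 * g)
    # Depth mask: discard invalid pixels and background rows beyond reach.
    reachable = (depth > 0) & (depth < max_range_m)
    return ripe & reachable

# Synthetic 4x4 frame: one red pixel within range, everything else excluded.
rgb = np.zeros((4, 4, 3), dtype=np.uint8)
depth = np.full((4, 4), 2.0)   # background: out of range
rgb[1, 2] = (200, 40, 30)      # one red pixel...
depth[1, 2] = 0.6              # ...within the gripper's reach
mask = detect_ripe_fruit(rgb, depth)
print(int(mask.sum()))  # → 1
```

Fusing the two channels this way shows why matched color and 3D data matter: color alone cannot separate a reachable pepper from an identical one in the next row.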
Once the robot detects a pepper, information about its location is used to plan the robotic arm’s trajectory. Because of the limited space for movement between planting rows, calculating this trajectory can be very complex, according to the authors of the paper. SWEEPER’s camera then takes images from different angles so that the arm approaches the pepper from a direction that keeps the stem on the back side of the pepper. A small cutting tool is positioned just above the pepper and cuts the peduncle as the tool moves downward. This motion separates the pepper from the plant’s stem and drops it into a catching device, which the robotic arm then moves to the pepper bin.
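The geometric constraint described above, keeping the stem behind the fruit during the approach, can be sketched as a small vector calculation. The function name and coordinate values are hypothetical; this is not the project's planner, only an illustration of the constraint.

```python
import numpy as np

def approach_direction(fruit_xyz, stem_xyz):
    """Unit vector pointing from the stem through the fruit: approaching
    the fruit from along this outward direction keeps the stem on the
    back side of the pepper. Illustrative sketch, not SWEEPER's planner."""
    v = np.asarray(fruit_xyz, float) - np.asarray(stem_xyz, float)
    return v / np.linalg.norm(v)

# Hypothetical positions (meters): stem 5 cm behind the fruit along y,
# so the arm should approach from the front, along -y.
d = approach_direction([0.0, 0.40, 0.30], [0.0, 0.45, 0.30])
print(d)  # → [ 0. -1.  0.]
```

The cutter would then be placed at the fruit position offset along `d`, moving back toward the fruit so the stem stays out of the blade's path.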
Years after the development of the robot project was announced, the robot was officially introduced in September of 2018 at the Research Station for Vegetable Production in Belgium.
Preliminary tests showed that, using a commercially available crop modified to mimic the required conditions, the robot currently harvests ripe fruit in 24 seconds with a success rate of 62%.
During the testing of the SWEEPER robot—which can only be used to harvest fruits or vegetables located on the front side of the plants and stems—a yellow pepper was used. For a single-row growing system, evaluated only on fruits on the front side of stems, the robot’s success rate was 62% in the modified crop and 31% in the commercial crop, according to the paper. Overall, SWEEPER has a success rate of 49% in harvesting ripe fruits with the modified crop, and only 20% with the commercial (current greenhouse growing) system. The average time to harvest one fruit with the robot is between 18 and 25 seconds, including 4.73 seconds for platform movement, 3.71 seconds for fruit localization, 3.02 seconds for obstacle localization, 4.03 seconds for visual servoing, 2.22 seconds for fruit detaching, and 7.77 seconds for dropping the fruit into the container.
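The per-step figures above can be tallied to see where the cycle time goes. The numbers are the team's reported values; the breakdown shows that the full cycle sums to about 25 seconds, with fruit handling (detaching plus dropping) accounting for roughly 10 seconds of that.

```python
# Per-step times (seconds) for one harvest cycle, from the SWEEPER
# team's reported figures.
cycle = {
    "platform movement":      4.73,
    "fruit localization":     3.71,
    "obstacle localization":  3.02,
    "visual servoing":        4.03,
    "fruit detaching":        2.22,
    "dropping in container":  7.77,
}

total = sum(cycle.values())
print(round(total, 2))                                # → 25.48
# Excluding platform movement, as in the lab-experiment figure:
print(round(total - cycle["platform movement"], 2))   # → 20.75
```

Dropping the fruit into the container is the single largest component, which suggests where mechanical redesign could cut cycle time the most.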
Members of the SWEEPER project team also reportedly achieved a harvest time of less than 15 seconds (excluding platform movement) in laboratory experiments.
Polina Kurtser, a Ph.D. candidate in the BGU Department of Industrial Engineering and Management and a member of the team, says robotic harvesting will revolutionize the economics of the agriculture industry and dramatically reduce food waste.
“The Sweeper picks methodically and accurately,” she says. “When it is fully developed, it will enable harvesting 24/7, drastically reduce spoilage, cut labor costs, and shield farmers from market fluctuations.”
Further research is necessary to increase the robot’s work speed and harvest success rate, according to the team. Based on the latest results, the consortium expects that a commercial sweet pepper harvesting robot will be available within four to five years, and that the technology could be adapted for harvesting other crops.