Vision-guided robot packs produce
Off-the-shelf vision components team with automated system to pack apples for display.
By Matthew Peach, European Contributing Editor
Food retailers have discovered that displays of apples with uniform layers of ripe skins facing shoppers sell better than nonuniform displays. Consequently, supermarket produce buyers are requiring suppliers to undertake this positioning during packing, a task that has meant considerable expense in packing millions of apples by hand. However, the Apfelrobo, a robot introduced at the Hannover Fair 2004 in April, can complete this task by combining fruit data gathered from a vision system with apple manipulation by electric linear drives.
The Apfelrobo was developed by Hoerbiger-Origa in conjunction with Schuster Messtechnik, an Austrian company that specializes in fruit technologies (see Fig. 1). The company expects about 25 robots to be deployed into the logistics chain this year, with about 50 in 2005. Schuster president Hermann Schuster says, “The Apfelrobo was developed in cooperation with fruit growers and wholesalers. The cost for packing the apples currently is €0.03 to €0.04 per kilogram. Until our development, this packing could only be done by hand. The robot does the job with higher precision and cuts costs to just 1 euro cent per kilogram of apples. Currently, no other such system is available.”
Apples are delivered on a conveyor belt, and a suction cup on a belt-driven actuator picks up each one. An air stream rotates the apple until the stalk stands vertically upright. The apple is then rotated through 360° while being inspected by the vision system, which seeks the reddest area. This is determined in real time, and the fruit is rotated, backward or forward, while it is transferred and lowered onto the packaging pallet. The apples are arranged to point backward, away from the final customers’ viewpoint. The Apfelrobo finally packs the filled pallets into boxes and places them onto an exit conveyor belt to continue to the stores (see Fig. 2).
The robot's apple manipulation-and-handling system consists of five distinct stages:
Pick: The suction cup precisely grasps the fruit from the incoming conveyor belt.
Align: In this orientation process, a current of air turns the apple so that its stem-goblet axis stands vertical, the goblet being the base of the apple.
Inspect: The fruit is subjected to a digital image analysis while being turned through 360° in 50 ms. Following analysis, the apple’s most attractive side is determined.
Position: The apple is turned through 90° so that the attractive side lies uppermost.
Pack: The apple is transferred into the packaging carton. The arrangement can cope with different types of packaging, as well as simultaneously offering the option of classifying the apples into three user-determined classes. Digital image analysis covers the classification of the apple’s size, color, and quality.
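The five-stage cycle above can be sketched in code. All function and field names below are illustrative assumptions for this article, not taken from the Apfelrobo's actual software (which is a C++/LabWindows application):

```python
# Hypothetical sketch of the five-stage Apfelrobo cycle described above.
from dataclasses import dataclass


@dataclass
class Apple:
    picked: bool = False
    aligned: bool = False
    best_azimuth_deg: float = 0.0
    size_class: int = 0


def pick(a: Apple) -> Apple:
    a.picked = True              # suction cup grasps fruit from conveyor
    return a


def align(a: Apple) -> Apple:
    a.aligned = True             # air stream sets the stem-goblet axis vertical
    return a


def inspect(a: Apple) -> Apple:
    a.best_azimuth_deg = 137.0   # placeholder: reddest side found by vision
    return a


def position(a: Apple) -> Apple:
    return a                     # rotate 90 deg so the best side lies uppermost


def pack(a: Apple) -> Apple:
    a.size_class = 1             # one of three user-determined classes
    return a


def cycle(a: Apple) -> Apple:
    for stage in (pick, align, inspect, position, pack):
        a = stage(a)
    return a
```

The point of the sketch is simply that each apple passes through the stages strictly in order, with the vision step's result (the best azimuth) feeding the positioning step.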
All movement axes are based on Hoerbiger-Origa’s OSP-E electric linear drives. Belt-drive models are used in preference to ballscrew units because the former’s robust construction requires minimal maintenance.
During development of the Apfelrobo, ballscrew drives were used, but Hoerbiger says it could easily replace these with belt units for field deployment, as simplicity of design is attractive in agricultural and related environments. Similarly, the original servomotors were replaced with steppers in the final design.
High speed and precision were required of the drive axes, as throughput is high. Similarly, the vision system was required to work in real time, as no delays could be tolerated. In fact, the inspection rotation of the apple, the analysis to determine its best aspect, and the subsequent rotation to its final position must be completed in less than 500 ms.
The total time allowed for Apfelrobo to complete all of its functions is 2.5 s per apple, so that a productivity of 300 kg per hour can be maintained. This is roughly equal to the performance of a worker, a target figure often used when converting a manual process into an automated one so that bottlenecks are not created further downstream. It is expected that speed increases will follow over a period of time as the industry becomes used to automation. Additionally, the machine is about twice as accurate as a person.
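The quoted figures are easy to sanity-check. Assuming an average apple mass of about 0.21 kg (the article does not state one), a 2.5-s cycle works out to roughly the claimed 300 kg/h:

```python
# Throughput check for the quoted cycle time; the 0.21-kg average apple
# mass is an assumption, not a figure from the article.
cycle_s = 2.5
apples_per_hour = 3600 / cycle_s              # 1440 apples per hour
avg_mass_kg = 0.21                            # assumed average apple mass
kg_per_hour = apples_per_hour * avg_mass_kg
print(apples_per_hour, round(kg_per_hour))    # prints: 1440.0 302
```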
The vision system is based on a DFK4303 640 × 480-pixel, progressive-scan, color CCD FireWire camera from The Imaging Source with a Cosmicar standard 12.5-mm-focal-length C-mount lens. The camera acquires images at a rate of 15 frames/s at full resolution. Schuster says, “Due to the robust image-processing system, we do not need to use specialized equipment such as linescan cameras, structured lighting, and measurement lenses.”
The apple is illuminated by four white 5-W LEDs. No special color correction is needed, only standard white balancing supported by the camera’s own device driver. Power for the lighting is a 20-W switched-mode supply delivering a constant current to extend the lifetime of the LEDs. The light source is passively cooled by a metal heat sink (see Fig. 3).
At the given object distance, images have a maximum resolution of 5 pixels/mm. Due to the curvature of the fruit, resolution decreases toward the side of the apple, which makes a calibration process necessary. Therefore, the shape of the apple is estimated. The acquired image is resampled using the shape information to form an image of the apple, where the image’s x-axis corresponds to the apple circumference, while the y-axis corresponds to the apple height along the surface. This approach avoids the need for more-expensive equipment such as linescan cameras or structured lighting.
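A minimal sketch of such a resampling step is shown below. It assumes a single-radius spherical model, whereas the real system estimates a per-apple shape; the function name and parameters are hypothetical:

```python
import numpy as np


def unroll_sphere(img, radius_px, cx):
    """Resample one camera frame of a (near-)spherical apple so the output
    x-axis is arc length along the circumference instead of the projected
    image x-axis.  Illustrative only: a single radius stands in for the
    estimated per-row shape profile the real system uses.
    """
    h, w = img.shape[:2]
    # One quarter of the circumference faces the camera with usable resolution.
    n_out = int(np.pi * radius_px / 2)
    out = np.zeros((h, n_out) + img.shape[2:], dtype=img.dtype)
    for j in range(n_out):
        theta = (j / n_out) * (np.pi / 2)          # angle from the camera axis
        # Equal arc steps map to unequal projected-pixel steps (sin falloff),
        # which is exactly the resolution loss toward the apple's side.
        x_src = int(round(cx + radius_px * np.sin(theta)))
        out[:, j] = img[:, min(x_src, w - 1)]
    return out
```

The sin() mapping makes the resolution falloff explicit: near the apple's silhouette, many output columns sample nearly the same source pixels.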
Calibration of the setup takes place at each system start. The relative position of the camera and apple-supporting dish is estimated using a marker attached to the setup. The image-processing software uses RGB color information to calculate the color distribution along the apple circumference. A neural-network-based system identifies the stem/calyx of the apple (the z-axis through the apple’s core).
The image-processing system is written in C++ and runs on a Windows-based computing platform. It is based on two image-processing libraries: the Intel Image Processing Library and Giplib, an image-processing library developed and maintained by Graz University of Technology.
The associated PC is a 3-GHz Intel Pentium 4 with 512 Mbytes of DDR memory and a 19-in. TFT monitor. For mechanical manipulation, the linear drives are powered by stepper motors from Hoerbiger-Origa, each controlled by an Atmel Atmega 128 microcontroller. To attain high speed, the drives operate at up to 2 m/s. This means that the Apfelrobo system can pack up to 250 kg per hour (around 1500 apples), or about 2.4 s per apple.
The system has an on-board PC and dedicated software, written in National Instruments C++-based LabWindows and designed to analyze the shapes of the apple surfaces. “This measurement task is twofold,” Schuster says. “First, the correct azimuth angle needs to be measured, which means selecting the correct side of the apple. Second, the vertical orientation of the apple needs to be determined, which means making a binary decision: whether the stem or calyx (base dimple) faces upward.
“To measure the correct azimuth angle, we rotate the apple in front of the camera, acquire a number of images along its circumference, rectify the images, register them against each other, and receive a panorama image that shows the whole circumference of the apple,” says Schuster. “From this image, the color distribution along the apple’s circumference is determined. We take the absolute maximum in red color as the correct side. As for the vertical orientation, the apple is positioned on the dish with either stem or calyx facing upward.”
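The azimuth selection can be approximated as follows. The narrow-strip stitching and the redness metric are simplifications assumed for this sketch; the actual image registration and color measure used by Schuster are not published:

```python
import numpy as np


def best_azimuth(frames_rgb, deg_per_frame):
    """Pick the azimuth (in degrees) with the most red.  Each frame is
    assumed already rectified; taking one narrow centre strip per frame as
    a panorama column is a stand-in for the real system's registration of
    overlapping views.
    """
    strips = [f[:, f.shape[1] // 2] for f in frames_rgb]    # h x 3 per frame
    pano = np.stack(strips, axis=1).astype(float)           # h x n_frames x 3
    # Crude redness score: red channel minus the green/blue average.
    redness = pano[..., 0] - 0.5 * (pano[..., 1] + pano[..., 2])
    col = int(np.argmax(redness.mean(axis=0)))              # best column
    return col * deg_per_frame
```

With, say, eight frames at 45° steps, the function returns the rotation angle whose strip showed the strongest average redness.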
There is not always a stem on the apple, and apple shape varies considerably between varieties and even among apples of the same variety, so a neural network is used to decide whether stem or calyx faces upward. This network is fed with geometric features from the upper region of the apple visible in the image and gives a probability of whether it sees a stem or calyx. Because there are several images of a given apple from different viewpoints, the system can make a majority-based decision about the apple’s orientation (see Figs. 4 and 5).
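The majority-based decision can be sketched as a vote over the per-view network outputs. The neural network itself is not reproduced here, and the `probs` input and 0.5 threshold are assumptions:

```python
def stem_up(probs, threshold=0.5):
    """Majority vote over per-view classifier outputs.  `probs` holds the
    network's stem probability for each view of the same apple (the
    classifier, a neural net on geometric features, is assumed given).
    """
    votes = sum(p > threshold for p in probs)   # views that saw a stem
    return votes > len(probs) / 2               # strict majority wins


stem_up([0.9, 0.7, 0.4, 0.8])   # True: 3 of 4 views see a stem
```

Voting across viewpoints makes the decision robust to a single ambiguous view, such as an apple with a broken-off stem.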
FIGURE 5. A simple user interface allows an operator to monitor the status of apple packing and verify the correct pick and placement of apples in packing cartons.
Schuster says that apple-packaging costs can be reduced by about €0.02 per kilogram, a savings of €6.7 per hour or up to €1100 per week, giving a rapid payback time.
Features, advantages, benefits
The Apfelrobo is designed for use at fruit wholesalers, which are expected to deploy at least four robots each. Hermann Schuster at Schuster Messtechnik says, “One human operator can simultaneously manage up to four Apfelrobos, typically replacing five people under the present manual sorting arrangements. Because the robots feature various flexible software options that can be changed by the user, it is possible for other fruits to be sorted. For example, we have already received enquiries about sorting dates.”
The system is expected to cost between €50,000 and €60,000, and the company believes the global market potential is around 10,000 to 15,000 systems. Schuster adds that some of the key fruit-industry players in Austria, themselves potential customers, have cooperated in the development, including Steirerfrucht (Wollsdorf), EFKO (Traun), Josef Ahorner (Vienna), and Kurt Gubitzer (Graz).
Atmel, San Jose, CA, USA www.atmel.com
Cosmicar/Pentax, Japan www.pentax.com
Graz University of Technology, Austria www.tugraz.at
Hoerbiger-Origa, Wiener Neustadt, Austria www.hoerbiger-origa.at
National Instruments, Austin, TX, USA www.ni.com
Schuster Messtechnik, St. Radegund, Austria
The Imaging Source, Bremen, Germany www.theimagingsource.com