FOOD & BEVERAGE: Vision system speeds egg crack detection
Although current US Department of Agriculture (USDA) voluntary regulations require humans to grade samples of eggs, growing throughput at egg processing and packaging plants is making inspection increasingly expensive. Human graders may also incorrectly grade eggs with micro-cracks that may later expand.
To overcome these problems, Kurt Lawrence and his colleagues at the Agricultural Research Service (ARS) of the United States Department of Agriculture (Athens, GA, USA; www.ars.usda.gov) have developed a machine-vision based system that automates egg grading (see figure). “If a human grader spots a potential crack,” says Lawrence, “the egg is then squeezed or pressed along the crack to see if the crack opens or the shell moves.”
To emulate this process, Lawrence has developed a vacuum chamber that applies negative pressure to pull open existing cracks without damaging intact eggs. By imaging the eggs both at atmospheric pressure and under negative pressure, any cracks or micro-cracks in the eggs can be identified.
To properly image the eggs, 15 white Luxeon LED lights from Philips Lumileds (San Jose, CA, USA; www.philipslumileds.com) were positioned under each egg, outside and below the chamber. “Since eggs are effectively a varying light source when illuminated from below,” says Lawrence, “the center eggs in the chamber would often saturate the CCD used in the camera.” By driving these LEDs with four constant-current drivers from LED Dynamics (Randolph, VT, USA; www.leddynamics.com) and controlling the brightness with four potentiometers, a uniformly bright image could be captured.
After the eggs were placed in the chamber and properly illuminated, a Pike F-421B CCD monochrome 2k-pixel × 2k-pixel × 8-bit FireWire camera from Allied Vision Technologies (Stadtroda, Germany; www.alliedvisiontec.com) with a Xenoplan f/2.0 28-mm compact style front lens from Schneider Optics (Hauppauge, NY, USA; www.schneideroptics.com) was positioned 19.7 in. above the chamber and used to image the eggs. To reduce the reflections from the blue rollers on which the eggs were placed, a Techspec 550-nm longpass filter from Edmund Optics (Barrington, NJ, USA; www.edmundoptics.com) was attached in front of the lens.
Images of the eggs at atmospheric pressure and negative pressure were captured with an 80-ms exposure using an f/8 aperture and stored in TIFF format on a PC. MATLAB software from The MathWorks (Natick, MA, USA; www.mathworks.com) was then used to capture and process images at full resolution.
To identify crack features within each egg, an image taken at atmospheric pressure was used to create a background mask with an empirically determined threshold value. To remove any noise from this mask, a median filter was applied and the result eroded by a predefined kernel to eliminate the occurrence of false positives along the egg boundaries.
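The masking step can be sketched as follows. The article's processing was done in MATLAB; this is an illustrative Python/NumPy translation, and the threshold and kernel-size values are hypothetical placeholders (the article states only that the threshold was determined empirically and the erosion kernel was predefined).

```python
import numpy as np
from scipy import ndimage

def egg_background_mask(atm_img, threshold=40, erode_size=5):
    """Build a binary egg/background mask from the atmospheric-pressure image.

    `threshold` and `erode_size` are illustrative values, not those
    used in the ARS system.
    """
    mask = atm_img > threshold  # eggs appear bright when lit from below
    # Median filter to remove isolated noise pixels from the mask
    mask = ndimage.median_filter(mask.astype(np.uint8), size=3).astype(bool)
    # Erode to pull the mask in from the egg boundaries, suppressing
    # false positives along the shell edges
    mask = ndimage.binary_erosion(mask, structure=np.ones((erode_size, erode_size)))
    return mask
```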
After this image was obtained, the pressure in the chamber was reduced and another image captured. This image was then divided by the atmospheric-pressure image, resulting in a ratio image. After a crack threshold was applied to the ratio image, the result was converted to binary format. To reduce noise effects, a median filter was also applied to this binary ratio image. The result was then combined with the image taken at atmospheric pressure using a Boolean add operator to identify cracks within each egg. Lastly, a user-defined size filter counted the crack pixels within each egg.
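The ratio-image comparison and size-filter classification described above can be sketched as below. Again this is a hedged Python/NumPy translation of a MATLAB workflow: the crack-ratio and size thresholds are hypothetical, and for simplicity the binary crack image is combined with the egg mask rather than with the atmospheric-pressure image itself.

```python
import numpy as np
from scipy import ndimage

def find_crack_pixels(atm_img, vac_img, mask, crack_thresh=1.1):
    """Flag pixels whose vacuum/atmospheric brightness ratio exceeds a
    threshold: a crack pulled open under negative pressure transmits
    more light, so its ratio deviates from ~1.0.

    `crack_thresh` is an illustrative value, not the one used by ARS.
    """
    atm = atm_img.astype(np.float64)
    vac = vac_img.astype(np.float64)
    # Ratio image; guard against division by zero in dark background
    ratio = np.divide(vac, atm, out=np.ones_like(vac), where=atm > 0)
    cracks = ratio > crack_thresh  # crack threshold -> binary image
    # Median filter to suppress noise in the binary ratio image
    cracks = ndimage.median_filter(cracks.astype(np.uint8), size=3).astype(bool)
    return cracks & mask  # keep only crack pixels lying on the eggs

def classify_egg(cracks, size_thresh=20):
    """Size filter: more crack pixels than the threshold => cracked."""
    return "cracked" if np.count_nonzero(cracks) > size_thresh else "intact"
```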
“If an egg had more crack pixels than the size threshold, then that egg was determined to be cracked and colored red,” says Lawrence. “Otherwise, the egg was determined to be intact and colored green.” In operation, the system takes approximately 0.75 s to capture the two images and another 10 s to perform the image-processing algorithm and display the results. “These times could easily be reduced with a compiled program specifically written for the application,” notes Lawrence.
In analyzing 1000 eggs, the system was 99.6% accurate with only 0.3% false positives, compared to 94.2% for human graders with 1.2% false positives. Lawrence has already filed for an international patent and is looking to automate the system further.