Perceived color depends on the intensity and direction of the illuminating light and the texture and shape of the illuminated surface.
By C. G. Masi, Contributing Editor
Appearance is critical for metal finishers such as Light Metals Coloring (LMC; Southington, CT, USA), which applies gold or clear (silver) anodized finishes to decorative caps for perfume bottles. Obtaining the correct color is imperative, but variations in the anodizing process can cause color variations. "The process of bulk anodizing has variability," says John Schunemann, president of Thames Engineering and Design (Waterford, CT, USA), "so there is variation in how the final part will appear." Indeed, he says, "maybe 7% to 10% of the parts are hardly colored at all!"
In the past, inspectors sat next to a conveyor to inspect gold-anodized parts, a job that was labor-intensive and error-prone. LMC hired Thames Engineering to develop an automated inspection system to replace the human inspectors. Although Thames has in-house C++ image-processing capability, it decided to tap the experience of engineers at Bloomy Controls (Windsor, CT, USA), who had previously worked with LabVIEW software from National Instruments (NI; Austin, TX, USA). Bloomy Controls found that it is easier to perform color inspection when the pixel data are expressed in hue-saturation-luminance (HSL) color space rather than RGB coordinates.
HSL vs. RGB
"HSL is much closer to the way human perceptual systems interpret color," says Robert Hamburger, principal engineer and project manager at Bloomy Controls. "When you look at a red stop sign you do not perceive it as 95% red, 3% blue, and 2% green. It appears as very red, saturated, and bright."
The advantage of HSL is that uncontrollable confounding factors generally affect the saturation and luminance dimensions, leaving the hue unaffected. Variations in illumination brightness, for example, affect luminance, not hue. Similarly, specular reflections affect saturation and luminance but not hue (assuming the reflected light is white). As long as the saturation and luminance stay within a range that allows the camera to respond reliably to hue, the system will function.
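To make the point concrete, here is a minimal Python sketch (not from the article) that converts the same surface color at two brightness levels; the hue component stays put while the luminance drops.

```python
# Minimal illustration: dimming a color changes its luminance in HSL space
# but leaves its hue essentially unchanged.
import colorsys

rgb_bright = (0.95, 0.03, 0.02)                   # a strongly red pixel, normalized to 0..1
rgb_dim = tuple(0.5 * c for c in rgb_bright)      # the same surface under half the illumination

for label, (r, g, b) in [("bright", rgb_bright), ("dim", rgb_dim)]:
    h, l, s = colorsys.rgb_to_hls(r, g, b)        # Python returns hue, luminance, saturation
    print(f"{label}: hue={h:.3f}  luminance={l:.3f}  saturation={s:.3f}")

# The two printed hue values match; only luminance moves, which is why a
# hue-based test is robust to changes in illumination brightness.
```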
Most machine-vision cameras encode their images in RGB space, so the first task of the imaging system is to transform the color vector representing each pixel from RGB space to HSL space. "The conversion from RGB to HSL is very processor-intensive," Hamburger says. "It has to be performed on a pixel-by-pixel basis, and, if this is performed in software, it slows the processor."
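To give a sense of the cost, here is a rough Python sketch (an illustration, not the system's code) of a pure-software, pixel-by-pixel RGB-to-HSL pass; at 640 x 480 pixels and 60 frames/s that amounts to roughly 18 million conversions per second.

```python
# Rough sketch of a software RGB-to-HSL pass, written pixel by pixel to make
# the cost explicit.  Pure Python is the worst case, but even optimized code
# pays for this loop on every frame.
import colorsys
import numpy as np

def rgb_frame_to_hsl(frame: np.ndarray) -> np.ndarray:
    """frame: H x W x 3 array of 8-bit RGB values; returns H x W x 3 floats (hue, sat, lum) in 0..1."""
    height, width, _ = frame.shape
    out = np.empty((height, width, 3), dtype=np.float32)
    for y in range(height):
        for x in range(width):
            r, g, b = frame[y, x] / 255.0
            hue, lum, sat = colorsys.rgb_to_hls(r, g, b)   # colorsys orders the result hue, luminance, saturation
            out[y, x] = (hue, sat, lum)
    return out

# Converting a single simulated 640 x 480 frame this way already takes on the
# order of a second, far beyond the 16.7-ms budget of a 60-frame/s camera.
frame = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
hsl = rgb_frame_to_hsl(frame)
```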
Bloomy Controls had already solved this color-conversion problem in real time. The company had recognized that real-time or near-real-time color matching would be useful in machine-vision systems and had built a demonstration system that scored a tabletop game by following a colored ball as competitors knocked it around the table.
Signal processing
To speed the color conversion, Bloomy selected an image-acquisition card with an on-board chip that performs the RGB-to-HSL conversion. The PCI-1411 image-acquisition card from National Instruments carries a DSP chip that converts the camera's native image to any of a number of output formats, including HSL. When building the demonstration system, Bloomy incorporated this capability into real-time imaging systems programmed in NI LabVIEW.
"The code has to be written in such a way as to allow the processor to efficiently perform multithreading and multitasking," Hamburger points out. In this case, the processor had to be shared between two independent asynchronous processes, and the code was structured to make it as processor-efficient as possible and to take advantage of Windows' ability to perform multitasking and multithreading.
The resulting anodized-part inspection system uses two cameras to view parts on two belts at a total rate of 500 units per minute (see Fig. 1). A light-beam part-presence detector triggers the start of each acquisition, and two DXC-190 cameras from Sony Electronics (Park Ridge, NJ, USA), fitted with 25-mm Model 23FM25-L lenses from Tamron USA (Commack, NY, USA), run at 60 noninterlaced frames per second.
FIGURE 2. Coaxial illumination system provides on-axis illumination without putting the light source between the camera and the object. A partially silvered mirror set at a 45° angle reflects light rays from a diffuse source through 90° to illuminate the object from above. The camera, looking through the beamsplitter, sees the object evenly illuminated from above, but with no light source visible. When the object is near the image center ("sweet spot"), shadowed areas are virtually eliminated.
As a unit to be inspected moves down the lane, it crosses the light beam, which triggers the system to save the next frame after a suitable delay so that the part reaches the optical system's "sweet spot," where the lighting is most uniform. Illumination is provided by an ILP CIS coaxial illumination system from Volpi AG (Schlieren, Germany), whose diffuse light reflects from a partially silvered mirror to illuminate the parts from directly above (see Fig. 2). The camera, looking downward through the mirror, sees only the flat, even illumination, with no reflection of the camera or the surrounding room.
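As a rough illustration of the timing, the capture delay is simply the travel time from the trigger beam to the sweet spot at the belt speed, rounded to whole frames. The speed and distance below are assumed values for the sketch, not figures from the article.

```python
# Back-of-the-envelope delay calculation (assumed values, not from the article).
BELT_SPEED_MM_PER_S = 200.0       # assumed belt speed
BEAM_TO_SWEET_SPOT_MM = 50.0      # assumed distance from the trigger beam to the sweet spot
FRAME_PERIOD_S = 1.0 / 60.0       # 60 noninterlaced frames/s, per the article

travel_time_s = BEAM_TO_SWEET_SPOT_MM / BELT_SPEED_MM_PER_S
frames_to_wait = round(travel_time_s / FRAME_PERIOD_S)
print(f"delay = {travel_time_s * 1000:.0f} ms (about {frames_to_wait} frames after the trigger)")
```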
Clearly, such a triggering scheme introduces an element of randomness, or jitter, into the part's position in the acquired frame. "That's not important here," Hamburger says. "We take all the pixels in the region of interest, apply an HSL threshold, count up the number of pixels that are good (that is, within a predetermined hue-tolerance band) versus the number of pixels that are not good, and apply an acceptance threshold. If there are not enough good pixels, the computer sends a signal to the PLC that is controlling the whole machine. Several milliseconds later, a diverter gate knocks that part into a reject bin."
Since the accept/reject analysis operates in HSL color space, not physical space, it does not matter where the pixels are in the field of view. Pixels that show only the background do not have the proper hue and count as "bad" pixels no matter where they are. Pixels that show the surface of a good part have the right hue and count as "good" pixels, no matter where they fall in the field of view. As long as the entire part is captured in the optical sweet spot (ensured by the timing and triggering subsystem), this statistical approach is largely error-proof.
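A minimal NumPy sketch of this statistical test follows, assuming an already-converted HSL frame; the hue tolerance and acceptance fraction are illustrative values, since the real ones are set per part during training.

```python
# Sketch of the pixel-count accept/reject test (illustrative thresholds).
import numpy as np

def part_passes(hsl_frame: np.ndarray,
                target_hue: float,
                hue_tol: float = 0.03,
                min_good_fraction: float = 0.60) -> bool:
    """hsl_frame: H x W x 3 array with hue, saturation, luminance in 0..1."""
    hue = hsl_frame[..., 0]
    # Hue is circular, so measure the shortest distance around the hue wheel.
    diff = np.abs(hue - target_hue)
    diff = np.minimum(diff, 1.0 - diff)
    good = diff <= hue_tol                  # pixels inside the hue-tolerance band
    good_fraction = good.mean()             # background pixels simply count as "bad"
    return good_fraction >= min_good_fraction

# Usage: a frame dominated by the target hue passes; a washed-out part fails.
frame = np.random.rand(480, 640, 3).astype(np.float32)
frame[..., 0] = 0.12                         # pretend the whole frame shows the gold hue
print(part_passes(frame, target_hue=0.12))   # True
```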
Color-anodized parts arrive from the finishing process in loose-bulk containers (see Fig. 3). An operator pours those parts through a large hopper into a vibratory-bowl feed system. There is one hopper and bowl for each lane (see Fig. 4).
As the machine runs, images from both lanes appear on the monitor screen along with the analysis results. For samples whose percentage of in-tolerance color falls below the threshold, a red reject light appears on the screen. A green "test pass" light appears for all samples that pass the inspection criterion. The host computer displays and stores statistics, such as the number of passes and rejects, for each batch of parts tested.
Working in HSL space makes training the system easier. "This manufacturer has about a half-dozen different parts, two different sizes, and at least two very different colors: one that is pretty much gold—coppery gold—and another that is a silver. So, before running the system to inspect parts, the operator teaches the system what the acceptable range of color is for a given part," Hamburger says.
The operator selects one good part to use as a model. This is not the best-looking part, but a representative part whose color falls in the middle of the acceptable range. Part of the training process is to set tolerance bands around the hue value of this "golden" part. At the end of the procedure, the new part appears among the selections available to the operator.
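A hedged sketch of what such a teach step might compute, assuming the nominal hue is taken as the median hue of the model part's pixels; the band width, part name, and recipe format are illustrative assumptions, not details from the article.

```python
# Sketch of a teach step: derive a nominal hue and tolerance band from one
# image of the "golden" part.
import numpy as np

def teach_part(hsl_frame: np.ndarray, part_name: str, hue_tol: float = 0.03) -> dict:
    """hsl_frame: H x W x 3 HSL image of the model part filling the region of interest."""
    hue = hsl_frame[..., 0].ravel()
    nominal = float(np.median(hue))           # take the middle of the observed hue distribution
    # Store the tolerance band; wrap-around near hue = 0 is ignored for brevity.
    return {"part": part_name,
            "nominal_hue": nominal,
            "hue_min": nominal - hue_tol,
            "hue_max": nominal + hue_tol}

# Usage: teach from one frame of the model part; the stored recipe can then be
# offered among the part selections at run time.
model_frame = np.full((480, 640, 3), 0.12, dtype=np.float32)   # stand-in for a captured gold part
print(teach_part(model_frame, "gold_cap_small"))
```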
Company Info
Bloomy Controls www.bloomycontrols.com
Light Metals Coloring www.lightmetalscoloring.com
National Instruments www.ni.com
Sony Electronics www.sony.com/videocameras
Tamron USA www.tamron.com
Thames Engineering and Design www.thamesed.com
Volpi AG www.volpi.ch