Machine vision system inspects mixed model automotive components

July 28, 2021
Automated inspection system combines industrial cameras, robots, flexible part feeders, and vision software to inspect 15 different part types.

Skye Gorter and John DeWaal

Machine vision has long served the automotive manufacturing industry, but evolving needs require adaptable solutions. Suppliers developing multiple variations of the same part type need a machine vision solution that can inspect mixed models without requiring new hardware, additional floor space, or even new systems. When a Tier 2 automotive supplier’s existing system became too rigid and error-prone, it hired system integrator Skye Automation (Stirling, ON, Canada) to develop a flexible new solution.

The automotive supplier’s previous system had feeding problems. The supplier used vibratory bowl feeders to sort and orient parts for a robot to pick. Introducing a new part into the process required adding a new bowl feeder, which created issues with floor space and the sheer number of bowl feeders required to do the job. Additionally, parts were often misfed or became jammed in the bowl feeders. Skye Automation was tasked with creating an entirely new inspection system—from parts feeding equipment to the packing conveyor.

Dynamic Part Inspection

With the new system, an operator sets up a recipe in the HMI for the correct part inspection and indicates the number of boxes to fill and the total number of parts to run. The operator then loads a shift’s worth of boxes onto the load station. Once the software loads the job, the operator hits “start”, and the machine can run. A magnetic feeder begins the process by feeding parts into two hoppers in the robotic work cell. From there the parts enter vibratory feeders and then two programmable FlexiBowl feeders (Figure 1). These flexible part feeders from ARS Automation (Arezzo, Italy) separate the parts, so that a robot can make picks using grippers and auto-tool changers from Schunk (Lauffen am Neckar, Germany).

The system uses five VCXG-51M 5-megapixel GigE cameras from Baumer (Frauenfeld, Switzerland), two of which comprise the inspection station (Figure 2). One camera looks down from the top and another from the side, providing two views of the same part during inspection. Two cameras also look down over the FlexiBowls to inform the robot of pick locations, while another camera provides barcode verification at the end of the process. System lighting comprises coaxial diffused lighting, off-axis ring lighting, backlighting, and a barcode reader light, all from Metaphase Technologies (Bristol, PA, USA).

When a part comes into the field of view of the cameras looking at the FlexiBowls, the system captures images, and the robot picks the part and places it onto the inspection table, which shuttles in front of the inspection system. The two cameras there capture images of the part and transmit that data via a GigE cable to a Neousys Technology (New Taipei City, Taiwan) Nuvo-7006DE fanless industrial computer (Figure 3) running Q.VITEC (Wunstorf, Germany) machine vision software. An air jet blows the part into a sorting chute, where a pneumatic cylinder moves a flap to direct the part in one direction or the other, based on a pass or fail determination from the software. Parts either pass into a box that sits on a conveyor or go directly into a reject bin for disposal. Once the box hits a predetermined quantity, it indexes up an elevator and rolls down a gravity-fed conveyor. The system indexes the next empty box in front of the load station.
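The sorting and box-indexing logic described above can be pictured as a small state machine: passed parts count toward the current box, failed parts divert to the reject bin, and a full box indexes out. The class and names below are a minimal illustrative sketch, not Skye Automation’s actual software.

```python
# Illustrative sketch of the chute/flap sorting step and box indexing.
# Class names, return values, and structure are assumptions for clarity.

class SortingStation:
    def __init__(self, parts_per_box: int):
        self.parts_per_box = parts_per_box  # predetermined box quantity
        self.parts_in_box = 0
        self.boxes_filled = 0

    def handle_part(self, passed: bool) -> str:
        """Route one inspected part; return where it went."""
        if not passed:
            return "reject bin"  # flap diverts the part to disposal
        self.parts_in_box += 1
        if self.parts_in_box >= self.parts_per_box:
            # Box hit its quantity: index it up the elevator and onto the
            # gravity-fed conveyor; the next empty box moves into place.
            self.boxes_filled += 1
            self.parts_in_box = 0
        return "box"
```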

An inkjet barcode printer synchronized to the conveyor motion directly prints codes onto the filled boxes, and the fifth GigE camera reads the data on the codes to ensure that boxes have been filled with the correct parts and that the barcodes match the current recipe running on the system.
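As a rough illustration of that verification step, the scanned code can be compared against the part identifier of the recipe currently running; the function and naming below are assumptions, not the system’s actual check.

```python
# Hedged sketch of the end-of-line barcode check: confirm the code printed
# on a filled box matches the active recipe. Logic is purely illustrative.

def barcode_matches_recipe(scanned_code: str, recipe_part_id: str) -> bool:
    """True if the printed code identifies the same part type as the
    recipe currently running on the system."""
    return scanned_code.strip().startswith(recipe_part_id)
```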

To inspect multiple different part types, the operator simply chooses the correct recipe prior to inspection instead of having to change processes or add new hardware. The initial design of the system was based on 12 components the automotive supplier wanted to inspect, but it now handles 15 different components.
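A recipe-driven design like this can be modeled as a simple registry keyed by part type, so adding a new component means adding an entry rather than new hardware. The field names and part identifiers below are plausible assumptions, not the supplier’s actual schema.

```python
# Hypothetical recipe registry; the article does not publish the real
# part names or parameters, so everything here is illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class Recipe:
    part_type: str        # which of the 15 components to inspect
    barcode_prefix: str   # expected code printed on filled boxes
    boxes_to_fill: int    # entered by the operator at the HMI
    parts_per_box: int

RECIPES = {
    "part-a": Recipe("part-a", "PA", boxes_to_fill=40, parts_per_box=250),
    "part-b": Recipe("part-b", "PB", boxes_to_fill=40, parts_per_box=250),
}

def load_job(part_type: str) -> Recipe:
    """Fetch the recipe the operator selected at the HMI."""
    if part_type not in RECIPES:
        raise ValueError(f"no recipe for part type {part_type!r}")
    return RECIPES[part_type]
```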

A programmable logic controller (PLC) operates a variable frequency drive (VFD) from Lenze (Aerzen, Germany), which, together with elevator, pusher, and linear motor actuators from IAI America (Torrance, CA, USA), provides motion for the system. Despite the use of a PLC, a PC sits at the heart of the system.

The software developed for this technology features a full visualization of the entire system—including all five cameras in a single visualization—that allows operators and technicians to pinpoint areas of concern directly from the HMI. The PC enables this, along with the ability to be Industry 4.0 ready and to integrate into a customer’s OPC UA framework.

Coping With COVID-19

One consideration that led to developing the system around a PC was remote access—a particularly important point since this system was deployed during the COVID-19 pandemic.            

COVID prevented travel, and Skye Automation commissioned and shipped the machine without anybody ever setting foot in the plant. The PC provides a full view into the performance of the system, something that would not be possible with a design based on a PLC.

Furthermore, the PC offers an eight-core Intel Core i7 processor, which allowed Skye Automation to separate out different processes and run them autonomously without them interfering with one another. In CPU power, PLCs can’t match PCs on either price or performance. With the PC, the vision task runs across multiple cores, allowing visualization to run on one and Windows on another.
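One way to picture that separation, as a stdlib-only sketch: each task runs in its own OS process and communicates over a queue, so a stall in one cannot block the others. The task split and names here are assumptions, not the deployed architecture.

```python
# Illustrative sketch of isolating the vision task in its own process,
# separate from visualization and the Windows/HMI side.
import multiprocessing as mp

def vision_task(frames, results):
    # Stand-in for per-frame inspection work running in isolation.
    for frame in frames:
        results.put((frame, "pass"))

def run_inspection(frames):
    results = mp.Queue()
    worker = mp.Process(target=vision_task, args=(frames, results))
    worker.start()
    out = [results.get() for _ in frames]  # drain before join to avoid blocking
    worker.join()
    return out
```

On Linux, each such process could additionally be pinned to its own core with `os.sched_setaffinity`, mirroring the article’s point about tasks not interfering with one another.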

From an efficiency standpoint, a PLC that accomplishes the same or similar tasks would be far more costly and more rigid in its programming environment. A customized visualization program would also be required, in addition to the PLC program.

Expanding Into Edge AI

Big buzzwords like “machine learning,” “artificial intelligence,” and “deep learning” loom large over machine vision and manufacturing processes. PCs provide a foundation for a control system should a particular application require something beyond rules-based machine vision. Skye Automation now designs all machine vision systems with the idea that deep learning or machine learning tools can be added in the future.

In the past, when a machine vision system started running after installation, it never got any better. But, a machine vision system from Skye Automation will get better over time because integrators can use GPUs, TPUs, and VPUs to convert an industrial PC from CoastIPC (Hingham, MA, USA)—Skye Automation’s industrial PC supplier—into an edge AI device.

Machine vision solutions must adapt to the ever-increasing needs of today’s manufacturing environment. Designing a system up front that can flexibly expand its capabilities over time offers one efficient method for doing so.
