Vision helps steer the automotive industry

Dec. 1, 2003
A discussion with Valerie Bolhouse of Ford Motor Company

Valerie Bolhouse is a staff technical specialist in machine vision with the Advanced Manufacturing Group at Ford Motor Company (Dearborn, MI, USA). She has a BS in electrical engineering from the General Motors Institute (Flint, MI, USA) and an MS in electrical and computer engineering from the University of Michigan (Ann Arbor, MI, USA). Contributing editor Winn Hardin talked to her about the applications and opportunities for machine vision in the company.

VSD: In what ways is Ford using machine vision?

Bolhouse: Ford uses machine vision as an integral part of a number of different processes. We use it for robot guidance, inspection, traceability and identification, and in-station process control. Vision systems are used for robot guidance in assembly and sealing applications. Also, robots receive positional offset data from the vision cameras for installation of fixed glass.

VSD: What changes within Ford's automobile-manufacturing operations are driving these applications?

Bolhouse: An emphasis on lean manufacturing and flexibility has driven us to rethink how we design our processes. Lean manufacturing principles promote error proofing and in-station process control instead of end-of-line inspection systems. The process steps are defined so that bad parts are not made and do not exit the cell. Vision is used to ensure the right parts are used in the assembly, and that the process is done correctly. The vision camera becomes just a sensor in the cell rather than the primary function of the station. The opportunity exists for many more of these systems in the automotive assembly plant; however, they are typically the simple, low-cost smart camera systems.

Flexibility drives the use of vision in two areas: accommodating changes in part design and handling different parts on the same equipment. Vision can replace custom-designed hard tooling for locating or fixturing parts. If hard tooling were used to fixture parts, the tooling would have to be custom designed for the part. It's not uncommon for the end-of-arm tooling or locating fixtures to cost more than the robot itself. Then, if the part geometry changes, the tooling needs to be redesigned. If you take advantage of the reprogrammability of vision, you only have to make software modifications to the vision algorithm rather than hardware changes to the tool. This flexibility also lets you handle a family of parts without putting constraints on the part features or using expensive, complex multifunctional tools. Instead of hard tooling to locate parts precisely, a vision system locates the part and provides offsets to a robot.
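
To make the offset idea concrete, here is a minimal sketch in Python of how a measured part pose can be turned into a robot offset. It assumes the camera has already been calibrated into the robot's coordinate frame and that the part pose reduces to (x, y, rotation); the nominal pose and all numbers are illustrative, not drawn from Ford's systems.

```python
# Minimal sketch: turn a vision-measured part pose into a robot offset.
# Assumes the camera is calibrated (image coordinates -> mm in the robot
# frame) and the part pose reduces to (x mm, y mm, theta deg).
import math

NOMINAL = (250.0, 100.0, 0.0)  # taught part pose; illustrative numbers

def pose_offset(measured, nominal=NOMINAL):
    """Offset (dx, dy, dtheta) between where the camera found the part
    and where the robot program expects it."""
    return (measured[0] - nominal[0],
            measured[1] - nominal[1],
            measured[2] - nominal[2])

def apply_offset(point, offset):
    """Re-target one taught path point: rotate it about the nominal part
    position by dtheta, then shift it by (dx, dy)."""
    dx, dy, dtheta = offset
    th = math.radians(dtheta)
    px, py = point[0] - NOMINAL[0], point[1] - NOMINAL[1]
    rx = px * math.cos(th) - py * math.sin(th)
    ry = px * math.sin(th) + py * math.cos(th)
    return NOMINAL[0] + rx + dx, NOMINAL[1] + ry + dy

# Camera finds the part 2 mm right, 1.5 mm up, rotated 0.8 degrees:
off = pose_offset((252.0, 101.5, 0.8))
print(apply_offset((260.0, 110.0), off))  # corrected robot target
```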

Vision plays another role in manufacturing flexibility—the tracking or identification of parts by model. We do this by marking the parts with 2-D matrix codes or by using vision to identify part features to differentiate model type. The 2-D matrix codes are also being used for part traceability. We encode information on the part's birth history, such as date and time of manufacture or which machine was used, and this travels with the part as it progresses down the line. The data get scanned with a 2-D code reader, which is actually a dedicated vision system, and stored in a database. The number of vision systems for traceability is growing at a much faster pace than vision for inspection or robot guidance.
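
The traceability flow she describes is essentially decode-then-store. The sketch below assumes the 2-D code reader has already returned its payload as a string; the payload layout (part ID, station, timestamp) and the database schema are invented for illustration.

```python
# Traceability sketch: store a decoded 2-D matrix payload in a database.
# Payload layout (part id | station | ISO timestamp) and schema are
# invented for illustration; a production reader and database differ.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE birth_history (part_id TEXT, station TEXT, born_at TEXT)")

def record_scan(payload):
    """Split the payload the code reader returned and persist it."""
    part_id, station, born_at = payload.split("|")
    db.execute("INSERT INTO birth_history VALUES (?, ?, ?)",
               (part_id, station, born_at))
    db.commit()

# Pretend the dedicated vision system just read this off a stamped part:
record_scan("F1A2B3C4|weld_cell_07|2003-11-14T06:32:00")
print(db.execute("SELECT * FROM birth_history").fetchall())
```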

VSD: Within each application area, what are the most important performance criteria when qualifying a system?

Bolhouse: Robustness is the most important performance criterion across all applications. The systems must be tolerant to variations in the environment, be they changes in incoming parts or the ambient light. We expect the equipment to work without operator intervention. The days when we would routinely accept constant vision tweaks are gone. If it doesn't work reliably, it will be turned off and removed.

VSD: Do different vision technologies lend themselves to specific automotive test applications?

Bolhouse: Most of our vision applications use reflected light in the visible spectrum. X-ray, infrared, and ultrasonic technologies are occasionally used for testing preproduction prototypes and for materials or process development but are not generally used in automotive manufacturing. I do see a trend to the near-IR with advances in LED lighting, but this is really just an extension of the visible spectrum. It turns out that solid-state cameras are very responsive in the red to near-IR spectrum.

We can take advantage of this spectral response and the monochromatic output of LEDs to mitigate the effects of ambient light. A bandpass filter matched to a near-IR LED will block out much of the ambient light in the factory and pass only the reflected illumination from the light source. LEDs provide a host of other benefits for machine vision installations: long life, good aging characteristics, and energy-efficient illumination. In fact, LED lighting is one of the key factors in the increase in system robustness we've seen with vision.
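
A back-of-envelope calculation suggests why the matched filter helps. Assuming ambient light is roughly flat across a sensor's 400-1000 nm response and the filter passes a 40 nm band around an 850 nm LED (both simplifications, so the figure is only illustrative):

```python
# Back-of-envelope ambient rejection from a matched bandpass filter.
# Assumes ambient power is flat across the sensor's response band and
# the LED emits entirely inside the passband; both are simplifications,
# so the result is illustrative only.
sensor_band_nm = 1000 - 400   # visible through near-IR sensor response
filter_pass_nm = 40           # e.g. 830-870 nm around an 850 nm LED

rejection = sensor_band_nm / filter_pass_nm
print(f"ambient reduced ~{rejection:.0f}x; LED signal passes largely intact")
```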

VSD: Does Ford perform its own systems integration or look to outside integrators for support? If so, what are the criteria for approving an outside source?

Bolhouse: Ford relies on our full-service suppliers to provide fully integrated process solutions to the manufacturing plant. If vision is a very small portion of the cell, we won't specify how to do the vision function or whose equipment should be used. This gives our equipment suppliers a lot of latitude in selecting a vision system most compatible with their equipment. It also makes them completely responsible for providing an acceptable solution. What you learn about vision systems is that there are many answers to the same problem. No one technology has a corner on the market.

It's a little different for projects where vision is the primary function of the cell. Then we're more sensitive about the need to minimize the different types of equipment for our plants to maintain and support. A solution proven at one site will be replicated across many different locations. We try to balance the benefits of commonality of equipment in our plants against being able to optimize an individual solution.

VSD: How do you envision the future of machine vision in the automobile-manufacturing industry?

Bolhouse: I think we're going to see increased growth in vision-guided robotics in the automotive industry. We have operations where we have to rotate our people through every couple of hours because of the ergonomics. These jobs are best done by a machine, where repetitive stress is not an issue. In the past we've been unable to automate these operations, either because it was cost prohibitive or because the automation systems were not reliable enough. That is changing with the recent advances in robots and vision. I've seen demonstrations of vision-based line tracking, where we can pick and place or assemble "on the fly" without stopping the line. This development is huge for the automotive assembly plant, where most operations happen on a moving line and it is incredibly expensive to install a stop station to automate assembly.
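
The geometry behind line tracking is simple to sketch: the camera reports where a part was at detection time, the conveyor speed is known (typically from an encoder), and the robot aims at the predicted position at intercept time. All numbers below are illustrative.

```python
# Line-tracking sketch: predict where a moving part will be when the
# robot can reach it, so picking happens without stopping the line.
# Speeds, positions, and times are illustrative numbers.
def intercept_target(x_detect_mm, t_detect_s, line_mm_per_s, t_intercept_s):
    """Part position along the line at the planned intercept time,
    assuming constant conveyor speed (encoder-tracked in practice)."""
    return x_detect_mm + line_mm_per_s * (t_intercept_s - t_detect_s)

# Camera saw the part at 120 mm at t = 0.0 s; the line runs 150 mm/s;
# motion planning says the robot arrives at t = 0.6 s:
print(intercept_target(120.0, 0.0, 150.0, 0.6))  # -> 210.0 mm downstream
```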

I also see vision replacing precision locating fixtures and tools. There really isn't any reason why automotive assembly cannot be more like semiconductor or electronics assembly, where vision is used in every operation for flexibility and quality. Instead of custom-designed assembly equipment, you see mostly generic, computer-controlled processing equipment, with vision handling the mix of components at the precision the operation requires. We're already seeing generic welding cells for body construction. Why not similar cells for assembly?

VSD: How will vision systems have to change to successfully meet those emerging applications?

Bolhouse: Vision will have to become easier to program. If vision systems were more like sensors than vision computers, they would be more readily accepted. Self-learning algorithms could be used to separate good parts from bad. Calibration tools for vision-guided robotics should be integral to the vision system, and systems should come packaged ready for the factory environment.
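
One simple reading of "self-learning" is to model only known-good parts and flag anything that falls outside that model. The sketch below does this with a mean-and-standard-deviation threshold on a single measured feature; the feature values are invented.

```python
# "Self-learning" inspection sketch: fit a model of good parts only,
# then flag anything far from it. Feature values are invented.
import statistics

good = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0]  # e.g. measured gap width, mm
mu = statistics.mean(good)
sigma = statistics.stdev(good)

def is_good(measurement, k=3.0):
    """Accept parts within k standard deviations of the learned mean."""
    return abs(measurement - mu) <= k * sigma

print(is_good(10.05))  # True: in family with the training parts
print(is_good(11.20))  # False: flagged as out of family
```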

The technology has already come a long way with the network connection being standard on most smart cameras. We're seeing the benefits of this as smart cameras continue to proliferate.
