Vision-guided robot fastens lug nuts

Sept. 28, 2009
A combination of vision and robots automates the previously challenging manual process of running down and torquing automotive lug nuts

Running down and torquing the lug nuts that hold a wheel to the hub is seemingly one of the simpler aspects of building an automobile, but it has proven one of the most difficult to automate. The job is also difficult to perform manually because of the size and weight of the nut runner and the need to tighten the nuts on two wheels in approximately 40 s. If the position of the lug nuts is known, a robot can easily position the nut runner to deliver the needed torque. The problem is that the vehicle is typically only roughly positioned by a conveyor, and the wheels themselves are free to rotate, tilt, and turn. An ordinary blind robot would therefore never be able to find the nuts.

Radix Controls (Oldcastle, ON, Canada; www.radixcontrols.com) successfully automated this application by using a vision system to determine the position of the wheel, including its fore-and-aft and side-to-side positions and its orientation about three rotational axes. With this information, the robot can easily move the nut runner into the exact position and tighten the nuts. Automating this application made it possible for two people to move from difficult and stressful jobs to more proactive roles. "As far as we know, this is the first time this application has been successfully automated with the use of machine vision," says Shelley Fellows, vice president of operations for Radix Controls.

Challenge of automating difficult manual task
The automotive assembly plant involved in this application builds vehicles 24 hours a day with just-in-time production scheduling. At the previous assembly-line station, two operators (one on each side of the vehicle) place wheels onto the four hubs. The operators then place a nut on each wheel stud and turn the nut a few times. The assembly-line conveyor then moves the vehicle to the next station, where the lug nuts are manually torqued down. In the past, an operator on each side of the vehicle would locate the first tire, guide the nut runner into position, torque the nuts down, move to the second wheel, guide the nut runner into position, and torque the second group of nuts. The operators have only 43 s to complete this entire operation.

"The nut runner is heavy, unwieldy, and generates a lot of counterforce," Fellows says. "As a result, this is a very physically demanding job that is prone to workplace injuries. All of the major automobile manufacturers have tried to automate this job, but they have run into some very significant challenges. These challenges arise from the fact that the vehicle cannot be repeatably positioned on the assembly line."

The conveyor moving the vehicle cannot position it along the line of travel (the x-axis) or along the axis perpendicular to the line of travel (the y-axis) accurately enough for a robot to place the nut runner on the nuts. Even if the conveyor were more accurate, the wheel could still rotate about three different axes: it can be turned slightly to the left or right, tilted (camber), and rotated around the axle. All five of these axes of motion must be precisely known for the robot to position the nut runner with the required degree of accuracy. Adding to the challenge, the plant produces vehicles with a wide range of wheel types that are intermixed on the production line.
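As a minimal sketch of what the vision system must report for each wheel, the following container groups the five measured values described above. The field names, units, and tolerance check are illustrative assumptions, not details from the Radix system.

```python
from dataclasses import dataclass

@dataclass
class WheelPose:
    """Five degrees of freedom the vision system must measure per wheel."""
    x_mm: float        # position along the line of travel
    y_mm: float        # position perpendicular to the line of travel
    steer_deg: float   # wheel turned slightly left or right
    camber_deg: float  # wheel tilted in or out
    spin_deg: float    # rotation of the wheel about the axle

    def within_envelope(self, max_steer=10.0, max_camber=5.0):
        """Sanity check before commanding the robot (thresholds are assumed)."""
        return abs(self.steer_deg) <= max_steer and abs(self.camber_deg) <= max_camber
```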

Developing the machine-vision application
"This application is impossible to automate unless the robot can reliably and repeatably locate the nuts," notes Fellows. Radix Controls developed a machine-vision application that could determine the position of the nuts in five different axes within a few seconds as needed to meet the cycle time requirements. The application relies upon two Cognex In-Sight 5403 vision systems to locate each wheel. Radix selected In-Sight because it provides a complete solution in a modular package that does not require any additional hardware or other equipment. The 60 x 110 x 80-mm package of the vision system easily fits within the tight confines of the manufacturing plant. The In-Sight 5403 model was selected because it offers a resolution of 1600 x 1200 pixels and an image acquisition time of 15 frames/s; it is suited to meet the high accuracy and short cycle time requirements of this application.

The vision application relies on the Cognex PatMax pattern-matching technology to quickly locate the wheel in the image. PatMax can be trained to recognize any pattern by highlighting that pattern in an image taken by the camera. Radix engineers programmed PatMax to recognize each of the wheels used in the plant, and the system has been set up so that plant personnel can easily train it on new wheel types. The information backbone that runs the assembly line tells the vision system which type of wheel will be on the next vehicle, and the vision system loads the appropriate program.
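The article does not describe the program-selection logic in detail; the following is a hedged sketch of how a station controller might map the wheel type announced by the plant's information backbone to a trained vision job. The dictionary contents, job names, and the load_job callable are hypothetical, not part of the In-Sight API.

```python
# Hypothetical mapping from wheel types announced by the plant backbone to
# trained pattern jobs. All names are illustrative only.
WHEEL_JOBS = {
    "steel_16in": "wheel_steel_16.job",
    "alloy_17in": "wheel_alloy_17.job",
    "alloy_18in": "wheel_alloy_18.job",
}

def select_job(wheel_type, load_job):
    """Load the trained pattern job for the announced wheel type.

    load_job is assumed to be a wrapper around the camera's native
    job-loading command; it is not a documented In-Sight call here.
    """
    try:
        job = WHEEL_JOBS[wheel_type]
    except KeyError:
        raise ValueError(f"No trained pattern for wheel type {wheel_type!r}")
    load_job(job)
    return job
```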

FIGURE 1. Vision-guided robots position nut runners on wheels.

The Cognex circle finder tool is then used to determine the precise location of the center of the axle, and the company's edge tools inspect the feature in the center of the rim to determine the angle of rotation of the wheel. When the first image is taken, a laser generates a crosshair on each wheel. The first laser is then turned off, a second laser generates another crosshair from a different angle, and a second image is acquired. The edge tools inspect the crosshairs in each image, and the crosshair coordinates are passed to a program Radix wrote for the camera, which uses the differences between the two crosshairs to calculate the angles at which the wheel is turned and tilted.
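The article does not give the geometry behind that calculation. As a simplified illustration (not the Radix algorithm), assume the two laser crosshairs, triangulated against the known laser and camera positions, yield a handful of 3-D points on the wheel face; fitting a plane to those points gives the turn (steer) and tilt (camber) angles.

```python
import numpy as np

def wheel_angles(points_xyz):
    """Estimate steer and camber from 3-D points on the wheel face.

    points_xyz: (N, 3) array, N >= 3. Assumed axes: x = line of travel,
    y = toward the camera, z = up. Returns (steer_deg, camber_deg).
    """
    pts = np.asarray(points_xyz, float)
    centroid = pts.mean(axis=0)
    # Least-squares plane normal: direction of least variance of the points.
    _, _, vt = np.linalg.svd(pts - centroid)
    n = vt[-1]
    if n[1] < 0:                                 # orient the normal toward the camera (+y)
        n = -n
    steer = np.degrees(np.arctan2(n[0], n[1]))   # rotation about the vertical axis
    camber = np.degrees(np.arctan2(n[2], n[1]))  # tilt about the fore-aft axis
    return steer, camber
```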

FIGURE 2. Center feature in the wheel is used to determine the angle of rotation.

The system then passes this information to the controller of a Fanuc robot. The robot swivels its wrist to match the angles at which the wheel is tilted and turned and rotates the nut runner to match the wheel's angle of rotation. Next, it guides the nut runner square onto the lug nuts. The nut runner is cycled and tightens the nuts to the proper torque in a few seconds. The robot then moves to the other wheel on its side of the car and, again guided by the coordinates and angles provided by the machine-vision system, places the nut runner onto the lug nuts and tightens them. The application has been in operation for 18 months with greater than 99.6% uptime (including system errors such as missing tires, water on tires, and the like).
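A hedged sketch of how that guidance step could be composed from the measured values follows. It builds a target frame for the nut runner from the WheelPose fields introduced in the earlier sketch; the rotation order, axis conventions, and wheel-center height are assumptions, not the actual Fanuc program.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def nut_runner_target(pose, wheel_center_z_mm):
    """Build a 4x4 tool-frame transform from the measured wheel pose.

    Assumed axes: x = line of travel, y = toward the wheel (along the axle), z = up.
    """
    R = (rot_z(np.radians(pose.steer_deg))      # match the wheel's left/right turn
         @ rot_x(np.radians(pose.camber_deg))   # match the wheel's tilt
         @ rot_y(np.radians(pose.spin_deg)))    # clock the sockets to the studs
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = [pose.x_mm, pose.y_mm, wheel_center_z_mm]
    return T
```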

FIGURE 3. Close-up view of vision-guided robot and nut runner.

Calibrating the robot to the vision system
The ability to quickly calibrate the robot to the vision system is important because of the potential for the vision system to be bumped by equipment moving in the area. The operator actuates the calibration command on the vision system, which determines the centerline position of the wheel relative to its own coordinate system and sends the coordinates to the robot controller. The operator then jogs the robot to position the nut runner on the wheel, and the system determines the offsets between the robot's and the vision system's coordinate systems. Once the offsets are entered into the robot control system, the coordinate systems used by all of the vision systems and robots in the cell are synchronized. Using the custom features on a specialized calibration target, the work cell can be fully calibrated in less than a minute, and an automatic dynamic calibration sequence runs in less than 2 s per cycle.
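The arithmetic behind that offset step is not spelled out in the article; below is a simplified sketch, assuming the two frames are nominally parallel so that a pure translation suffices. A real cell might instead solve a full rigid transform from several reference points, and all numbers shown are made up.

```python
import numpy as np

def vision_to_robot_offset(p_vision_mm, p_robot_mm):
    """Offset that maps a vision-frame point onto the robot frame.

    p_vision_mm: the wheel centerline reported by the vision system.
    p_robot_mm: the same feature after the operator jogs the nut runner onto it.
    """
    return np.asarray(p_robot_mm, float) - np.asarray(p_vision_mm, float)

def to_robot_frame(p_vision_mm, offset_mm):
    """Convert a subsequent vision measurement into robot coordinates."""
    return np.asarray(p_vision_mm, float) + offset_mm

# Example: calibrate once, then convert new wheel-center measurements.
offset = vision_to_robot_offset(p_vision_mm=[412.0, 95.5, 310.2],
                                p_robot_mm=[1260.4, -88.1, 305.7])
wheel_center_robot = to_robot_frame([415.3, 96.0, 309.8], offset)
```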

Radix Controls provided the vision-guided robot application as part of a complete solution including custom lighting design; integrated robotic communication and robot programming; full controls design including safety controls; coordinated installation and startup; and systems training for operators, maintenance, and engineering.

"The key to successfully automating this application is the coupling of machine vision and robotics to accurately and repeatably guide the robot to the proper position," Fellows says. "Automation provides a substantial cost savings to the automobile manufacturer and also improves quality by ensuring that the lugs are repeatably tightened to the proper torque."


-- Posted by Carrie Meadows, Vision Systems Design, www.vision-systems.com
