Robotic Bin Picking System at Penna Flame Picks Large, Heavy Industrial Parts
What You Will Learn
- Manual packing of heavy, sharp-edged support arms was labor-intensive, unsafe, and led to employee turnover.
- The system employs a FANUC robot with a magnetized end effector, guided by structured light 3D vision and machine learning to accurately pick and place parts into boxes.
- Challenges such as occlusions, asymmetrical parts, and ordered picking were addressed through innovative algorithm development and skid design modifications.
- Safety features include physical fencing, laser scanners, force sensing, and automated stop protocols.
Everyone at Penna Flame Industries (Zelienople, PA, USA) agrees on one point: The worst job at the plant is packing leaf spring support arms.
A steel component used in vehicle suspension systems, each support arm weighs 37 pounds, measures 18 x 12 in., has semi-sharp edges, and is irregularly shaped.
Packing industrial parts for shipment is a common activity at Penna Flame. Launched in 1968, the family-owned business hardens steel components for its customers with either flame or electromagnetic induction processes. In either case, the goal is to increase the wear resistance and longevity of a variety of parts, such as gears, shafts, and axles.
Manual Packaging Processes
But packing the leaf spring support arms is no fun. “People quit over this stuff,” explains Michael Orr, vice president of operations at Penna Flame and grandson of the founder.
This is not hard to understand; the process is grueling. These are the steps:
- Wearing extra-thick work gloves because the pieces have sharp spurs, an employee picks up each piece from a 4-ft square wire bin that holds 110 pieces.
- The employee lays the part flat at the bottom of a plastic bin.
- Once there are eight pieces in the bin, the employee sprays them with oil-based rust preventative, flips them over, sprays them again, and then flips them up and shakes them to remove the excess oil.
- In the last step, the employee picks up each part and packs it in a wooden shipping box.
Once the box is full, he or she puts the lid on the box and bands the lid and box together with metal binding.
Related: Automated Bin Picking: New Dimensions for Success
Each wooden box holds a maximum of 23 parts if they are crammed together.
The same two employees typically pack these parts because most people at the plant don’t want to do it. Orr says they are “tough and strong” and possess a good work ethic and competitive spirit. They race to see who can pack more parts, and the winner gets a company-funded reward. “We had to turn it into a game, basically,” Orr says.
Developing an Automated 3D Bin Picking System
Because of the challenges with manually packing these parts, Penna Flame in 2021 decided to pursue robotic bin picking. The company already had four FANUC (Yamanashi, Japan) robots deployed in other areas at the plant. Why not buy another one for packaging?
Orr approached the Catalyst Connection (Pittsburgh, PA, USA), a private nonprofit organization that helps small- and medium-sized manufacturers adopt advanced technologies. The organization helped 26-employee Penna Flame secure two grants totaling $15,000. The Catalyst Connection also engaged the ARM Institute (Pittsburgh, PA, USA), which specializes in integrating AI-enabled robotics into manufacturing processes.
The ARM Institute developed a conceptual plan for packaging the support arms using a robot. Its experts also reached out to CapSen Robotics (Pittsburgh, PA, USA), which develops and markets AI-enabled 3D machine vision and motion planning software for robots.
Software engineers at CapSen Robotics worked with Penna Flame to develop algorithms customized for the company’s project.
How the Robotic Bin Picking System Works
With the new robotic bin picking process, a FANUC R-2000/165F robot, with a 165 kg payload and 2,655 mm reach, handles most of the work. Humans still put the lid on the wooden shipping boxes and band them.
To kick off the process, employees load the skid and drive it over to the packaging robot cell with a forklift.
Related: Advances in AI and 3D Vision Transform the Bin Picking Marketplace
The robot picks up each part with a magnetized end effector. There is a flat spot on each part where the magnet needs to connect with the part to hold it securely.
The robot manages the parts in a specific order. This is important: If parts are removed from the skid in the incorrect order, it could lead to instability in the entire stack of parts, explains Prajwal Poojari, robotic software engineer at CapSen Robotics, and a key member of the project team.
And to work best, Orr adds, the robot also needs a well-defined and consistent spot for each part, “so it can grab the flat part, and it is sturdy.”
However, the wire basket did not work for the robot because it could not dislodge parts if they got stuck in the wire mesh. So, Orr designed a skid that holds 72 parts in six layers.
The steel skid has a divider down the middle and four posts, one in each corner, for stability. The parts are divided between layers: 16, 16, 14, 14, 12. For each layer, an equal number of support arms are placed on each half of the skid.
Automating the Bin Picking Sequence
Once the robot picks up the part, it dips the part in the oil and then rotates the part slowly to dislodge the excess oil. Finally, the part is placed in a precise location within one of four wooden boxes lined up near the robot. As the last step in the process, the robot packs 18 pieces per box, meaning each of the 72 pieces has a predefined destination within one of the four boxes.
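Because there are 72 pieces and four boxes of 18, the fixed destination for each piece can be thought of as a simple lookup. The sketch below is illustrative only; the real system stores taught robot poses for each slot, and the function name and indexing scheme are assumptions, not Penna Flame's actual code.

```python
def destination(piece_index: int) -> tuple[int, int]:
    """Map a piece index (0-71) to a (box, slot) pair,
    with four boxes of 18 slots each.

    Illustrative sketch only; the production system would map
    each slot index to a taught robot placement pose.
    """
    if not 0 <= piece_index < 72:
        raise ValueError("piece index out of range")
    return piece_index // 18, piece_index % 18
```

With this scheme, piece 0 lands in the first box's first slot and piece 71 in the fourth box's last slot.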
It takes the robot 1.5 hours to complete the process for all pieces. “That’s running at 80% speed. We like speeds that aren’t crazy fast,” Orr says.
Directing the Robot’s Movements with 3D Imaging and Machine Learning
CapSen Robotics uses a combination of rules-based programming and machine learning methods to guide the robot, based on input from a vision system.
The machine vision hardware consists of a 3D camera (Zivid 2+LR110) from Zivid (Oslo, Norway), which is installed 10 feet above the robot cell, pointing down on the skid. The process also uses a proximity sensor, which is secured near the robot’s magnetic end effector.
Related: Vision System Enables Robotic Picking at Industrial Bakery
The Zivid camera uses a type of 3D imaging called structured light—a process in which a pattern of light is projected on an object or scene, and then the reflections are captured in images taken by a 2D sensor. The camera produces a 3D point cloud by analyzing the distortions in the patterns of light.
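The geometry behind structured light is classic triangulation: the farther a projected pattern shifts on the 2D sensor relative to where it would land on a reference plane, the closer the surface is. The formula below is the generic textbook relationship, shown for illustration; Zivid's actual processing pipeline is proprietary and considerably more sophisticated.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Classic triangulation used in structured-light and stereo
    systems: z = f * b / d, where d is the pixel shift (disparity)
    of the projected pattern observed by the 2D sensor, f is the
    focal length in pixels, and b is the projector-camera baseline.

    Generic illustration only, not Zivid's proprietary algorithm.
    """
    d = np.asarray(disparity_px, dtype=float)
    return focal_px * baseline_m / d
```

A larger pattern shift means a closer surface; computing this for every pixel yields the depth map behind the point cloud.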
CapSen Robotics’ algorithm uses point cloud information to determine an object’s location and orientation. “The point cloud has the color information in the form of RGB and the depth information in the form of XYZ,” says Poojari.
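A point cloud of this kind is often stored as an N x 6 array, one XYZRGB row per point. As a rough illustration of how location and orientation can be pulled from such data, the sketch below uses the centroid for position and principal component analysis for orientation; this is a simplified stand-in, not CapSen Robotics' algorithm.

```python
import numpy as np

def estimate_pose(points_xyzrgb):
    """Estimate a coarse position and orientation for a segmented
    object stored as an (N, 6) array of XYZRGB rows.

    Position = centroid of the XYZ coordinates; orientation =
    principal axes from an SVD of the centered coordinates.
    A simplified stand-in for a production pose estimator.
    """
    xyz = np.asarray(points_xyzrgb, dtype=float)[:, :3]
    centroid = xyz.mean(axis=0)
    centered = xyz - centroid
    # Right singular vectors are the object's principal axes.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centroid, vt
```

For an elongated part like a support arm, the first principal axis tracks the part's long dimension, which is enough to orient a pick.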
Based on that information, another algorithm generates a motion plan, “so that the robot can go towards the part in a collision free motion,” says Poojari.
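In its simplest form, a collision-avoiding move out of a cluttered bin can be sketched as three waypoints: lift straight up, travel at a safe height, descend to the target. This toy version stands in for a real motion planner, which also checks the full arm geometry against obstacles along the path; all names and the clearance value here are assumptions.

```python
def lift_travel_descend(pick_xyz, place_xyz, clearance=0.3):
    """Generate a minimal waypoint sequence for a clutter-safe move:
    lift clear of the skid, travel at a safe height, then descend.

    Toy stand-in for a real motion planner; clearance is an assumed
    tuning parameter in meters.
    """
    x0, y0, z0 = pick_xyz
    x1, y1, z1 = place_xyz
    safe_z = max(z0, z1) + clearance
    return [
        (x0, y0, safe_z),  # lift clear of the clutter
        (x1, y1, safe_z),  # travel at the safe height
        (x1, y1, z1),      # descend to the placement pose
    ]
```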
Important Verification Step in Robotic Bin Picking Process
Once the robot picks up the part off the skid, the robot moves the part to a predetermined spot, so it can isolate the part from the clutter on the skid. The camera then snaps a second image for a verification step in which the software uses just RGB information to ensure that the support arm is stuck to the magnetic end effector within acceptable parameters. If the robot is not holding the part correctly, it puts the part in a reject bin.
Verification is a key step in the process, Poojari says. If the part is not oriented correctly, the part may collide with the bin holding the oil, potentially damaging the robot. But if it is positioned correctly on the end effector, the robot proceeds with the rest of the process.
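Conceptually, an RGB-only verification step compares the image of the isolated, gripped part against what a correct grip should look like. The sketch below uses mean absolute pixel difference as a deliberately simple proxy for whatever matching CapSen's software performs; the threshold is an assumed tuning parameter.

```python
import numpy as np

def grip_ok(rgb_crop, reference, max_mean_diff=20.0):
    """Verify the part is seated on the magnet by comparing the
    isolated-part image against a reference image of a known-good
    grip, using mean absolute pixel difference.

    Illustrative proxy only; the threshold is an assumption.
    """
    a = np.asarray(rgb_crop, dtype=float)
    b = np.asarray(reference, dtype=float)
    return float(np.abs(a - b).mean()) <= max_mean_diff
```

If the check fails, the caller would route the part to the reject bin rather than proceed to the oil dip.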
Related: Seco Tools Installs Custom Solution to Inspect Indexable Inserts Used in Machining Operations
The bin picking system does not need a third image or second verification step to place the part in the assigned position in one of the wooden packing boxes. That’s because the positions in the packing bins are fixed. “We have that functionality, but we have not used it,” Poojari says.
The software is loaded on a standard PC, and the processing occurs locally. “And then towards the end, the final information is sent to the robot’s controller so that the robot can execute the motion,” Poojari says.
The application communicates with the robot’s controller via TCP/IP. The camera communicates with the PC using 10 Gbps Ethernet.
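At its core, a TCP/IP link like this means opening a socket to the controller and writing the target pose in an agreed-upon format. The message format below is invented for this sketch; real FANUC integrations use a vendor-defined protocol, and the host, port, and pose fields are assumptions.

```python
import socket

def send_pose(host, port, pose):
    """Send a target pose to a robot controller as one
    comma-separated ASCII line over TCP/IP.

    The wire format here is invented for illustration; a real
    FANUC integration uses a vendor-defined protocol.
    """
    msg = ",".join(f"{v:.3f}" for v in pose) + "\n"
    with socket.create_connection((host, port), timeout=5.0) as sock:
        sock.sendall(msg.encode("ascii"))
    return msg  # returned so callers can log what was sent
```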
Related: Robots Leverage 3D Machine Vision to Handle Auto Parts
Because the parts are not shiny or reflective, the application only needs ambient light, Poojari says. Orr notes that Penna Flame redid the ceiling lights in this part of the operation recently.
Programming Challenges Abound for Picking Large, Heavy Parts
Developing the bin picking system was not easy. There were several challenges to overcome because the support arms are large and asymmetrical and placed side-by-side on the skid, leading to occlusions in the camera’s field of view (FOV). The combination made detecting the part and its pose challenging. “We had to develop new algorithms and be smart enough to combine the rules-based approach and the learning-based approach together,” says Poojari.
And because the part is large with a correspondingly large amount of CAD data, there are a lot more calculations involved than would be the case for a smaller object, Poojari adds.
The fact that the application is an “ordered picking application” also adds a layer of complexity. “We had to make sure that the camera is looking at the right area and not skipping a part or skipping two or three parts down the line,” he explains. Otherwise, the robot may not pick up the parts in the correct order. “We have to detect the specific object that we want to pick next,” Poojari adds.
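The ordered-picking constraint can be sketched as a lookup against a fixed removal sequence: the system must find the specific next part in the detections, not just any reachable part. The identifiers and data structures below are assumptions for illustration, not CapSen's implementation.

```python
def next_pick(detections, pick_order, already_picked):
    """Choose the next part according to a fixed unloading sequence.

    detections: dict mapping part_id -> detected pose (vision output)
    pick_order: list of part_ids in the safe removal order
    already_picked: set of part_ids removed so far

    Returns (part_id, pose) for the first unpicked part in the
    sequence, or None if that part is not yet detected (e.g. it is
    occluded). Sketch only; ids and structures are assumptions.
    """
    for part_id in pick_order:
        if part_id in already_picked:
            continue
        if part_id in detections:
            return part_id, detections[part_id]
        return None  # the required next part is not visible yet
    return None  # everything has been picked
```

Returning None when the required part is occluded, rather than grabbing a different visible part, is what keeps the stack stable.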
Designing the Placement Scheme for Parts on the Skid
For his part, Orr worked through an unexpected challenge: configuring the leaf spring support arms on the skid. The parts come in a left-side and right-side version—essentially mirror images of each other.
Related: 3D-Vision-Guided-Robot Automates Battery Production Process
The Penna Flame team started with the left-side version and got the robot to pick up the parts from the skid securely and then run through the remainder of the process without issue.
That was not the case for the right side. “It was less stable with the same sequence,” Orr says. “We had to reengineer how we stack them.”
To solve the problem, Orr configured the layout on the skid with 0.25 in of space between the parts in the first layer. “We just widened the base,” he says.
Developing Safety Measures
The robot works in an area surrounded by physical safety fencing and safety laser scanners—Keyence (Osaka, Japan) SZ-04M—that the Penna Flame team configured and installed.
The cell has 7-ft physical fencing along both sides and safety scanners mounted vertically at the front and back of the space. Each scanner covers up to 14 ft vertically and a 14-ft semicircle.
Related: Amazon Testing Robotic Arm that Identifies Individual Products
For added safety, the robot has force sensing, and Orr says they set the sensitivity percentage high. In addition, if the robot can’t pick up a part correctly in three consecutive tries, it stops.
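The three-consecutive-failures shutdown rule described above reduces to a small piece of logic. The sketch below assumes the cell tracks pick attempts as a list of booleans, newest last; that representation and the function name are assumptions for illustration.

```python
def should_stop(attempt_results, max_failures=3):
    """Return True when the most recent picks are all failures,
    mirroring a stop-after-three-consecutive-misses rule.

    attempt_results: list of booleans (True = successful pick),
    newest last. The data structure is an assumption.
    """
    if len(attempt_results) < max_failures:
        return False
    # Stop only if none of the last max_failures attempts succeeded.
    return not any(attempt_results[-max_failures:])
```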
Timeline for Lights Out Use of AI-Enabled 3D Machine Vision Process
As of early October, the system was running autonomously with a human operator in the vicinity working on other tasks but also monitoring the robot.
Orr hired an outside integration firm to complete an independent safety assessment. Once that step is complete, the goal is for the robot to pack the parts overnight while the plant is closed. “It should be able to pack without anyone here,” Orr says. “That’s exciting.”
About the Author
Linda Wilson
Editor in Chief
Linda Wilson joined the team at Vision Systems Design in 2022. She has more than 25 years of experience in B2B publishing and has written for numerous publications, including Modern Healthcare, InformationWeek, Computerworld, Health Data Management, and many others. Before joining VSD, she was the senior editor at Medical Laboratory Observer, a sister publication to VSD.