Intelligent feeding system combines robot, vision, and flexible feeder into one software environment

Sept. 10, 2019
Integrating three separate technologies is a serious challenge for vision system integrators.

When working with flexible feeders, vision systems, and robots in a single application, one issue that systems integrators and end users alike often struggle with is interfacing the three separate technologies.

This is according to Epson Robots (Carson, CA, USA; www.epson.com/robots), a company that builds robots primarily for assembly applications and is often tasked with developing robotic systems in which parts must be fed for assembly. Over the past five years or so, the company has seen an increase in the number of customers reaching out with problems interfacing flexible feeders, machine vision, and robots.

“Our application team spent a lot of time helping customers with problems unrelated to our robots, but ultimately when there is a system problem, we want to help get their system running,” says Rick Brookshire, Director of Product Development, Epson Robots.

Communication between the devices was the main problem for many of Epson’s customers. A feeder’s command set comprises anywhere from 10 to 40 actions that can be issued to the device, with the commands streamed over Ethernet or RS-232, explains Brookshire.
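To make that concrete, the sketch below shows roughly what issuing a single feeder command over Ethernet looks like at the application level. This is a minimal illustration, not Epson’s or Asyril’s actual interface: the IP address, port, command string, and line terminator are all hypothetical, and a real feeder’s protocol is defined by its vendor’s documentation.

```python
import socket

# A minimal sketch, assuming a feeder that accepts ASCII commands over
# Ethernet. The address, command name, and "\r\n" terminator are
# hypothetical; a real feeder's protocol comes from vendor documentation.
FEEDER_ADDR = ("192.168.0.50", 4001)  # assumed IP address and port

def send_command(sock, command, timeout=2.0):
    """Send one command and block until a terminated reply arrives."""
    sock.settimeout(timeout)
    sock.sendall((command + "\r\n").encode("ascii"))
    reply = b""
    while not reply.endswith(b"\r\n"):   # accumulate until the terminator
        chunk = sock.recv(256)
        if not chunk:
            raise ConnectionError("feeder closed the connection")
        reply += chunk
    return reply.decode("ascii").strip()

with socket.create_connection(FEEDER_ADDR) as sock:
    print(send_command(sock, "VIBRATE 500"))  # hypothetical command
```

Even in this simplified form, the timeout and the read-until-terminator loop hint at the hiccups Brookshire describes next: a reply that arrives faster, or in a different format, than the application expects will stall or confuse exactly this kind of loop.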

“Some of the things that can happen involve the device responding back too quickly, not recognizing the correct string, and these types of hiccups all occur before work on the actual application begins,” he says. “We could have stopped here and built a communications interface, but what we found in talking with customers is that—after getting over the communications issue—many customers had developed code such that it was very serial and step-by-step in the way it operated, instead of doing things in parallel.”

He continues, “We would then help them with synchronizing things better, but the more we looked at these problems, the more we thought we could develop a system that handled such tasks for the end user. They would have to write hundreds or thousands of lines of code, but now most of it would be handled for them automatically.”
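As an illustration of that serial-versus-parallel distinction, the following sketch overlaps the feeder’s next vibrate-and-image pass with the robot’s final place motion, so fresh pick targets are ready when the robot returns. The helper functions are simulated stand-ins (sleeps and fixed poses), not real device calls; only the scheduling pattern is the point.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# The three helpers below are hypothetical stand-ins, simulated with
# sleeps, for real feeder/vision and robot calls.

def vibrate_and_locate():
    time.sleep(0.5)                      # vibrate, settle, image, locate
    return [(10.0, 20.0), (30.0, 15.0)]  # simulated pick poses (x, y)

def robot_pick(pose):
    time.sleep(0.2)                      # simulated pick motion

def robot_place():
    time.sleep(0.2)                      # simulated place motion

def run_overlapped(cycles=3):
    with ThreadPoolExecutor(max_workers=1) as pool:
        targets = vibrate_and_locate()
        for _ in range(cycles):
            *earlier, last = targets
            for pose in earlier:
                robot_pick(pose)
                robot_place()
            robot_pick(last)             # last known part is off the platform
            # The feeder surface is now clear, so the next vibrate-and-image
            # pass can run while the robot is still placing the final part.
            nxt = pool.submit(vibrate_and_locate)
            robot_place()
            targets = nxt.result()

run_overlapped()
```

A purely serial version would run `vibrate_and_locate()` only after the last `robot_place()` returns, leaving the feeder and camera idle while the robot moves and the robot idle while the feeder vibrates, which is the step-by-step pattern Brookshire says many customers wrote.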

As a result, Epson developed the IntelliFlex Parts Feeding System, which combines robotics, machine vision, and a flexible feeding system in one software environment. The system is based on an Epson 6-Axis or SCARA robot for picking singulated parts, Epson’s Vision Guide machine vision system, and a flexible feeder from Asyril (Villaz-St-Pierre, Switzerland; www.asyril.com); the combination supports parts from 5 to 40 mm in size.

Despite having its own internally developed flexible feeder, Epson evaluated all of the options on the market and ultimately decided that the best-in-class choice for customers was a flexible feeder from Asyril.

“We felt the best feeder option was from Asyril because these systems can move the parts in a more uniform way versus most of the other feeders,” says Brookshire. “We felt like if we integrated our vision system with the feeder, we would be able to do a lot of the work for the end users and prevent them from writing a significant amount of code.”

Epson’s Vision Guide system is based on several GigE camera options from Basler (Ahrensburg, Germany; www.baslerweb.com), lenses from Computar (Cary, NC, USA; www.computar.com), and a high-flex cable designed by an outside vendor specifically for Epson.

The Vision Guide also features Epson’s CV2 controller, which enables the system to connect to four GigE cameras and two USB cameras and offers powerful CPUs for vision processing.

“Epson has worked with Basler for a long time, and we specifically support Basler cameras because we know the results will be good,” says Brookshire. “In doing so, we want to ensure that when our customers go to implement their system, they will have the best robot, best feeder, and best vision system. Part of that vision system, of course, means acquiring the best possible images.”

Another task the system completes automatically is tuning. Parts are dropped onto the feeder, and the vision system and feeder work together to determine the right tuning parameters for those particular parts. Previously, these parameters were set by hand, which took time and effort, says Brookshire.
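A rough sketch of what such a closed-loop tuning search could look like appears below: try vibration settings, let the vision system score how well the parts separate, and keep the best result. The parameter names, ranges, and scoring function are assumptions for illustration; Epson has not published the details of its tuning algorithm.

```python
import itertools

def count_singulated(amplitude, frequency):
    # Stand-in for real feeder + vision calls. This synthetic score
    # peaks near one setting so the sketch runs end to end.
    return 20 - abs(amplitude - 60) // 10 - abs(frequency - 80) // 10

def auto_tune(amplitudes=range(20, 101, 20), frequencies=range(40, 121, 20)):
    best_score, best_params = -1, None
    for amp, freq in itertools.product(amplitudes, frequencies):
        score = count_singulated(amp, freq)   # vibrate, image, count
        if score > best_score:
            best_score, best_params = score, (amp, freq)
    return best_params

print(auto_tune())   # -> (60, 80) with the synthetic scorer
```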

“The software makes it easier to put flexible feeder applications together, but operators still need to work with the vision system to find parts properly, deal with potential lighting issues, and put in a few lines of code to direct the robot where to place the individual parts,” he says. “But overall, the integration of all these components is handled automatically through the IntelliFlex software, which helps avoid some of the issues we’ve encountered with customers over the years.”
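For a sense of scale, the “few lines of code” that remain might look something like the following. The object and function names are hypothetical Python stand-ins written so the sketch runs without hardware; Epson robots are actually programmed in the company’s SPEL+ language.

```python
# Hypothetical stand-in for the small amount of application code that
# remains: telling the robot where to put each located part.

PLACE_POSE = (250.0, 100.0, 15.0)       # assumed fixture location (x, y, z)

class _DummyRobot:                      # stand-in so the sketch runs
    def pick(self, pose): print("pick", pose)
    def move_to(self, pose): print("move", pose)
    def release(self): print("release")

def place_all(found_parts, robot):
    for pick_pose in found_parts:       # poses reported by the vision system
        robot.pick(pick_pose)
        robot.move_to(PLACE_POSE)
        robot.release()

place_all([(12.5, 48.0), (33.1, 20.4)], _DummyRobot())
```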

Additionally, Epson Robots always asks its customers to send sample parts ahead of time for testing. Engineers then examine the parts, put them on the feeder, and verify that the parts can move and singulate, that the vision system can find and distinguish them, and that the robot can pick the singulated parts, says Brookshire.

“For example,” says Brookshire, “any time there are parts that get easily tangled together via hooks, clasps, or springs, it is very difficult to separate these out. This is a big challenge and is part of the reason we want to look at everyone’s parts first, to set them up for success.” 

About the Author

James Carroll

Former VSD Editor James Carroll joined the team in 2013. Carroll covered machine vision and imaging from numerous angles, including application stories, industry news, market updates, and new products. In addition to writing and editing articles, Carroll managed the Innovators Awards program and webcasts.
