Industrial Automation: Autonomous robots fuse multi-sensor data

Jan. 15, 2014
In automated factories and warehouses, it is necessary to move material from one location to another in environments that may include confined passageways that may be populated by technicians. To accomplish this task, autonomously guided vehicles (AGVs) are often equipped with inductive sensors to track metal strips embedded on the factory floor. By tracking these strips, the AGV is capable of autonomously moving from one location to another.

"One of the drawbacks of this approach," says Parker Conroy, Applications Engineer at Adept (Amherst, NH; www.adept.com), "is that each facility must be modified before such AGVs can be deployed."

Rather than take this approach, Adept has chosen to develop a system based on fusing multi-sensor data in its latest Lynx series of AGVs. This allows the Lynx platform to autonomously navigate a factory floor without the use of embedded floor tracking systems and thus reduce the costs of deploying such systems.

Designed and offered as an OEM component, the Lynx is a two-wheeled AGV that integrates a number of different types of sensors. These sensors are interfaced to an on-board processor that combines an embedded microcontroller and a PC. All of these operate in parallel to increase the positional accuracy of the AGV.

On each wheel of the Lynx, a quadrature encoder is deployed to convert the angular position of the wheel shaft to a digital value. These values are then used by the on-board microcontroller to determine how far the AGV has travelled. To sense and measure the orientation of the Lynx, a solid-state gyroscope from Analog Devices (Norwood, MA, USA; www.analog.com) is also interfaced to the microcontroller. In this way, both the distance the robot has travelled and its orientation can be measured.
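The encoder-plus-gyro scheme described above is classic differential-drive dead reckoning. A minimal sketch follows; the tick count, wheel radius, and track width are hypothetical values, not Lynx specifications, and the gyro is used (as the article suggests) to correct the slip-prone encoder heading.

```python
import math

TICKS_PER_REV = 2048   # hypothetical encoder resolution
WHEEL_RADIUS = 0.1     # meters, hypothetical
TRACK_WIDTH = 0.4      # wheel separation in meters, hypothetical

def ticks_to_distance(ticks):
    """Convert quadrature encoder ticks to linear wheel travel (meters)."""
    return (ticks / TICKS_PER_REV) * 2.0 * math.pi * WHEEL_RADIUS

def update_pose(x, y, theta, d_left, d_right, gyro_heading=None):
    """Dead-reckon a new (x, y, theta) pose from per-wheel travel.

    If a gyro heading is available it overrides the encoder-derived
    heading, which drifts when the wheels slip.
    """
    d_center = (d_left + d_right) / 2.0
    if gyro_heading is not None:
        theta_new = gyro_heading
    else:
        theta_new = theta + (d_right - d_left) / TRACK_WIDTH
    mid = (theta + theta_new) / 2.0   # midpoint heading over the arc
    return (x + d_center * math.cos(mid),
            y + d_center * math.sin(mid),
            theta_new)
```

For example, equal wheel travel of 1 m moves the pose straight ahead by 1 m with no heading change, while opposite wheel travel turns the vehicle in place.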

By guiding the robot around the environment in which it is required to operate, a two-dimensional map of the environment is created using data from a laser rangefinder.

For the AGV to operate in a factory environment, a map of the surroundings must first be made and then correlated with the data from the wheel encoders and gyro. To generate this map, a laser rangefinder from Sick (Waldkirch, Germany; www.sick.com) is interfaced to the on-board PC over an RS-232 interface. By mounting the laser rangefinder at the front of the AGV and guiding the robot around the environment in which it is required to operate, a two-dimensional map of the environment can be created. This map data can then be overlaid onto a CAD model of the building.
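One common way to accumulate laser scans into a map during such a guided tour is a sparse occupancy grid: each beam endpoint, transformed into world coordinates using the current pose estimate, increments a hit count for its grid cell. This is a generic sketch of that idea, not Adept's actual mapping algorithm, and the cell size is a hypothetical value.

```python
import math
from collections import defaultdict

def integrate_scan(grid, pose, scan, cell_size=0.1):
    """Mark laser beam endpoints in a sparse occupancy grid.

    grid      -- dict mapping (col, row) cell index to hit count
    pose      -- (x, y, theta) of the AGV in world coordinates
    scan      -- iterable of (beam_angle, range) pairs from the rangefinder
    cell_size -- grid resolution in meters (hypothetical)
    """
    x, y, theta = pose
    for angle, rng in scan:
        # Project the beam endpoint into the world frame.
        wx = x + rng * math.cos(theta + angle)
        wy = y + rng * math.sin(theta + angle)
        cell = (round(wx / cell_size), round(wy / cell_size))
        grid[cell] += 1
    return grid
```

A single beam straight ahead with a 1 m range, taken from the origin, lands in cell (10, 0) at the default 0.1 m resolution.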

In operation, the AGV again scans its surroundings, comparing the acquired data with the stored map data and with values from the wheel encoders and gyro to determine its position. Then, using an HMI developed by Adept, an operator can guide the robot from one location to another simply by drawing a line between the two locations. Using scanned and stored map data, the AGV then determines the shortest path between the two points while avoiding fixed walls and dynamic obstacles (such as human operators) that may be in its path.
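Once walls and sensed obstacles are marked in a grid, shortest-path planning between the two endpoints reduces to graph search. The sketch below uses breadth-first search on a 4-connected grid for brevity; a production planner would more likely use A* or a smoothed variant, and nothing here reflects Adept's proprietary planner.

```python
from collections import deque

def shortest_path(obstacles, start, goal, width, height):
    """Breadth-first search for a shortest 4-connected grid path.

    obstacles -- set of blocked (col, row) cells (walls, sensed objects)
    Returns the list of cells from start to goal, or None if unreachable.
    """
    frontier = deque([start])
    came_from = {start: None}   # also serves as the visited set
    while frontier:
        current = frontier.popleft()
        if current == goal:
            # Walk the parent links back to the start.
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        cx, cy = current
        for nxt in ((cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)):
            nx, ny = nxt
            if (0 <= nx < width and 0 <= ny < height
                    and nxt not in obstacles and nxt not in came_from):
                came_from[nxt] = current
                frontier.append(nxt)
    return None
```

Because dynamic obstacles such as people appear and disappear, the obstacle set would be rebuilt from fresh scan data and the path replanned as the AGV moves.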

To make the system more accurate, Conroy and his colleagues have also added an imaging system to the platform. Using a GigE camera from Basler (Ahrensburg, Germany; www.baslerweb.com) fitted with a wide-angle lens allows the AGV to capture overhead images of its environment. Extracted features can then be added to the map of the surrounding environment.
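Adding a camera-detected feature to the map requires transforming it from the robot's frame into world coordinates using the current pose estimate. The rigid-body transform itself is standard; the frame convention (x forward, y left, units in meters) is an assumption for illustration.

```python
import math

def feature_to_world(pose, feat_x, feat_y):
    """Transform a feature position measured in the robot frame
    (x forward, y left, meters -- an assumed convention) into world
    coordinates so it can be stored in the map as a landmark.
    """
    x, y, theta = pose
    # Standard 2-D rotation by the robot heading, then translation.
    wx = x + feat_x * math.cos(theta) - feat_y * math.sin(theta)
    wy = y + feat_x * math.sin(theta) + feat_y * math.cos(theta)
    return wx, wy
```

For a robot at (1, 2) facing +90 degrees, a feature 1 m ahead maps to world position (1, 3).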

Already, the Lynx system is finding uses in semiconductor applications. One such application demanded the use of an AGV and robot to autonomously pick, move and place a pod of semiconductor wafers from one location to another. To accomplish this task, a SCARA robot with a custom end-effector was mounted onto the platform.

After moving to a specific location close to the wafer station, a robot arm equipped with an ultrasonic sensor finds the edge of the pod. Once found, a camera from Cognex (Natick, MA, USA; www.cognex.com) mounted on the robot arm accurately determines the location of the pod.
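One plausible way such an ultrasonic edge search could work is to sweep the sensor across the station and look for the first large step change in range, which marks where the pod begins. This is a hypothetical sketch of that idea, not Adept's implementation; the threshold value is assumed.

```python
def find_edge(distances, jump_threshold=0.05):
    """Return the index of the first large step change in a sweep of
    ultrasonic range readings (meters), taken as the pod edge.

    jump_threshold -- minimum discontinuity treated as an edge (assumed)
    Returns None if no discontinuity is found.
    """
    for i in range(1, len(distances)):
        if abs(distances[i] - distances[i - 1]) > jump_threshold:
            return i
    return None
```

The coarse edge position from the sweep would then be refined by the arm-mounted camera, as the article describes.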

About the Author

Andy Wilson | Founding Editor

Founding editor of Vision Systems Design. Industry authority and author of thousands of technical articles on image processing, machine vision, and computer science.

B.Sc., Warwick University

Tel: 603-891-9115
Fax: 603-891-9297
