Autonomous mobile robots target logistics applications

May 1, 2018
Incorporating numerous vision and robotics systems, autonomous mobile robots are replacing automated guided vehicles in logistics applications.

Andrew Wilson, European Editor

Driven by the successes of online retailers such as Amazon, today’s warehouse operators need to ensure that the systems they use to expedite order fulfilment are highly efficient. No longer is it sufficient to employ multiple workers with the sole aim of tracking, finding and shipping individual orders. Indeed, this has been recognized for some years in both warehouse and automated manufacturing facilities, where the movement of product has been augmented by the use of automated guided vehicles (AGVs).

Figure 1: RK Logistics has partnered with Fetch Robotics to increase employee productivity by automating the task of transporting products from one part of RK Logistics’ Livermore-based warehouse to another using Fetch’s Freight autonomous mobile robot. To complement the Freight mobile base, the company also offers Fetch, a mobile manipulator built on the Freight base to perform parts picking and placement.

In their simplest form, such AGVs are designed with inductive sensors that are used to track and follow metal strips embedded in the floor of the warehouse or manufacturing facility. By following these strips, AGVs can then automatically move from one location to another, often carrying heavy items.

While effective, such AGV systems are limited to fixed routes within each facility. Should an obstacle or person block the route, the AGV must stop until the path is cleared. Implementing such systems also requires that the infrastructure of the facility be clearly defined and that tracking strips be strategically placed throughout. Finally, should any amendment to an AGV’s path be required, time-consuming and often expensive relocation of the tracking strips is needed.

AMRs vs. AGVs

Realizing this, many facilities managers are turning to autonomous mobile robots (AMRs) to overcome these limitations. Unlike AGVs, AMRs incorporate more sophisticated on-board computers that are coupled to inertial measurement units (IMUs), laser scanning range finders, 2D and 3D color cameras and motor controllers.

Although initially more expensive than AGVs, such AMRs do not require tracking strips to be placed in a factory or warehouse. Better yet, should the need arise, they can be reprogrammed to follow different paths relatively easily. As autonomous robots, they are also capable of learning and mapping a facility, and the resulting map can then be used by other AMRs. Finally, should any obstacle or person block the route of the AMR, it can automatically find and traverse the next most expeditious path to reach its final destination.

Recognizing the advantages of this approach, RK Logistics Group (Fremont, CA, USA; www.rklogisticsgroup.com) has partnered with Fetch Robotics (San Jose, CA, USA; www.fetchrobotics.com) to increase employee productivity by automating the task of transporting products from one part of RK Logistics’ Livermore-based warehouse to another (Figure 1).

Figure 2: Internally, Fetch and Freight contain a number of circuit boards and communication buses that handle power distribution and motion control, and that integrate with the system’s laser scanner and 3D camera.

“The 4,500 deliveries made by Fetch Robotics AMRs in the past six months required a total travel distance of 1,000km that would otherwise have been performed manually,” says Landon Spring, Senior Director of Business Development and Marketing for RK Logistics Group. “These AMRs worked alongside RK Logistics Group’s employees twenty-four hours a day, seven days a week over three shifts.”

In the deployment, RK Logistics Group is using Fetch Robotics’ VirtualConveyor, which combines the Freight series of AMRs and charge docks with FetchCore software. This allows charging stations, preferred routes, speed maps and restricted zones to be added to a 3D digital map of the warehouse created by Fetch’s Freight AMRs.

To complement the Freight mobile base, the company also offers Fetch, a mobile manipulator built on the Freight base to perform parts picking and placement. Although not currently used by RK Logistics Group, the Fetch mobile manipulator shares many design similarities with the Freight mobile base.

Scanners and cameras

Internally, Fetch and Freight contain a number of circuit boards and communication buses that handle power distribution and motion control, and that integrate with the system’s laser scanner and 3D camera (Figure 2).

Figure 3: LiDAR data is combined with data from the IMU to generate a 3D map of the scanned area. To allow visualization above and below the LiDAR’s field of view, the Freight robot also employs a Carmine 1.09 short-range 3D camera sensor.

“Originally, the company planned to deploy Freight robots only in conjunction with Fetch robots. However, numerous applications required only Freight robots. There were also numerous environments in which a 2D laser scanner alone was insufficient, so Freight robots now include a 3D camera mounted in the base of the robot that is used for added obstacle avoidance,” says Melonee Wise, CEO of Fetch Robotics.

Because of this, the Freight robot now incorporates a central Intel-based computer running the (rather misnamed) Robot Operating System (ROS; www.ros.org), originally developed by Willow Garage (Palo Alto, CA, USA; www.willowgarage.com) and Stanford University (Stanford, CA, USA; www.stanford.edu).

As a middleware software framework, ROS itself is not an operating system and runs under the open-source Ubuntu (www.ubuntu.com) operating system on the Freight and Fetch robots. Using this middleware, Freight’s on-board computer sends low-level device commands to motor and motor controller boards (MCBs) over half-duplex RS-485 buses that in turn are interfaced to motors and motor encoders that control the motion of the robot. These motor encoders also convert the data from the motors to a digital value that is used by the on-board computer to determine how far the robot has travelled.
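
As an illustration of how this ROS-based architecture is typically used, the minimal sketch below publishes velocity commands to the mobile base and reads back the encoder-derived odometry that the driver publishes. The topic names (/cmd_vel, /odom) follow common ROS conventions and are assumptions here, not documented Fetch interfaces.

```python
#!/usr/bin/env python
# Minimal ROS node: drive the base and read back encoder-derived odometry.
# Topic names (/cmd_vel, /odom) are common ROS conventions, assumed here
# rather than taken from Fetch documentation.
import rospy
from geometry_msgs.msg import Twist
from nav_msgs.msg import Odometry

def on_odom(msg):
    # The base driver integrates wheel-encoder ticks into this pose estimate.
    p = msg.pose.pose.position
    rospy.loginfo("base at x=%.2f m, y=%.2f m", p.x, p.y)

rospy.init_node("base_demo")
rospy.Subscriber("/odom", Odometry, on_odom)
cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=10)

rate = rospy.Rate(10)        # publish commands at 10 Hz
cmd = Twist()
cmd.linear.x = 0.2           # creep forward at 0.2 m/s
while not rospy.is_shutdown():
    cmd_pub.publish(cmd)
    rate.sleep()
```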

This same on-board computer interfaces to a 6-axis inertial measurement unit (IMU) with a gyroscope capable of measuring angular rotational velocity of up to ±2,000°/s and an accelerometer that can measure linear acceleration of up to ±2g. In this way, the CPU can track the rotation and orientation of the robot and the acceleration of its movement.
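
The sketch below illustrates the kind of planar dead-reckoning update this sensor combination makes possible, fusing the encoder-derived travel distance with the gyroscope’s yaw rate. It is a simplified illustration of the technique, not Fetch’s actual driver code.

```python
import math

def dead_reckon(x, y, heading, d_left, d_right, gyro_yaw_rate, dt):
    """One planar dead-reckoning step.

    d_left, d_right: distance (m) each wheel travelled this step (from encoders).
    gyro_yaw_rate:   angular velocity (rad/s) about the vertical axis (from the IMU).
    """
    forward = (d_left + d_right) / 2.0    # translation of the base centre
    heading += gyro_yaw_rate * dt         # gyro handles rotation; wheels can slip
    x += forward * math.cos(heading)
    y += forward * math.sin(heading)
    return x, y, heading
```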

In addition to these tasks, the main CPU board also controls electronic circuit breakers (ECBs) and charging circuitry for the two 12V Sealed Lead Acid (SLA) batteries located in the base of the robot.

Figure 4: Once a map of a warehouse has been generated, an operator can use the GUI of the FetchCore software to limit the areas where the robot is allowed to operate, position charging stations and define the areas in which the robots are allowed to move. Here, the exits of an area have been blocked (orange) so that the robot cannot travel there.

For creating a map of the areas surrounding the robot, Freight employs a TiM571 LiDAR scanning range finder from SICK (Minneapolis, MN, USA; www.sick.com) interfaced to the on-board computer through an Ethernet switch. Using time-of-flight measurement, the TiM571 emits 850nm laser pulses that are swept over a 220° field of view (FOV) using a moving mirror. When a pulse is reflected from an object, it returns to the laser scanner’s receiver. By measuring the time between sending the pulse and receiving its reflection, together with the returned signal strength, the scanner detects the positions of objects as far away as 25m with millimeter accuracy.
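
The arithmetic behind the time-of-flight measurement is straightforward: the one-way distance is half the round-trip time multiplied by the speed of light, as in the short sketch below (the example timing value is illustrative).

```python
# Time-of-flight ranging: the scanner measures the round-trip time of the
# 850nm pulse, so the one-way distance is c * dt / 2.
C = 299792458.0                    # speed of light, m/s

def tof_distance(round_trip_s):
    return C * round_trip_s / 2.0

# A pulse returning after roughly 167 ns corresponds to an object about 25 m away.
print(tof_distance(167e-9))        # ~25.0 m
```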

These polar coordinates, the distance and angle of each return from the LiDAR, are transferred to the host CPU over the Ethernet interface. This information is then combined with data from the IMU to generate a 2D map of the scanned area. To allow visualization above and below the LiDAR’s field of view (Figure 3), the Freight robot also employs a Carmine 1.09 short-range 3D camera sensor from PrimeSense, now a subsidiary of Apple Inc. (Cupertino, CA, USA; www.apple.com). 3D depth image data and color RGB image data captured by the Carmine are then transferred to Freight’s on-board computer over a USB 2.0 interface.
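
In ROS, such a scan is delivered as a sensor_msgs/LaserScan message containing one range reading per bearing. The sketch below shows how those polar returns can be converted into Cartesian points in the robot frame; the topic name /base_scan is an assumption rather than a documented Fetch interface.

```python
#!/usr/bin/env python
# Convert the scanner's polar returns (range, bearing) into Cartesian points
# in the robot frame. The topic name /base_scan is an assumption.
import math
import rospy
from sensor_msgs.msg import LaserScan

def on_scan(scan):
    points = []
    angle = scan.angle_min
    for r in scan.ranges:
        if scan.range_min < r < scan.range_max:     # drop invalid returns
            points.append((r * math.cos(angle), r * math.sin(angle)))
        angle += scan.angle_increment
    rospy.loginfo("%d valid returns in this sweep", len(points))

rospy.init_node("scan_to_points")
rospy.Subscriber("/base_scan", LaserScan, on_scan)
rospy.spin()
```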

Making maps

For Freight to operate in a factory environment, a map of the surroundings must first be made using data from the wheel encoders, IMU, and LiDAR. For RK Logistics Group’s Livermore-based facility, Fetch Robotics provides its FetchCore software to map warehouses and set up and deploy the robots. Using the ROS Navigation stack, the Freight robot can analyze data from the on-board laser and 3D camera and output commands to move the robot safely around the factory floor while avoiding collisions. According to Fetch Robotics, generating this map for RK Logistics Group’s Livermore-based facility took approximately one hour.
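
With the map in place, goals are typically sent to the ROS Navigation stack through the standard move_base action interface, as in the sketch below. The frame name and coordinates are illustrative values, not waypoints from the RK Logistics deployment.

```python
#!/usr/bin/env python
# Send a navigation goal through the standard move_base action interface.
# The frame and coordinates below are illustrative values only.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node("send_goal")
client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = "map"
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 4.0       # metres in the map frame
goal.target_pose.pose.position.y = 1.5
goal.target_pose.pose.orientation.w = 1.0    # face along the map x-axis
client.send_goal(goal)
client.wait_for_result()
rospy.loginfo("navigation finished with state %d", client.get_state())
```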

Once this map has been generated, an operator can use the web-based graphical user interface (GUI) of the FetchCore software to limit the areas where the robot is allowed to operate, position charging stations and define the areas in which the robots are allowed to move (Figure 4).
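
One common way such restricted zones are represented in ROS-based systems generally is to mark the corresponding cells of the navigation costmap as lethal so the planner never routes through them. The sketch below illustrates that idea only; it is not a description of FetchCore’s internal implementation.

```python
# Illustration only: represent a keep-out zone by marking costmap cells as
# lethal so the planner never routes through them (the value 254 follows the
# ROS costmap_2d convention). FetchCore's internals may differ.
import numpy as np

LETHAL = 254
RESOLUTION = 0.05                  # metres per cell

costmap = np.zeros((400, 400), dtype=np.uint8)   # a 20 m x 20 m grid

def block_rectangle(grid, x0, y0, x1, y1):
    """Mark a rectangular keep-out zone, with corners given in metres."""
    c0, r0 = int(x0 / RESOLUTION), int(y0 / RESOLUTION)
    c1, r1 = int(x1 / RESOLUTION), int(y1 / RESOLUTION)
    grid[r0:r1, c0:c1] = LETHAL

block_rectangle(costmap, 6.0, 2.0, 9.5, 4.0)     # e.g. block an exit corridor
```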

“Better still”, says Melonee Wise, CEO of Fetch Robotics, “operators can define specific lanes or preferred routes that the robots are allowed to travel. After experiencing a sprinkler malfunction that filled part of a factory with water, one customer graphically reconfigured the paths the robots were allowed to traverse in a matter of moments, thus avoiding any potential damage that could have occurred.”

In the future, RK Logistics Group plans to enhance the current capability of its AMRs by integrating the warehouse management system (WMS) from SAP (Walldorf, Germany; www.sap.com) that it currently uses with FetchCore. “Using a software development kit from SAP, it will be possible for RK Logistics Group to move data from the SAP WMS to the AMRs, so that the system can determine which AMR can most effectively perform each task,” says Rock Magnan, President of RK Logistics Group.

Workers can then be shown an image of the product on the screen of the AMR to simplify the product picking task while at the same time reducing any potential errors. Once picked, the SAP WMS can then be automatically updated.
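
The dispatch decision described above can be reduced to a simple selection problem: among the idle robots, pick the one with the shortest estimated travel to the pick location. The sketch below outlines that logic conceptually; it does not use the SAP SDK or the FetchCore API.

```python
# Conceptual sketch of the dispatch decision: among the idle robots, choose
# the one with the shortest straight-line distance to the pick location.
# This is neither the SAP SDK nor the FetchCore API.
import math

def pick_robot(robots, pick_x, pick_y):
    """robots: list of dicts such as {"name": "freight_1", "x": 3.1, "y": 7.4, "idle": True}."""
    idle = [r for r in robots if r["idle"]]
    if not idle:
        return None
    return min(idle, key=lambda r: math.hypot(r["x"] - pick_x, r["y"] - pick_y))

fleet = [{"name": "freight_1", "x": 0.0, "y": 0.0, "idle": True},
         {"name": "freight_2", "x": 12.0, "y": 3.0, "idle": False}]
print(pick_robot(fleet, 5.0, 5.0)["name"])       # -> freight_1
```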

Companies mentioned

Apple Inc.
Cupertino, CA, USA
www.apple.com

Fetch Robotics
San Jose, CA, USA
www.fetchrobotics.com

RK Logistics Group
Fremont, CA, USA
www.rklogisticsgroup.com

ROS
www.ros.org

SAP
Walldorf, Germany
www.sap.com

SICK
Minneapolis, MN, USA
www.sick.com

Stanford University
Stanford, CA, USA
www.stanford.edu

Willow Garage
Palo Alto, CA, USA
www.willowgarage.com
