Wireless network eases support for robotic systems
More than three years ago, Braintech (North Vancouver, BC, Canada) began supplying neural-network software and hardware for machine-vision applications, offering PC-based systems for extracting meaning and recognizing objects in a stream of video data. The company's Odysee development system consisted of a proprietary PC-based frame-grabber board and custom software for both image preprocessing and object classification (see Vision Systems Design, Jan. 1998, p. 38).
While this philosophy proved successful, the company decided to focus on vision-guided robotic solutions and concentrate on automotive and food-handling applications. Premiered at the recent International Robots & Vision Show (Chicago, IL; June 2001), Braintech's latest offering, the eVisionFactory, takes advantage of readily available cameras, frame grabbers, image-processing software, robots, and wireless information services. According to David Wright, a scientist and engineer at Braintech, the eVisionFactory lets system developers integrate vision with robots and robot controllers from ABB Flexible Automation (New Berlin, WI).
How it works
In operation, a TM-200NIR CCD camera from Pulnix America (Sunnyvale, CA) is positioned in the workspace or on the robotic arm of an IRB1400 robot. Monochrome RS-170 images captured from the camera are then digitized by a PC using a Meteor II frame grabber from Matrox Imaging (Dorval, Quebec, Canada). In addition to using its proprietary image-processing software, Braintech uses the Matrox MIL 7.0 software package with a series of C++-based controls developed using Microsoft's Visual C++. This software allows the company to take advantage of the functions within the MIL package and its previously developed image-recognition algorithms.
With this software in place, Braintech has developed a robotic-vision front end that systems developers can use to integrate machine vision into a robotic system. Once images of objects are captured using the software, three-dimensional positioning data are transmitted to an ABB S4C controller over a serial interface. After the positional data are established, the robot can be trained to move objects to other points within the workcell using ABB's teach pendant.
In addition to integrating machine-vision and robot functions, Braintech's eVisionFactory can notify systems operators of potential systems failures using wireless networking. To accomplish this, the PC-based system integrates a wireless PCMCIA LAN card from IBM Corp. (Yorktown Heights, NY) with the Java Runtime Environment Standard Edition from Sun Microsystems (Mountain View, CA), Microsoft's Internet Explorer 5.5, and Braintech's eVisionFactory Voice-Over-Internet-Protocol applet.
The system can automatically transmit status information to hand-held devices such as Palm Pilots located as far as 500 ft from the system. In this way, operators anywhere on the factory floor can be informed of any problem with their automated manufacturing systems.