Vision-guided robot automates vegetation analysis

Oct. 7, 2011
At the University of Illinois at Urbana-Champaign and the United States Department of Agriculture, engineers have developed a machine-vision-based system that uses adaptive image segmentation and neural networks to identify vegetation varieties, guiding a robot to distinguish weeds from desired plants in agricultural operations.

To reduce the amount of herbicide used in automated agricultural systems, it is important to correctly identify a multitude of plants and weeds. As a result, autonomous vision-guided robots developed for this purpose must robustly identify plants and weeds in unpredictable and often nonuniform lighting conditions.

At the University of Illinois at Urbana-Champaign (Urbana, IL, USA) and the United States Department of Agriculture (USDA; Wooster, OH, USA), Dr. Hongyoung Jeon and his colleagues (Drs. Lei Tian and Heping Zhu) have developed a machine-vision-based system that uses adaptive image segmentation and neural networks to identify vegetation varieties.

Stereo images are captured using a stereo camera from Videre Design (Menlo Park, CA, USA) mounted on a 3-AT skid-steering robot from Adept MobileRobots (Amherst, NH, USA). The camera is equipped with a C-mount lens with a 6-mm focal length from Kowa Optimed (Torrance, CA, USA) and is fitted with a polarizing filter from Sony (Tokyo, Japan) to reduce specular reflectance caused by outdoor illumination.

Positioned approximately 0.6 m above the ground and angled at 20°, the camera captures a trapezoidal ground area of approximately 768 × 572 mm at a resolution of approximately 2.4 mm/pixel. Captured images are then transferred over the stereo camera's FireWire interface to a host PC on the robot.
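
As a rough consistency check on those figures, the reported footprint and resolution imply a working image of roughly 320 × 240 pixels; the short Python sketch below shows the arithmetic. The pixel dimensions are an inference for illustration only, not a specification quoted by the researchers.

```python
# Back-of-envelope check of the reported ground resolution.
# ASSUMPTION: a 320 x 240 pixel working image, inferred only from the
# stated footprint (768 x 572 mm) and the ~2.4 mm/pixel figure.
footprint_mm = (768.0, 572.0)   # ground coverage of the imaged area
image_px = (320, 240)           # assumed pixel dimensions of the working image

res_x = footprint_mm[0] / image_px[0]   # mm per pixel, horizontal
res_y = footprint_mm[1] / image_px[1]   # mm per pixel, vertical
print(f"~{res_x:.1f} x {res_y:.1f} mm/pixel")   # ~2.4 x 2.4 mm/pixel
```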

After two sets of images are captured by the system at different stages of plant growth and under different illumination conditions, each image is processed using algorithms developed in MATLAB from The MathWorks (Natick, MA, USA).

Each RGB image is first converted to a normalized excessive green (NEG) channel, given by NEG = 2.8g/(r + g + b) − r/(r + g + b) − b/(r + g + b), where r, g, and b are the red, green, and blue values of each pixel, to emphasize the green channel. The NEG pixel values are then converted to integer values, and the variance of the histogram distribution of each image is used to segment the plants from the soil. To eliminate random noise in these images, a 3 × 3 median filter is applied to each segmented image.
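
A minimal sketch of this segmentation step in Python (using NumPy and SciPy) is shown below. The article does not spell out the exact adaptive, variance-based thresholding rule, so a standard Otsu-style threshold on the NEG histogram stands in for it here; the NEG formula and the 3 × 3 median filter follow the description above.

```python
import numpy as np
from scipy.ndimage import median_filter

def segment_plants(rgb):
    """Segment plant pixels from soil using a normalized excessive green (NEG) channel.

    rgb: H x W x 3 array of uint8 values.
    Returns a boolean mask that is True for plant (green) pixels.
    """
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    total = r + g + b
    total[total == 0] = 1.0                     # avoid division by zero on black pixels

    # NEG = 2.8*g/(r+g+b) - r/(r+g+b) - b/(r+g+b), per the formula above
    neg = (2.8 * g - r - b) / total

    # Scale to 8-bit integers so a histogram-based threshold can be applied
    neg_int = np.clip((neg - neg.min()) / (np.ptp(neg) + 1e-9) * 255, 0, 255).astype(np.uint8)

    # Variance-based (Otsu-style) threshold stands in for the adaptive
    # histogram-variance segmentation described in the article (assumption).
    hist = np.bincount(neg_int.ravel(), minlength=256).astype(np.float64)
    prob = hist / hist.sum()
    bins = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (prob[:t] * bins[:t]).sum() / w0
        m1 = (prob[t:] * bins[t:]).sum() / w1
        between_var = w0 * w1 * (m0 - m1) ** 2  # between-class variance
        if between_var > best_var:
            best_var, best_t = between_var, t

    mask = neg_int >= best_t

    # 3 x 3 median filter removes isolated noise pixels, as in the article
    return median_filter(mask.astype(np.uint8), size=3).astype(bool)
```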

To distinguish weeds from crop plants, Jeon and his colleagues used MATLAB's Neural Network Toolbox to develop an identification model. Before this neural network could be used, however, it was first trained to recognize a number of different species of corn and weed plants. A number of images were captured with both the machine-vision system and an SD-110 PowerShot camera from Canon (Lake Success, NY, USA) to train the neural network.
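
As a rough stand-in for the MATLAB toolbox workflow, the hedged Python sketch below trains a small feed-forward network with scikit-learn on the five normalized shape features described in the next paragraph. The hidden-layer size, feature values, and labels are illustrative assumptions, not values from the study.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Each row: [height/width, height/perimeter, perimeter/area, width/area, height/area]
# ASSUMPTION: illustrative feature vectors and labels; the real training set
# was built from corn and weed images captured by the researchers.
X_train = np.array([
    [1.8, 0.25, 0.12, 0.004, 0.007],   # corn-like plant
    [0.9, 0.10, 0.30, 0.020, 0.018],   # weed-like plant
    # ... more labelled samples in practice ...
])
y_train = np.array(["corn", "weed"])

# Small feed-forward network; one hidden layer of 10 units is an assumption.
model = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

new_plant = np.array([[1.6, 0.22, 0.14, 0.005, 0.008]])
print(model.predict(new_plant))        # e.g. ['corn']
```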

Before training, these images were pre-processed to measure specific morphological features of the plants within the images. After the perimeter, inner area, width, and height of each plant were measured, the features were converted to five normalized features (height/width, height/perimeter, perimeter/area, width/area, and height/area) to minimize the influence of each plant's image size. These normalized features were then used to train the neural network.
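
A minimal sketch of that feature computation, assuming a binary plant mask from the segmentation step, might look like the following; the perimeter is approximated here by counting boundary pixels, since the article does not specify how it was measured.

```python
import numpy as np
from scipy.ndimage import binary_erosion

def shape_features(mask):
    """Compute the five normalized shape features from a binary plant mask.

    mask: 2-D boolean array, True where the plant is (must contain at least one True pixel).
    Returns (height/width, height/perimeter, perimeter/area, width/area, height/area).
    """
    rows, cols = np.nonzero(mask)
    height = rows.max() - rows.min() + 1         # bounding-box height in pixels
    width = cols.max() - cols.min() + 1          # bounding-box width in pixels
    area = mask.sum()                            # inner area in pixels

    # Approximate perimeter as the number of boundary pixels (an assumption;
    # the article does not state how perimeter was measured).
    boundary = mask & ~binary_erosion(mask)
    perimeter = boundary.sum()

    return (height / width,
            height / perimeter,
            perimeter / area,
            width / area,
            height / area)
```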

After initial testing of the system, the neural network was shown to identify approximately 72% of the corn plants within the images. To improve this accuracy, two criteria were applied to the identification results of the neural network.

First, plants at the edges of the image, which exhibit incomplete morphological features, were excluded from the identification process. Second, a maximum weed size of 300 pixels was set to limit the size of detected weeds. With these improvements, the accuracy of the system increased to approximately 94%.
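
Applied to connected regions of the segmented mask, the two criteria might look like the sketch below; treating "weed size" as a connected-component pixel count is an assumption about how the 300-pixel limit was enforced.

```python
import numpy as np
from scipy.ndimage import label

def filter_detections(mask, predicted_species, max_weed_pixels=300):
    """Apply the two post-processing criteria to connected plant regions.

    mask: 2-D boolean segmentation mask.
    predicted_species: dict mapping region label -> "corn" or "weed"
        (e.g. output of the neural-network classifier).
    Returns the region labels that survive both criteria.
    """
    labels, n = label(mask)
    keep = []
    for i in range(1, n + 1):
        rows, cols = np.nonzero(labels == i)

        # Criterion 1: exclude regions touching the image border, whose
        # morphological features are incomplete.
        if (rows.min() == 0 or cols.min() == 0 or
                rows.max() == mask.shape[0] - 1 or
                cols.max() == mask.shape[1] - 1):
            continue

        # Criterion 2: cap detected weeds at 300 pixels (pixel count used
        # as the size measure here, an assumption).
        if predicted_species.get(i) == "weed" and rows.size > max_weed_pixels:
            continue

        keep.append(i)
    return keep
```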

-- By Andy Wilson, Vision Systems Design
