Affordable board-level device for crop temperature observation meshes thermal infrared and RGB cameras

Aug. 28, 2019
The overall cost of the unit was around $400, significantly cheaper than “out of the box” thermal infrared cameras.

Researchers with the University of Missouri and USDA’s Agricultural Research Service have designed an imaging device that can circumvent the traditional high costs and logistical challenges of using thermal infrared imagery to monitor crop health.

A plant that is getting enough water transpires through microscopic pores called stomata, which cools its leaves. If a plant is not getting enough water, the reduced transpiration raises the temperature of its leaves. Gauging the temperature of a crop field can therefore indicate whether the plants are getting enough water.
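One standard way this temperature-to-stress reasoning is quantified (not described in the article, but common in the field) is the Crop Water Stress Index, which places the measured canopy temperature between two references: a fully transpiring "wet" canopy and a non-transpiring "dry" one. A minimal sketch:

```python
# Illustrative sketch, not the researchers' method: the Crop Water Stress
# Index (CWSI) normalizes canopy temperature between wet and dry reference
# temperatures. 0 means no stress, 1 means fully stressed.

def cwsi(t_canopy: float, t_wet: float, t_dry: float) -> float:
    """Return CWSI clamped to [0, 1]."""
    if t_dry <= t_wet:
        raise ValueError("t_dry must exceed t_wet")
    index = (t_canopy - t_wet) / (t_dry - t_wet)
    return min(max(index, 0.0), 1.0)

# A canopy at 27 C between references of 24 C (wet) and 34 C (dry):
print(cwsi(27.0, 24.0, 34.0))  # 0.3 -> mild stress
```

The reference temperatures here are placeholder values; in practice they come from reference surfaces or empirical baselines for the crop.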

Thermal infrared cameras are the chief method used to measure plant temperatures. If the images from these cameras are combined with RGB images, regions of interest (ROIs) can be set and the temperature information of the surrounding soil can be ignored. Thermal infrared cameras are expensive, however, which can prohibit their use by owners of smaller farms.

The researchers behind the study titled “Development of a multi-band sensor for crop temperature measurement” fabricated a more affordable thermal/RGB combined system, named the Multi-band System for Imaging of a Crop Canopy (MSICC).

The system uses a Teensy 3.2 USB development board, built around a 72 MHz ARM Cortex-M4 processor, as its CPU; a Lepton LWIR camera from FLIR Systems with an 80 × 60 pixel array, sensitivity to 8–14 µm wavelengths, and a 51° horizontal-field-of-view lens; and an ArduCAM Mini with a 3.6 mm × 2.7 mm, 2 MPixel OV2640 image sensor from OmniVision Technologies and a 3.6 mm lens with a 52° field of view.

A printed circuit board (PCB) was designed as an interface for the components.

The MSICC was tested at the University of Missouri’s South Farm Research Center on two soybean crop fields, one with a full canopy and one with soil visible in the image, with the MSICC suspended 0.9 m above the crop canopy. Images and temperature analysis from the MSICC were compared with data from an infrared thermometer (IRT) observing the same crop fields.
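The mounting height and lens specs above imply the ground area each thermal pixel covers. A back-of-envelope sketch, assuming a nadir-pointing camera over a flat canopy and using the article's numbers (51° horizontal FOV, 80 horizontal pixels, 0.9 m height):

```python
import math

# Simple pinhole-camera geometry (an assumption, not from the article):
# the ground swath covered by a lens of a given horizontal FOV at a given
# height, and the resulting per-pixel footprint on the canopy.

def ground_swath(height_m: float, fov_deg: float) -> float:
    """Width of the imaged strip on the canopy, in meters."""
    return 2.0 * height_m * math.tan(math.radians(fov_deg) / 2.0)

swath = ground_swath(0.9, 51.0)   # ~0.86 m across
pixel = swath / 80                # 80 horizontal thermal pixels
print(f"swath = {swath:.2f} m, pixel = {pixel * 100:.1f} cm")
```

At 0.9 m, each Lepton pixel covers roughly a centimeter of canopy, which suggests why individual leaves and soil gaps are resolvable at this height.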

Images from the LWIR and RGB cameras were aligned and subjected to a segmentation algorithm that separated plant and soil pixels in the images based on temperature differences. Compared with the IRT measurements, the LWIR data was accurate to within 0.65 °C.
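The article does not specify the segmentation algorithm, but a minimal sketch of one way a temperature-based plant/soil split could work is an iterative two-class threshold: sunlit soil is typically warmer than a transpiring canopy, so each LWIR pixel can be labeled by which side of the threshold it falls on.

```python
import numpy as np

# Hedged sketch, not the researchers' algorithm: find a threshold between
# the cool (plant) and warm (soil) temperature modes by repeatedly
# averaging the two class means (an ISODATA-style iteration).

def split_plant_soil(temps: np.ndarray, iters: int = 20) -> np.ndarray:
    """Return a boolean mask: True = plant (cooler), False = soil (warmer)."""
    t = float(temps.mean())
    for _ in range(iters):
        cool, warm = temps[temps < t], temps[temps >= t]
        if cool.size == 0 or warm.size == 0:
            break  # one class vanished; keep the last threshold
        t = 0.5 * (float(cool.mean()) + float(warm.mean()))
    return temps < t

# Toy 80 x 60 "frame": canopy near 26 C, a soil strip near 38 C.
frame = np.full((60, 80), 26.0)
frame[:, 60:] = 38.0
mask = split_plant_soil(frame)
print(mask.mean())  # fraction of pixels labeled plant -> 0.75
```

A real frame would have overlapping temperature distributions rather than two clean modes, which is where the shaded-plant confusion discussed below comes from.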

The algorithm was most effective when soil was clearly visible within the cameras’ field of view. When it was not, the algorithm instead tended to separate shaded parts of the plants from those exposed to direct sunlight. The researchers determined that a color-based algorithm and an IR filter could improve the system and prevent it from confusing shaded areas of plants with soil.

Related stories:

Handheld hyperspectral imaging camera measures crop health

Spectral imaging system enables digital plant phenotyping

Vision-guided drone supports flood and coastal protection efforts


About the Author

Dennis Scimeca

Dennis Scimeca is a veteran technology journalist with expertise in interactive entertainment and virtual reality. At Vision Systems Design, Dennis covered machine vision and image processing with an eye toward leading-edge technologies and practical applications for making a better world. Currently, he is the senior editor for technology at IndustryWeek, a partner publication to Vision Systems Design. 
