Handheld hyperspectral imaging camera measures crop health

Feb. 21, 2019

A researcher from Purdue University (West Lafayette, IN, USA; www.purdue.edu) developed a handheld hyperspectral imaging camera that will be used to measure crop health and provide valuable information to farmers and scientists.

Jian Jin, an assistant professor in Purdue’s Department of Agricultural and Biological Engineering, developed the device, which scans plant leaves for physiological features including moisture, nutrient, and chlorophyll levels, while also providing information on the effects of chemical spraying and disease symptoms. The device, according to Jin, enables necessary changes for growing more food with fewer resources, including a reduction in the amounts of fertilizer and water required.

Neal Carpenter (left), a postdoctoral research assistant in Purdue University’s Department of Agronomy, walks through fields of corn and sorghum preparing to use a handheld sensor developed at Purdue to measure the health of a plant. Matthew Fenton uses a smartphone to collect the data. (Purdue Research Foundation image/Oren Darling)

“LeafSpec” is a pushbroom hyperspectral imaging device with more than 200 bands in the visible and near-infrared (VNIR) range (400 to 1000 nm). The Purdue Office of Technology Commercialization has filed three provisional and non-provisional patent applications for the technology, so specific component details were not available. However, Jin explained that the device is based on an off-the-shelf industrial CCD camera, along with lenses, slits, a spectrograph, and other low-level optical parts.
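Purdue has not released LeafSpec’s processing code, but the pushbroom principle itself is simple to illustrate: each exposure captures one spatial line of the leaf with a full spectrum at every pixel, and sweeping the device along the leaf builds up a data cube. The short Python/NumPy sketch below shows that assembly step with hypothetical dimensions; the frame size and band count are placeholders, not the actual sensor’s specifications.

import numpy as np

def assemble_pushbroom_cube(line_frames):
    """Stack successive pushbroom line frames into a hyperspectral cube.

    Each frame is a 2-D array of shape (pixels_across_slit, bands);
    sweeping the slit along the leaf supplies the second spatial axis.
    Result shape: (scan_lines, pixels_across_slit, bands).
    """
    return np.stack(line_frames, axis=0)

# Hypothetical scan: 500 lines, 640 pixels across the slit,
# 200 bands spanning the 400-1000 nm VNIR range.
frames = [np.random.rand(640, 200) for _ in range(500)]
cube = assemble_pushbroom_cube(frames)
wavelengths = np.linspace(400, 1000, 200)  # nm, one value per band
print(cube.shape)  # (500, 640, 200)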

“The device is specially designed at Purdue to provide highest quality of crop leaves images with minimum damage to the leaves and maximized customer experience in both greenhouse and field,” says Jin. “Image processing algorithms are developed by our team at Purdue, and plant physiological feature prediction models were developed over years’ worth of greenhouse and field assays at Purdue as well.”

LeafSpec scans a plant in less than five seconds and detects hundreds of bands of color in each pixel. Additionally, Jin says the sensor is more precise than some current devices used by plant scientists, which clamp down on a leaf and measure the health of only a small portion of the plant.
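The article does not disclose which physiological prediction models LeafSpec applies to those per-pixel spectra. As a generic illustration of how a full spectrum at every pixel can be reduced to a whole-leaf health measure, the sketch below computes the standard normalized difference vegetation index (NDVI) for each pixel of a synthetic cube; the band choices and data are assumptions made for the example.

import numpy as np

def nearest_band(wavelengths, target_nm):
    """Index of the band closest to a target wavelength (nm)."""
    return int(np.argmin(np.abs(wavelengths - target_nm)))

def ndvi_map(cube, wavelengths):
    """Per-pixel NDVI = (NIR - red) / (NIR + red) over a
    (lines, pixels, bands) reflectance cube."""
    red = cube[..., nearest_band(wavelengths, 670.0)]
    nir = cube[..., nearest_band(wavelengths, 800.0)]
    return (nir - red) / (nir + red + 1e-9)

# Synthetic stand-in for a scanned leaf: 500 x 640 pixels, 200 bands.
wavelengths = np.linspace(400, 1000, 200)
cube = np.random.rand(500, 640, 200)
ndvi = ndvi_map(cube, wavelengths)
print(float(ndvi.mean()))  # whole-leaf average as one simple health score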

“Due to multiple technical reasons, the sensor’s prediction quality is more accurate than other types of crop imaging sensors that people have in the existing market,” Jin says. “It’s also constantly getting better because we scan plants every day and are upgrading both hardware and software technologies.”

While the sensor is self-contained, users have the option of uploading measurements with geo-locations to a web-based cloud map service developed by Carol Song and her team at Purdue’s Advanced Computing Group. The system generates plant stress and nutrition heat maps based on sensor measurements and provides interactive agricultural data querying functions at both farm and regional levels. This digital agricultural map system has the potential to support many applications, suggests Purdue.
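Details of Song’s cloud service are not public; the sketch below only illustrates the general idea of binning geo-tagged sensor readings into a regional grid and averaging them into a heat map. The coordinates, stress values, and grid resolution are all invented for the example.

import numpy as np

def stress_heatmap(lats, lons, values, grid_size=50):
    """Average geo-tagged stress scores within a regular lat/lon grid."""
    lat_edges = np.linspace(min(lats), max(lats), grid_size + 1)
    lon_edges = np.linspace(min(lons), max(lons), grid_size + 1)
    sums, _, _ = np.histogram2d(lats, lons, bins=[lat_edges, lon_edges],
                                weights=values)
    counts, _, _ = np.histogram2d(lats, lons, bins=[lat_edges, lon_edges])
    with np.errstate(invalid="ignore"):
        return sums / counts  # NaN marks cells with no measurements

# Hypothetical uploads scattered over a few kilometers of farmland.
rng = np.random.default_rng(0)
lats = 40.4 + 0.05 * rng.random(1000)
lons = -87.0 + 0.05 * rng.random(1000)
stress = rng.random(1000)
heat = stress_heatmap(lats, lons, stress)
print(heat.shape)  # (50, 50)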

“If we can successfully distribute the sensors around the region, we can generate this digital ag map service to monitor the plant growth all over the region — which areas are under stress and which areas are having a good performance,” Jinsays.

Going forward, Jin’s group is working on automating the device. In the winter of 2017, he and his graduate students built a robot that automatically scans leaves with the sensor in a greenhouse. The prototype robot system used machine vision technologies to recognize target leaves and move the camera to that location for a scan. Building on that success, Jin and his team aim to develop a robot for deployment in a farm field environment. This robot, according to the team, may look like a “spider transformer” and will travel between crop rows with each leg equipped with a sensor, waving and scanning leaves in the field at high speeds. During the 2019 growing season, Jin expects this robot to be functioning and in testing.
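The team has not described how the greenhouse robot locates leaves. One common machine vision approach, shown below purely as an illustration, is to segment green regions in an ordinary color frame and use their centroids as scan targets; the OpenCV color thresholds and minimum region size here are arbitrary choices for the example.

import cv2
import numpy as np

def find_leaf_centers(bgr_frame, min_area=500):
    """Return (x, y) centroids of green regions large enough to be leaves."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (35, 40, 40), (85, 255, 255))  # broad green range
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        if cv2.contourArea(c) >= min_area:
            m = cv2.moments(c)
            centers.append((int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])))
    return centers

# Hypothetical usage on one camera frame containing a fake green "leaf".
frame = np.zeros((480, 640, 3), dtype=np.uint8)
frame[100:300, 200:400] = (40, 180, 60)
print(find_leaf_centers(frame))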

Beyond this, Jin is seeking collaborators to commercialize the device. He believes the best approach is to keep the hardware low-cost, with most of the value coming from the data it produces.
