News Release

Novel ‘registration’ method identifies plant traits in close-up photos

Researchers introduce a new approach to correct illumination effects in close-up plant images

Peer-Reviewed Publication

Nanjing Agricultural University The Academy of Science

Fig. 1. Experimental setup of the imaging system.

(A) Schematic diagram of the raw image acquisition system for the hemisphere references. (B) Schematic diagram of the raw image acquisition system for the plant. (C) Simulation of the plant leaf and the hemisphere reference tangent to each other and the relevant 3D light field features. In this case, the plant and the reference at the tangent point share the same 3D light field features. Here, di denotes the distance from the point to the light source, dv denotes the distance from the point to the camera optical center, θi denotes the angle between the direction vector of the incident light and the normal vector of the surface where the point is located, θv denotes the angle between the observation direction vector and the normal vector of the surface where the point is located, and θ denotes the angle between the projection vector of the incident light direction vector and the projection vector of the observation direction vector on the surface where the point is located.

Credit: Plant Phenomics
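The geometric quantities defined in the figure caption can be computed directly from the positions of the light source, the camera, and a surface point with its normal. The sketch below is an illustrative numpy implementation of those definitions; the function name and the example coordinates are our own, not from the paper:

```python
import numpy as np

def light_field_features(point, normal, light, camera):
    """Compute the five light-field features from Fig. 1C for a surface
    point: d_i, d_v, theta_i, theta_v, and the azimuth angle theta."""
    n = normal / np.linalg.norm(normal)
    to_light = light - point
    to_cam = camera - point
    d_i = np.linalg.norm(to_light)              # distance to light source
    d_v = np.linalg.norm(to_cam)                # distance to camera center
    li, vi = to_light / d_i, to_cam / d_v
    theta_i = np.arccos(np.clip(li @ n, -1, 1))  # incidence angle
    theta_v = np.arccos(np.clip(vi @ n, -1, 1))  # viewing angle
    # Project both direction vectors onto the tangent plane of the surface
    # to get the angle between their projections (theta in the caption).
    li_t = li - (li @ n) * n
    vi_t = vi - (vi @ n) * n
    cos_t = (li_t @ vi_t) / (np.linalg.norm(li_t) * np.linalg.norm(vi_t) + 1e-12)
    theta = np.arccos(np.clip(cos_t, -1, 1))
    return d_i, d_v, theta_i, theta_v, theta

# Example: leaf point at the origin with an upward normal, light and
# camera placed 45 degrees off to opposite sides.
feats = light_field_features(np.array([0., 0., 0.]), np.array([0., 0., 1.]),
                             np.array([1., 0., 1.]), np.array([-1., 0., 1.]))
print(" ".join(f"{f:.3f}" for f in feats))  # 1.414 1.414 0.785 0.785 3.142
```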

Modern cameras and sensors, together with image processing algorithms and artificial intelligence (AI), are ushering in a new era of precision agriculture and plant breeding. In the near future, farmers and scientists may be able to quantify various plant traits simply by pointing specialized imaging devices at plants. However, some obstacles must be overcome before this vision becomes a reality. A major challenge in image sensing is combining data on the same plant gathered from multiple image sensors, a practice known as ‘multispectral’ or ‘multimodal’ imaging. Different sensors are optimized for different wavelength ranges, each providing complementary information about the plant. Unfortunately, the process of aligning and combining plant images acquired with multiple sensors, called ‘registration,’ can be notoriously complex.

Registration is even more complex for three-dimensional (3D) multispectral images of plants captured at close range. To properly align close-up images taken with different cameras, computational algorithms must effectively address geometric distortions. Moreover, registration algorithms for close-range images are more susceptible to errors caused by uneven illumination, a situation commonly encountered with leaf shadows and with light reflection and scattering in dense canopies.

Against this backdrop, a research team including Professor Haiyan Cen from Zhejiang University, China, recently proposed a new approach for generating high-quality point clouds of plants by fusing depth images with snapshot spectral images. As explained in their paper, published in Volume 5 of Plant Phenomics on 3 April 2023, the researchers employed a three-step image registration process combined with a novel AI-based technique to correct for illumination effects. Prof. Cen explains, “Our study shows that it is promising to use stereo references to correct plant spectra and generate high-precision, 3D, multispectral point clouds of plants.”

The experimental setup consisted of a lifting platform holding a rotating stage at a preset distance from two cameras mounted on a tripod: an RGB (red, green, and blue)-depth camera and a snapshot multispectral camera. In each experiment, the researchers placed a plant on the stage, rotated it, and photographed it from 15 different angles. They also took images of a flat surface holding Teflon hemispheres at various positions; these hemisphere images served as reference data for a reflectance correction method, which the team implemented using an artificial neural network.
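In spirit, the hemisphere-based correction learns a mapping from local light-field geometry to an illumination factor, then divides plant measurements by the predicted factor. The sketch below illustrates that idea on synthetic data, with an ordinary least-squares fit standing in for the authors' neural network; the feature ranges, the Teflon reflectance value, and the toy illumination model are all illustrative assumptions, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "hemisphere reference" samples: each row holds the five
# light-field features from Fig. 1C — distances d_i and d_v, and the
# cosines of the angles theta_i, theta_v, theta.
n = 500
features = np.column_stack([
    rng.uniform(0.5, 2.0, n),    # d_i, distance to light (m)
    rng.uniform(0.5, 2.0, n),    # d_v, distance to camera (m)
    rng.uniform(0.2, 1.0, n),    # cos(theta_i)
    rng.uniform(0.2, 1.0, n),    # cos(theta_v)
    rng.uniform(-1.0, 1.0, n),   # cos(theta)
])

# Toy model: a near-Lambertian reference of known reflectance (~0.95 for
# Teflon) measured under an illumination factor that varies with geometry.
true_reflectance = 0.95
illum = 0.6 * features[:, 2] / features[:, 0] ** 2 + 0.1
measured = true_reflectance * illum + rng.normal(0, 0.005, n)

# Fit a linear model (stand-in for the neural network) predicting the
# illumination factor from the geometric features.
X = np.column_stack([features, features[:, 2] / features[:, 0] ** 2, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, measured / true_reflectance, rcond=None)

# Correct a plant pixel (true reflectance 0.40 under the same toy
# illumination) by dividing its measurement by the predicted factor.
plant_feats = np.array([1.0, 1.2, 0.8, 0.7, 0.3])
plant_measured = 0.40 * (0.6 * plant_feats[2] / plant_feats[0] ** 2 + 0.1)
x = np.concatenate([plant_feats, [plant_feats[2] / plant_feats[0] ** 2], [1.0]])
corrected = plant_measured / (x @ coef)
print(f"{corrected:.2f}")  # 0.40
```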

For registration, the team first used image processing to extract the plant structure from the images, remove noise, and balance brightness. Then, they performed coarse registration using Speeded-Up Robust Features (SURF), a method that identifies salient image features largely invariant to changes in scale, illumination, and rotation. Finally, the researchers performed fine registration using a method known as ‘Demons,’ which finds mathematical operators that optimally ‘deform’ one image to match another.
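The coarse-alignment idea can be illustrated without SURF (which ships in opencv-contrib) or a Demons implementation (available in, e.g., SimpleITK). The self-contained numpy sketch below uses FFT phase correlation, a simpler stand-in technique, to recover a rigid translation between two views of the same scene:

```python
import numpy as np

def phase_correlation_shift(fixed, moving):
    """Estimate the integer (row, col) shift that, applied to `moving`
    with np.roll, best aligns it to `fixed` (phase correlation)."""
    cross = np.fft.fft2(fixed) * np.conj(np.fft.fft2(moving))
    cross /= np.abs(cross) + 1e-12          # keep only the phase
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peaks in the upper half of each axis to negative shifts.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, fixed.shape))

# Demo: recover a known translation of a random test image.
rng = np.random.default_rng(1)
fixed = rng.random((64, 64))
moving = np.roll(fixed, (5, -3), axis=(0, 1))
dy, dx = phase_correlation_shift(fixed, moving)
print(dy, dx)  # -5 3
assert np.allclose(np.roll(moving, (dy, dx), axis=(0, 1)), fixed)
```

Phase correlation handles only rigid translation, which is why a deformable refinement step such as Demons is needed for leaves that bend or occlude differently between views.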

These experiments showed that the proposed registration method significantly outperformed conventional approaches. Moreover, the proposed reflectance correction technique produced remarkable results, as Prof. Cen highlights: “We recommended using our correction method for plants in growth stages with low canopy structural complexity and flattened and broad leaves.” The study also highlighted a few potential areas of improvement to make the proposed approach even more powerful.

Satisfied with the results, Prof. Cen concludes: “Overall, our method can be used to obtain accurate, 3D, multispectral point cloud model of plants in a controlled environment. The models can be generated successively without varying the illumination condition.” In the future, techniques such as this one will help scientists, farmers, and plant breeders easily integrate data from different cameras into one consistent format. This could not only help them visualize important plant traits, but also feed these data to emerging AI-based software to simplify or even fully automate analyses.

We are excited to see the impact of 3D multispectral imaging of plants on precision agriculture and phenotyping!

###

Reference

Authors

Pengyao Xie1,2, Ruiming Du1,2, Zhihong Ma1,2, and Haiyan Cen1,2

Affiliations

1College of Biosystems Engineering and Food Science, Zhejiang University

2Key Laboratory of Spectroscopy Sensing, Ministry of Agriculture and Rural Affairs


Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.