News Release

Low-cost depth imaging sensors achieve 97% accuracy in rapid plant disease detection

Peer-Reviewed Publication

Nanjing Agricultural University

Fig. 7: Distributions of features for batches of resistant and susceptible plants. Credit: The authors

A research team investigated low-cost depth imaging sensors with the objective of automating plant pathology tests. The team achieved 97% accuracy in distinguishing between resistant and susceptible plants based on cotyledon loss. This method operates 30 times faster than human annotation and is robust across various environments and plant densities. The innovative imaging system, feature extraction method, and classification model provide a cost-efficient, high-throughput solution, with potential applications in decision-support tools and standalone technologies for real-time edge computation.

Selective plant breeding, which originated with the domestication of wild plants approximately 10,000 years ago, has evolved to address the challenges posed by climate change. Current breeding efforts focus on enhancing plant resilience to biotic and abiotic stresses, promoting earlier germination, and improving nutritional and environmental value. However, the lengthy process of developing new varieties, which often takes up to 10 years, remains a significant hurdle.

A study (DOI: 10.34133/plantphenomics.0204) published in Plant Phenomics on 6 June 2024 investigates the effectiveness of Phenogrid, a phenotyping system designed for early-stage plant monitoring under biotic stress, in assessing plant resistance to pathogens.

In this study, the extraction of spatio-temporal features, including absolute amplitude (Aabs), relative amplitude (Arel), and drop duration (D), proved to be an effective method for differentiating between susceptible and resistant plant batches. The onset (O) feature was uniform in susceptible plants, whereas resistant plants exhibited a consistent three-day onset that correlated with cotyledon loss. Height signals were less informative, while surface and volume signals showed pronounced contrasts between susceptible and resistant plants. Statistical tests confirmed that most of the extracted features were significant for detecting cotyledon loss; nevertheless, efficient batch classification required a classifier combining these features.

The Random Forest model achieved the highest classification accuracy, 97%, with strong supporting metrics (MCC: +91%). The method was resilient to variability in inoculation timing, maintaining performance with up to two hours of desynchronization. Furthermore, simulations indicated that reducing the number of plants per batch from 20 to 10 preserved classification performance while doubling throughput. A visual analysis revealed that direct watering affected classification accuracy, suggesting that automated or subirrigation methods could further improve performance. The method's efficacy also extends to the segregation of other pathosystems, demonstrating robust generalizability and potential for high-throughput plant pathology diagnostics.
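
To make the pipeline concrete, the sketch below shows how batch-level features of this kind could feed a Random Forest classifier in Python with scikit-learn. The feature layout (Aabs, Arel, D, O per batch) follows the description above, but the values are random placeholders and the code is an illustrative assumption, not the authors' released implementation.

    # Minimal sketch of batch classification from spatio-temporal features.
    # The feature layout (Aabs, Arel, D, O per batch) follows the press
    # release; the values are random placeholders, not the study's data.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score, matthews_corrcoef
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Each row describes one batch of plants with four features derived
    # from the depth-signal time series: absolute amplitude (Aabs),
    # relative amplitude (Arel), drop duration (D), and onset (O).
    n_batches = 200
    X = rng.normal(size=(n_batches, 4))        # placeholder feature matrix
    y = rng.integers(0, 2, size=n_batches)     # 0 = resistant, 1 = susceptible

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y
    )

    # Random Forest, the best-performing model reported in the study.
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)

    y_pred = clf.predict(X_test)
    print(f"accuracy: {accuracy_score(y_test, y_pred):.2f}")
    print(f"MCC: {matthews_corrcoef(y_test, y_pred):+.2f}")

On real data, each batch's feature vector would be computed from the segmented depth-signal time series rather than sampled at random, and performance would be evaluated with cross-validation rather than a single split.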

The study's lead researcher, David ROUSSEAU, asserts that the imaging system, combined with the feature extraction method and classification model, provides a comprehensive pipeline with unparalleled throughput and cost efficiency compared to the state of the art. The system can be deployed as a decision-support tool and is also compatible with standalone operation, with computation performed at the edge in real time.

In conclusion, this study demonstrates the successful automation of plant pathology tests using low-cost depth imaging sensors, achieving 97% accuracy in distinguishing resistant from susceptible plants through cotyledon loss detection. The method is robust to variations in plant density and to inoculation desynchronization, and it processes batches significantly faster than human annotation. Future enhancements could include integrating additional imaging modalities and refining algorithms for broader applicability, promising a rapid, accurate, and cost-effective solution for improving crop resilience and productivity.

###

References

DOI

10.34133/plantphenomics.0204

Original Source URL

https://spj.science.org/doi/10.34133/plantphenomics.0204

Authors

Mathis CORDIER1,2, Pejman RASTI1,3, Cindy TORRES2, and David ROUSSEAU1,*

Affiliations

1 Laboratoire Angevin de Recherche en Ingénierie des Systèmes (LARIS), UMR INRAe-IRHS, Université d'Angers, Angers, 49000, France

2 R&D Artificial Vision and Automation, Vilmorin-Mikado, La Ménitré, 49250, France

3 Centre d'Études et de Recherche pour l'Aide à la Décision (CERADE), ESAIP, Saint-Barthélemy-d'Anjou, 49124, France.

* Address correspondence to: david.rousseau@univ-angers.fr

Funding information

This research was funded by ANRT (Association Nationale de la Recherche et de la Technologie) under grant agreement [2020-1738], the Vilmorin-Mikado company (Limagrain group), and the University of Angers. This work was also supported by the French National Research Agency (ANR) through the Investments for the Future program (PIA), project PHENOME, ANR-11-INBS-0012.

About Plant Phenomics

Plant Phenomics is an Open Access journal published in affiliation with the State Key Laboratory of Crop Genetics & Germplasm Enhancement, Nanjing Agricultural University (NAU), and distributed by the American Association for the Advancement of Science (AAAS). Like all partners participating in the Science Partner Journal program, Plant Phenomics is editorially independent from the Science family of journals. Editorial decisions and scientific activities pursued by the journal's Editorial Board are made independently, based on scientific merit and adhering to the highest standards for accurate and ethical promotion of science. These decisions and activities are in no way influenced by the financial support of NAU, NAU administration, or any other institutions and sponsors. The Editorial Board is solely responsible for all content published in the journal. To learn more about the Science Partner Journal program, visit the SPJ program homepage.


Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.