Revolutionizing root research: harnessing AI for non-destructive in situ root imaging and phenotyping
Nanjing Agricultural University The Academy of Science
Image: Processes related to the pipeline of experiments. Credit: Plant Phenomics
Roots are essential for plant growth, but traditional methods of studying them are resource-intensive and destructive. With advances in image-processing techniques, innovative methods for in situ root studies have emerged, enabling non-destructive root imaging. However, occlusion by soil remains a major challenge in these images: it fragments the root system and destroys its structural integrity, which hampers accurate assessment of root phenotypes. Although deep learning approaches such as SegRoot and ChronoRoot have enhanced root image recognition, issues such as root breakage and soil coverage remain. Advances in image restoration, particularly for in situ root identification, are therefore crucial for accurate root phenotype assessment. Techniques such as generative adversarial networks (GANs) show potential in this area but still require refinement.
In July 2023, Plant Phenomics published a research article entitled “Application of Improved UNet and EnlightenGAN for Segmentation and Reconstruction of in Situ Roots”. In this study, researchers proposed using EnlightenGAN for root reconstruction by manipulating the light intensity in targeted areas. The team had previously developed the RhizoPot platform, which non-destructively captures complete root images. Early work achieved accurate segmentation of roots with DeeplabV3+, but the resulting estimates of root diameter and surface area were inaccurate. Continued research has improved the accuracy of in situ root segmentation, yet small root segments covered by soil still go unidentified.
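EnlightenGAN guides its generator with a self-regularized attention map derived from the input's illumination, so darker regions (here, soil-occluded areas) receive stronger enhancement. The following is a minimal NumPy sketch of that general idea; the function illumination_attention and its exact normalization are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def illumination_attention(rgb: np.ndarray) -> np.ndarray:
    """Illustrative self-regularized attention map in the spirit of
    EnlightenGAN: darker pixels get weights closer to 1, so occluded
    regions are enhanced more strongly.

    rgb: float array in [0, 1] with shape (H, W, 3).
    Returns an attention map with shape (H, W, 1).
    """
    # Approximate per-pixel illumination as the max over color channels.
    illumination = rgb.max(axis=-1, keepdims=True)            # (H, W, 1)
    # Normalize to [0, 1] so globally dark scans are handled consistently.
    denom = illumination.max() - illumination.min() + 1e-8
    illumination = (illumination - illumination.min()) / denom
    # Invert: low illumination -> high attention weight.
    return 1.0 - illumination

# Example: weight generator feature maps by the attention map.
rng = np.random.default_rng(0)
image = rng.random((256, 256, 3))       # stand-in for a root scan
features = rng.random((256, 256, 32))   # stand-in for generator features
attended = features * illumination_attention(image)  # broadcasts over channels
print(attended.shape)                   # (256, 256, 32)
```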
Comparing the deep-learning models UNet, SegNet, and DeeplabV3+ on an original root dataset, the study found that DeeplabV3+ (Xception) had the best overall performance, although each model had its own strengths and weaknesses in root identification. Ablation experiments with various improvements to UNet showed increased performance in both mIoU and F1 scores, suggesting that these modifications successfully addressed the models' limitations. Transfer learning with the improved UNet on the reconstructed root dataset demonstrated good versatility and robustness. EnlightenGAN was used for root generation, with each iteration progressively enhancing the root reconstruction. Phenotypic parameters were analyzed using RhizoVision Explorer software and showed a significant correlation with actual values, although reconstruction did alter root length and surface area. The study's thorough model comparison highlighted DeeplabV3+'s strong overall performance but also noted its limitations in recognizing main roots. The improved UNet was ultimately selected for root segmentation because of its scalability and potential for future enhancement. Finally, the study proposed various combinations of UNet and EnlightenGAN for different purposes, ranging from accurate segmentation to dataset expansion and unsupervised training.
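The mIoU and F1 scores reported in the ablation experiments are standard segmentation metrics. The sketch below shows how they are typically computed for binary root masks; thresholds and averaging details in the paper itself may differ.

```python
import numpy as np

def binary_metrics(pred: np.ndarray, target: np.ndarray):
    """IoU and F1 for a binary root mask.

    pred, target: boolean arrays of the same shape
    (True = root pixel, False = background).
    """
    tp = np.logical_and(pred, target).sum()    # true positives
    fp = np.logical_and(pred, ~target).sum()   # false positives
    fn = np.logical_and(~pred, target).sum()   # false negatives

    iou = tp / (tp + fp + fn + 1e-8)
    precision = tp / (tp + fp + 1e-8)
    recall = tp / (tp + fn + 1e-8)
    f1 = 2 * precision * recall / (precision + recall + 1e-8)
    return iou, f1

def mean_iou(pred: np.ndarray, target: np.ndarray) -> float:
    """mIoU averaged over the two classes (root and background)."""
    root_iou, _ = binary_metrics(pred, target)
    bg_iou, _ = binary_metrics(~pred, ~target)
    return (root_iou + bg_iou) / 2

# Example usage with random stand-in masks:
rng = np.random.default_rng(0)
pred = rng.random((512, 512)) > 0.5
target = rng.random((512, 512)) > 0.5
iou, f1 = binary_metrics(pred, target)
print(f"root IoU={iou:.3f}, F1={f1:.3f}, mIoU={mean_iou(pred, target):.3f}")
```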
Overall, the study demonstrates a significant advance in root reconstruction technology, offering a novel approach to root phenotyping analysis.
###
References
Authors
Qiushi Yu1†, Jingqi Wang1†, Hui Tang1, Jiaxi Zhang1, Wenjie Zhang1, Liantao Liu2*, and Nan Wang1*
Affiliations
1College of Mechanical and Electrical Engineering, Hebei Agricultural University, Baoding 071000, China.
2College of Agronomy, Hebei Agricultural University, Baoding 071000, China.
About Liantao Liu & Nan Wang
Liantao Liu: He is a professor in the College of Agronomy at Hebei Agricultural University. His main area of research is cotton cultivation physiology.
Nan Wang: His research interests include computational intelligence, image processing, and data analysis.