Accurate extraction of phenotypic traits from image data is essential for cereal crop research, but spike detection in greenhouses is challenging due to the environmental and physical similarities between spikes and leaves. Recent efforts include increasing image resolution and feature dimensionality and developing dedicated neural networks such as SpikeSegNet. However, these methods struggle to accurately localize small spikes, and further advances in neural network tuning and novel detection models are needed to overcome these spike detection challenges efficiently.
In January 2024, Plant Phenomics published a research article entitled “High-throughput spike detection in greenhouse cultivated grain crops with attention mechanisms based deep learning models”.
In this study, three deep neural networks (DNNs), FRCNN, FRCNN-A, and the Swin Transformer, were implemented and trained for spike detection in cereal crops. All networks were optimized with the SGD optimizer, with training times varying between models: FRCNN required 900 to 1200 epochs, FRCNN-A 800 to 1000 epochs, and the Swin Transformer 2500 to 3000 epochs. A dynamic learning rate strategy was used to improve model convergence, and the trained models proved effective at detecting spikes of varying difficulty, particularly within dense leaf mass.
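The article is summarized here without its training code; the sketch below illustrates this kind of setup in PyTorch, using torchvision's Faster R-CNN implementation and a StepLR schedule as assumed stand-ins for the paper's FRCNN baseline and its dynamic learning rate strategy (the momentum, weight decay, and decay interval values are illustrative assumptions, not the authors' settings).

import torch
import torchvision

# Hypothetical stand-in for the paper's FRCNN baseline: torchvision's
# Faster R-CNN with a ResNet-50 FPN backbone and two classes (background, spike).
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights=None, num_classes=2)

# SGD optimizer, as in the study; hyperparameter values here are assumptions.
optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9, weight_decay=5e-4)

# One possible "dynamic learning rate" strategy: decay the LR tenfold every
# 300 epochs. The schedule actually used in the paper is not specified here.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=300, gamma=0.1)

def train(model, data_loader, num_epochs):
    model.train()
    for epoch in range(num_epochs):  # e.g. 900 to 1200 epochs for FRCNN
        for images, targets in data_loader:
            # targets: list of dicts with "boxes" (N x 4, xyxy) and "labels" (N,)
            loss_dict = model(images, targets)
            loss = sum(loss_dict.values())
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
        scheduler.step()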
The results showed that the Swin Transformer outperformed the other models in accuracy, even without data transformation or augmentation. The FRCNN-A model, which extends the original FRCNN with an attention module, showed significant improvement over the baseline, suggesting room for further gains in the FRCNN-A architecture. In particular, the attention module's ability to capture the hierarchical context of regions of interest proved effective for detecting challenging spike patterns.
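The exact design of the paper's attention module is not reproduced here; as a generic illustration of how attention can reweight a detector's feature maps, consider a squeeze-and-excitation style channel attention block (a minimal sketch, with all names and parameters below being illustrative assumptions):

import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention. A generic
    illustration, not the specific attention module from the paper."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # global context per channel
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),  # per-channel weights in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w  # emphasize informative channels, suppress others

# Example: reweight a backbone feature map before the detection head.
feat = torch.randn(2, 256, 64, 64)
attn = ChannelAttention(256)
out = attn(feat)  # same shape as the input feature map

Blocks of this kind are cheap to insert into an existing CNN backbone, which is consistent with the article's point that an attention module can lift a Faster R-CNN style detector without a full architectural redesign.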
Training on nine datasets from two phenotyping facilities showed that the accuracy of all models improved as the amount of original image content in the training sets increased. The Swin Transformer achieved the highest mean average precision (mAP) across the different training sets, indicating a superior ability to extract features and detect spikes. However, the study also showed that while the Swin Transformer provides the highest accuracy, FRCNN-A offers a more efficient and faster-to-train alternative, especially beneficial for datasets with similar characteristics.
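mAP is the standard detection metric, averaging precision over recall levels and IoU thresholds. One common way to compute it, shown here purely as an illustration using the torchmetrics library rather than the authors' evaluation code (the box coordinates and label mapping below are made up):

import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# Predicted boxes with confidence scores, and ground truth, for one image.
preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 50.0, 80.0]]),
    "scores": torch.tensor([0.92]),
    "labels": torch.tensor([1]),  # class 1 = spike (assumed label mapping)
}]
target = [{
    "boxes": torch.tensor([[12.0, 11.0, 49.0, 78.0]]),
    "labels": torch.tensor([1]),
}]

metric = MeanAveragePrecision()  # averages AP over IoU thresholds 0.50:0.95
metric.update(preds, target)
print(metric.compute()["map"])   # mean average precision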
The results emphasized the models' adaptability to augmented images and their performance on a dedicated IPK test set, highlighting the potential of these advanced architectures to improve spike detection in mixed wheat varieties. The study concluded that the modified FRCNN-A, with its reduced number of convolutional layers and added attention module, together with the more computationally intensive Swin Transformer, represent significant advances in the detection of small-scale objects in complex optical scenes. These innovations promise improved accuracy and efficiency in phenotyping tasks, although the trade-off between inference time and accuracy remains a consideration for real-time applications.
###
References
Authors
Sajid Ullah*1,2,4, Klara Panzarova4, Martin Trtilek4, Matej Lexa3, Vojtech Macala3, Kerstin Neumann5, Thomas Altmann5, Jan Hejatko1,2, Marketa Pernisova1,2, and Evgeny Gladilin5
Affiliations
1Mendel Centre for Plant Genomics and Proteomics, Central European Institute of Technology (CEITEC), Masaryk University, Brno, Czech Republic
2National Centre for Biomolecular Research, Faculty of Science, Masaryk University, Brno, Czech Republic
3Faculty of Informatics, Masaryk University, Botanicka 68a, Brno, Czech Republic
4Photon Systems Instruments, spol. s r.o., Drasov, Czech Republic
5Leibniz Institute of Plant Genetics and Crop Plant Research, Gatersleben, Germany
Journal
Plant Phenomics
Method of Research
Experimental study
Subject of Research
Not applicable
Article Title
High-throughput spike detection in greenhouse cultivated grain crops with attention mechanisms based deep learning models
Article Publication Date
21-Jan-2024
COI Statement
The authors declare that they have no competing interests.