News Release

Revolutionizing grape yield predictions: the rise of semi-supervised berry counting with CDMENet

Peer-Reviewed Publication

Plant Phenomics

Fig. 1

(A and B) Examples of grape species and manually labeled images in the field grape berry dataset.

Credit: Plant Phenomics

To improve grape yield predictions, automated berry counting has emerged as a crucial yet challenging task due to the dense distribution and occlusion of berries. While grape cultivation is a significant global economic activity, traditional manual counting methods are inaccurate and inefficient. Recent research has shifted towards deep learning and computer vision, employing detection and density estimation techniques for more precise counts. However, these methods grapple with the variability of farmland and high occlusion rates, leading to significant counting errors. Additionally, building high-performance algorithms demands expensive data labeling, a major hurdle to widespread use and adaptability in the field. The need for a cost-effective, accurate automated counting method remains a pressing research problem.

In November 2023, Plant Phenomics published a research article titled “Semi-supervised Counting of Grape Berries in the Field Based on Density Mutual Exclusion”.

To count grape berries effectively and cost-efficiently, the research presents CDMENet, a semi-supervised method that uses VGG16 for image feature extraction and density mutual exclusion to exploit spatial patterns and unlabeled data. The method also employs a density difference loss to amplify feature differences between varying density levels. Conducted on an Ubuntu system with Python and deep learning frameworks, the algorithm's performance was tested using a specified hardware setup, preprocessing techniques, and training details such as learning rates and optimization methods. The results were promising, with CDMENet outperforming both fully and semi-supervised counterparts in terms of Mean Absolute Error (MAE), Root Mean Square Error (RMSE), and the coefficient of determination (R²). Particularly notable was its performance with limited labeled data, showcasing superior accuracy and reduced errors compared to other models. Additionally, the method's robustness was further evidenced in various ablation studies, demonstrating the significant roles of unlabeled data, density difference loss, and the number of auxiliary task predictors. Despite minor influences from prediction confidence thresholds, CDMENet maintained stable and robust performance.
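To give a flavor of the idea, a density difference loss can be thought of as a margin penalty that pushes apart the feature representations of image patches at different berry-density levels. The following is a minimal NumPy sketch under that interpretation; the function name, the single density threshold, and the hinge-margin form are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def density_difference_loss(features, densities, threshold=0.5, margin=1.0):
    """Hinge-style penalty encouraging separation between density levels.

    features  : (N, D) array, one feature vector per image patch
    densities : (N,) array, estimated berry density per patch
    Patches are split into low/high groups at `threshold`; if the mean
    feature vectors of the two groups are closer than `margin`, the
    shortfall is returned as the loss, otherwise the loss is zero.
    """
    low = features[densities < threshold]
    high = features[densities >= threshold]
    if len(low) == 0 or len(high) == 0:
        # only one density level present: nothing to push apart
        return 0.0
    gap = np.linalg.norm(low.mean(axis=0) - high.mean(axis=0))
    return max(0.0, margin - gap)

# toy usage: well-separated groups incur no penalty, close groups do
feats = np.array([[0.0, 0.0], [2.0, 0.0]])
dens = np.array([0.1, 0.9])
print(density_difference_loss(feats, dens))  # 0.0 (gap 2.0 > margin 1.0)
```

In training, a term like this would be added to the usual density-map regression loss, so the network is rewarded both for accurate counts and for keeping distinct density levels distinguishable in feature space.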

In conclusion, CDMENet presents a viable solution for grape berry counting in fields, reducing the need for extensive manual labeling while improving accuracy and reducing errors. Its efficient use of unlabeled data, enhanced feature representation through density difference loss, and overall robust performance against a backdrop of varying conditions underscore its potential as a cost-effective tool in agricultural yield estimation. Future work might explore optimizing loss functions further and deploying the algorithm in field robots or other practical applications.

###

References

Authors

Yanan Li1,2†, Yuling Tang1,2*†, Yifei Liu1,2, and Dingrun Zheng1,2

†These authors contributed equally to this work.

Affiliations

1School of Computer Science and Engineering, School of Artificial Intelligence, Wuhan Institute of Technology, Wuhan 430205, China.

2Hubei Key Laboratory of Intelligent Robot, Wuhan Institute of Technology, Wuhan 430073, China.

About Yanan Li

She is currently a Lecturer with the School of Computer Science and Engineering, Wuhan Institute of Technology, Wuhan. Her research interests include computer vision and machine learning, with particular emphasis on image segmentation, domain adaptation, and various computer vision applications in agriculture.


Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.