Lightweight Safflower Cluster Detection Based on YOLOv5

Author:

Guo Hui1, Wu Tianlun1, Gao Guoming1, Qiu Zhaoxin1, Chen Haiyang1

Affiliation:

1. Xinjiang Agricultural University

Abstract

Safflower detection technology in the field plays a crucial role in automated harvesting and in acquiring row-navigation information. Safflower clusters are small and densely distributed, the inter-row environment is complex, and uneven lighting severely hinders cluster detection. Current safflower cluster detection methods suffer from insufficient accuracy and high computational cost and complexity, which impedes their deployment on automated, intelligent harvesting robots. To address these issues, this study presents SF-YOLO, an enhanced target-detection model that substitutes GhostConv for the conventional convolutional blocks in the backbone network to improve computational efficiency. To strengthen the model's representational ability, the CBAM attention mechanism is embedded in the backbone. A fused L(CIoU+NWD) loss function improves the precision of feature extraction, enables better adaptive fusion, and accelerates loss convergence. Anchor boxes generated by an updated K-means clustering algorithm replace the original COCO-dataset anchors, improving the model's adaptability to multi-scale safflower information across farmland. Data-augmentation techniques such as Gaussian blur, Gaussian noise, sharpening, and channel disruption further improve robustness to changes in illumination, noise, and viewing angle.
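The abstract names the fused L(CIOU+NWD) loss but does not give its exact formulation. Below is a minimal pure-Python sketch of how CIoU and NWD terms are typically combined; the fusion weight `lam` and the NWD normalizing constant `const` are illustrative assumptions, not values from the paper.

```python
import math

EPS = 1e-9  # numerical guard for divisions

def iou(b1, b2):
    """IoU of two boxes in (x1, y1, x2, y2) format."""
    ix1, iy1 = max(b1[0], b2[0]), max(b1[1], b2[1])
    ix2, iy2 = min(b1[2], b2[2]), min(b1[3], b2[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    a1 = (b1[2] - b1[0]) * (b1[3] - b1[1])
    a2 = (b2[2] - b2[0]) * (b2[3] - b2[1])
    return inter / (a1 + a2 - inter + EPS)

def ciou(b1, b2):
    """Complete IoU: IoU minus center-distance and aspect-ratio penalties."""
    i = iou(b1, b2)
    cx1, cy1 = (b1[0] + b1[2]) / 2, (b1[1] + b1[3]) / 2
    cx2, cy2 = (b2[0] + b2[2]) / 2, (b2[1] + b2[3]) / 2
    rho2 = (cx1 - cx2) ** 2 + (cy1 - cy2) ** 2      # squared center distance
    cw = max(b1[2], b2[2]) - min(b1[0], b2[0])      # enclosing-box width
    ch = max(b1[3], b2[3]) - min(b1[1], b2[1])      # enclosing-box height
    c2 = cw ** 2 + ch ** 2 + EPS                    # squared enclosing diagonal
    w1, h1 = b1[2] - b1[0], b1[3] - b1[1]
    w2, h2 = b2[2] - b2[0], b2[3] - b2[1]
    v = (4 / math.pi ** 2) * (math.atan(w2 / h2) - math.atan(w1 / h1)) ** 2
    alpha = v / (1 - i + v + EPS)
    return i - rho2 / c2 - alpha * v

def nwd(b1, b2, const=12.8):
    """Normalized Wasserstein Distance, modeling each box as a 2-D Gaussian."""
    g1 = ((b1[0] + b1[2]) / 2, (b1[1] + b1[3]) / 2,
          (b1[2] - b1[0]) / 2, (b1[3] - b1[1]) / 2)
    g2 = ((b2[0] + b2[2]) / 2, (b2[1] + b2[3]) / 2,
          (b2[2] - b2[0]) / 2, (b2[3] - b2[1]) / 2)
    w2_sq = sum((p - q) ** 2 for p, q in zip(g1, g2))  # squared 2-Wasserstein distance
    return math.exp(-math.sqrt(w2_sq) / const)         # const is dataset-dependent

def fused_loss(pred, target, lam=0.5):
    """Weighted fusion of the CIoU and NWD losses (lam is an assumed weight)."""
    return lam * (1 - ciou(pred, target)) + (1 - lam) * (1 - nwd(pred, target))
```

Because NWD stays smooth for small, barely overlapping boxes while CIoU dominates for larger ones, a fusion of this kind is a common way to keep gradients informative for small targets such as safflower clusters.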
In tests on a self-constructed safflower dataset with complex background information, SF-YOLO surpasses the original YOLOv5s model: GFLOPs decrease from 15.8 to 13.2 and parameters from 7.013 M to 5.34 M (reductions of 16.6% and 23.9%, respectively), while mAP@0.5 improves by 1.3% to 95.3%. The model thus raises safflower detection accuracy in complex farmland environments and serves as a reference for the subsequent development of autonomous navigation and non-destructive harvesting equipment.
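The abstract mentions an updated K-means algorithm for generating anchor boxes but does not detail it. The sketch below shows the standard approach of clustering ground-truth width–height pairs with a 1 − IoU distance; the quantile initialization, iteration budget, and cluster count are illustrative assumptions, not the paper's exact procedure.

```python
def iou_wh(wh, anchor):
    """IoU of two boxes that share a center, given as (w, h) pairs."""
    inter = min(wh[0], anchor[0]) * min(wh[1], anchor[1])
    union = wh[0] * wh[1] + anchor[0] * anchor[1] - inter
    return inter / union

def kmeans_anchors(whs, k=9, iters=300):
    """Cluster ground-truth (w, h) pairs into k anchors using 1 - IoU distance."""
    # Deterministic init: pick seeds at evenly spaced area quantiles.
    by_area = sorted(whs, key=lambda wh: wh[0] * wh[1])
    anchors = [by_area[int((i + 0.5) * len(by_area) / k)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for wh in whs:
            # Assign each box to the anchor it overlaps most (smallest 1 - IoU).
            best = max(range(k), key=lambda i: iou_wh(wh, anchors[i]))
            clusters[best].append(wh)
        # Update each anchor to its cluster mean; keep it if the cluster emptied.
        new = [
            (sum(w for w, _ in cl) / len(cl), sum(h for _, h in cl) / len(cl))
            if cl else anchors[i]
            for i, cl in enumerate(clusters)
        ]
        if new == anchors:  # converged
            break
        anchors = new
    return sorted(anchors, key=lambda a: a[0] * a[1])  # small to large
```

Refitting anchors to the dataset's own box statistics, rather than reusing the COCO defaults, is what lets the detector match the scale distribution of safflower clusters in field imagery.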

Publisher

Research Square Platform LLC

