Authors:
Shang Gaogao, Liu Gang, Zhu Peng, Han Jiangyi, Xia Changgao, Jiang Kun
Abstract
Recognition of the orchard environment is a prerequisite for autonomous operation of intelligent horticultural tractors. Because the environment is complex and traditional machine-vision algorithms depend heavily on ambient light, their recognition accuracy is limited. A deep residual U-type network is more effective in this setting: in an orchard, it can semantically segment trees, drivable roads, debris, and other objects. The basic structure of the network is a U-type network, with residual learning added to the encoding and bottleneck layers. Firstly, residual modules increase the network depth, enhance the fusion of semantic information across levels, and improve feature expression capability and recognition accuracy. Secondly, the decoding layer uses up-sampling for feature mapping, which is simple and fast. Thirdly, skip connections fuse the semantic information of the encoding layer, reducing the number of network parameters and accelerating training. Finally, the network was built with the PyTorch deep learning framework, trained on the data set, and compared with a fully convolutional network, a U-type network, and the Front-end+Large network. The results show that the deep residual U-type network achieves the highest recognition accuracy, averaging 85.95%, making it more suitable for environment recognition in orchards.
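The sketch below illustrates, in PyTorch, the kind of architecture the abstract describes: residual blocks in the encoder and bottleneck, up-sampling in the decoder, and skip connections that concatenate encoder features. Channel widths, the number of encoder levels, the bilinear up-sampling mode, and the four-class output are illustrative assumptions rather than the authors' exact configuration.

```python
# Minimal residual U-type network sketch (hypothetical configuration,
# not the paper's exact layer/channel settings).
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with an identity (or 1x1-projected) shortcut."""

    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
        )
        # Project the shortcut when the channel count changes.
        self.shortcut = nn.Identity() if in_ch == out_ch else nn.Conv2d(in_ch, out_ch, 1)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(self.body(x) + self.shortcut(x))


class ResUNet(nn.Module):
    """U-shaped encoder-decoder: residual blocks in the encoder and bottleneck,
    up-sampling plus skip connections in the decoder, per-pixel class logits."""

    def __init__(self, in_ch=3, num_classes=4):
        super().__init__()
        self.enc1 = ResidualBlock(in_ch, 64)
        self.enc2 = ResidualBlock(64, 128)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = ResidualBlock(128, 256)
        self.up2 = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.dec2 = ResidualBlock(256 + 128, 128)
        self.up1 = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.dec1 = ResidualBlock(128 + 64, 64)
        self.head = nn.Conv2d(64, num_classes, 1)

    def forward(self, x):
        e1 = self.enc1(x)                      # encoder level 1
        e2 = self.enc2(self.pool(e1))          # encoder level 2
        b = self.bottleneck(self.pool(e2))     # bottleneck
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))   # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # skip connection
        return self.head(d1)                   # per-pixel segmentation logits


if __name__ == "__main__":
    model = ResUNet(in_ch=3, num_classes=4)
    logits = model(torch.randn(1, 3, 256, 256))
    print(logits.shape)  # torch.Size([1, 4, 256, 256])
```

Concatenating encoder features at each decoder level is what the abstract calls fusing the semantic information of the coding layer through skip connections; using parameter-free up-sampling instead of transposed convolutions keeps the decoder light and fast.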
Funder
Jiangsu Provincial Key Research and Development Program
Subject
Fluid Flow and Transfer Processes, Computer Science Applications, Process Chemistry and Technology, General Engineering, Instrumentation, General Materials Science
Cited by
6 articles.