Abstract
This paper presents a state-of-the-art light detection and ranging (LiDAR) based autonomous navigation system for under-canopy agricultural robots. Under-canopy agricultural navigation has been a challenging problem because global navigation satellite system (GNSS) receivers and other positioning sensors are prone to loss of accuracy due to attenuation and multi-path errors caused by crop leaves and stems. Reactive navigation by detecting crop rows from LiDAR measurements has proved to be an efficient alternative to GNSS. Nevertheless, it presents challenges due to occlusion from leaves under the canopy. Our system addresses these issues by fusing inertial measurement unit (IMU) and LiDAR measurements in a Bayesian framework on low-cost hardware. In addition, a local goal generator (LGG) is introduced to provide a local reference trajectory to the onboard controller. Our system is validated extensively in real-world field environments over a total distance of 50.88 km, on multiple robots, in different field conditions, and across different locations. We report leading distance-between-interventions results for LiDAR+IMU-based under-canopy navigation, showing that our system can safely navigate without intervention for 386.9 m on average in fields without significant gaps in the crop rows.
Publisher
Field Robotics Publication Society
Cited by
5 articles.