Affiliation:
1. Faculty of Robot Science and Engineering, Northeastern University, Shenyang, China
2. State Key Laboratory of Industrial Control Technology, College of Control Science and Engineering, Zhejiang University, Hangzhou 310058, China
Abstract
This paper presents a real-time, low-cost 3D perception and reconstruction system suitable for autonomous navigation and large-scale environment reconstruction. The 3D mapping system is based on a rotating 2D planar laser scanner driven by a stepper motor, which makes it suitable for continuous mapping. The challenge for such a continuous mapping system is that range measurements are received at different times while the 3D LiDAR is moving, which significantly distorts the local 3D point cloud. Errors in motion estimation can therefore cause misregistration of the resulting point cloud. To continuously estimate the trajectory of the sensor, we first extract feature points from the local point cloud and then estimate the transformation from the current frame to the local map to obtain the LiDAR odometry. We then use the estimated motion to remove the distortion from the local point cloud and register the undistorted local point cloud to the global point cloud to obtain an accurate global map. Finally, we propose a coarse-to-fine graph optimization method to minimize global drift. The proposed 3D sensor system is advantageous due to its mechanical simplicity, mobility, low weight, low cost, and real-time estimation. To validate the performance of the proposed system, we carried out several experiments to verify its accuracy, robustness, and efficiency. The experimental results show that our system can accurately estimate the trajectory of the sensor and simultaneously build a high-quality 3D point cloud map.
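The abstract describes removing motion distortion ("deskewing") by re-projecting each range measurement using the estimated sensor motion. For illustration only, below is a minimal Python sketch of that step, assuming per-point timestamps and a constant-velocity motion model over one sweep; the paper does not publish its implementation, so the function name and parameters here are hypothetical.

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def deskew_sweep(points, times, R0, t0, R1, t1):
    """Undistort one LiDAR sweep under a constant-velocity assumption.

    points : (N, 3) raw points, each measured at a different time
    times  : (N,) per-point timestamps normalized to [0, 1] over the sweep
    (R0, t0), (R1, t1) : estimated sensor poses at the sweep start and end,
                         e.g. from the LiDAR odometry step

    Returns the points re-projected into a common (world) frame, so the
    distortion caused by sensor motion during the sweep is removed.
    """
    # Interpolate rotation by slerp and translation linearly between the
    # start and end poses -- the constant-velocity assumption.
    key_rots = Rotation.from_quat(np.stack([R0.as_quat(), R1.as_quat()]))
    slerp = Slerp([0.0, 1.0], key_rots)
    out = np.empty_like(points)
    for i, (p, s) in enumerate(zip(points, times)):
        R_s = slerp(s)                        # sensor rotation at time s
        t_s = (1.0 - s) * np.asarray(t0) + s * np.asarray(t1)
        out[i] = R_s.apply(p) + t_s           # re-project into the world frame
    return out

# Example: a sweep with a small yaw and 0.1 m of forward motion
R0 = Rotation.identity()
R1 = Rotation.from_euler("z", 2.0, degrees=True)
pts = np.random.rand(1000, 3)
ts = np.linspace(0.0, 1.0, 1000)
undistorted = deskew_sweep(pts, ts, R0, np.zeros(3), R1, np.array([0.1, 0.0, 0.0]))
```

In a real system the per-point timestamps come from the scanner's rotation angle and the motor encoder, and the end-of-sweep pose is refined iteratively together with the scan-to-map registration.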
Funder
Doctoral Scientific Research Foundation of Liaoning Province
Subject
Electrical and Electronic Engineering, Instrumentation, Control and Systems Engineering
Cited by
28 articles.