Affiliation:
1. School of Geospatial Information, Information Engineering University, Zhengzhou 450001, China
Abstract
With the continuous development and popularization of sensor-fusion technology for mobile robots, fusion perception based on cameras and light detection and ranging (LiDAR) has become particularly important, and the calibration of the extrinsic parameters between the camera and the LiDAR is a crucial prerequisite for fusion. Although traditional target-based calibration methods have been widely adopted, their cumbersome operation and high cost necessitate more efficient and flexible alternatives. To address this problem, this study proposed a two-stage calibration method based on motion and edge matching. In the first stage, the extrinsic parameters between the camera and the LiDAR were initially estimated by matching visual odometry and LiDAR odometry through hand–eye calibration. In the second stage, the first-stage result was refined by matching image edges with point clouds at depth discontinuities. The calibration system was then tested in both simulated and real-world environments. The experimental results showed that the method, which requires no specially structured target, achieved highly flexible and robust automated calibration and higher accuracy than other state-of-the-art methods.
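The first stage described above amounts to a classical hand–eye formulation: paired relative motions A_i from visual odometry and B_i from LiDAR odometry are related to the unknown camera–LiDAR extrinsic X by A_i X = X B_i. The following Python sketch illustrates one common way to solve this (rotation by aligning rotation axes with an SVD/Kabsch step, translation by stacked least squares). It assumes the motion pairs are available as 4x4 homogeneous matrices with sufficient rotational excitation; the function and variable names (hand_eye_calibrate, cam_motions, lidar_motions) are illustrative and not taken from the paper, and the second-stage edge-based refinement is not shown.

# Minimal sketch of the first calibration stage: hand-eye estimation of the
# camera-LiDAR extrinsic X from paired relative motions A_i (visual odometry)
# and B_i (LiDAR odometry) satisfying A_i X = X B_i.
# Illustrative only; names and structure are assumptions, not the paper's code.
import numpy as np
from scipy.spatial.transform import Rotation as Rot

def hand_eye_calibrate(cam_motions, lidar_motions):
    """cam_motions, lidar_motions: lists of 4x4 relative poses over the same time intervals."""
    # Rotation: A_i X = X B_i implies the rotation axis of A_i equals R_X times
    # the axis of B_i, so align the axis sets with a Kabsch/SVD solution.
    a_axes = np.array([Rot.from_matrix(A[:3, :3]).as_rotvec() for A in cam_motions])
    b_axes = np.array([Rot.from_matrix(B[:3, :3]).as_rotvec() for B in lidar_motions])
    H = b_axes.T @ a_axes                      # 3x3 cross-covariance of axis pairs
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R_X = Vt.T @ D @ U.T

    # Translation: (R_Ai - I) t_X = R_X t_Bi - t_Ai, stacked over all pairs and
    # solved in the least-squares sense.
    M = np.vstack([A[:3, :3] - np.eye(3) for A in cam_motions])
    v = np.hstack([R_X @ B[:3, 3] - A[:3, 3]
                   for A, B in zip(cam_motions, lidar_motions)])
    t_X, *_ = np.linalg.lstsq(M, v, rcond=None)

    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R_X, t_X
    return X  # initial extrinsic, to be refined by edge matching in stage two

In practice such an initial estimate only needs to be accurate enough for the second stage, where LiDAR points at depth discontinuities are projected into the image and the extrinsic is refined by minimizing their distance to detected image edges.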
Funder
National Natural Science Foundation of China
Program of Song Shan Laboratory
Subject
General Earth and Planetary Sciences
Cited by
3 articles.