Abstract
Autonomous navigation in large-scale, complex environments without a GPS signal is a fundamental challenge in a variety of applications. Since 3-D scans are inherently robust to changes in ambient illumination and surface texture, we present Point Cloud Map-based Navigation (PCMN), a robust robot navigation system based exclusively on 3-D point cloud registration between an acquired observation and a stored reference map. It provides a drift-free navigation solution equipped with a failed-registration detection capability. The backbone of the navigation system is a robust method for registering the acquired observation to the stored reference map. The proposed registration algorithm follows a hypothesis generation and evaluation paradigm, in which multiple statistically independent hypotheses are generated from local neighborhoods of putative matching points. The hypotheses are then evaluated by a multiple-consensus analysis that combines an evaluation of point cloud feature correlation with a consensus test on the special Euclidean group SE(3), based on independent hypothesized estimates. PCMN is shown to achieve significantly better performance than state-of-the-art methods in both place recognition recall and localization accuracy, reaching sub-mesh-resolution accuracy in both indoor and outdoor settings.
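The hypothesize-and-evaluate paradigm described above can be sketched in a few lines. The following is a minimal illustrative example, not the authors' implementation: rigid-transform hypotheses are estimated from small subsets of putative point matches via the Kabsch/Procrustes solver (standing in for the paper's local-neighborhood estimates), and each hypothesis is scored by a consensus test on SE(3), winning if many independently generated hypotheses agree with it up to a rotation/translation tolerance. All function names and tolerances here are illustrative assumptions.

```python
import numpy as np

def kabsch(P, Q):
    """Best rigid transform (R, t) mapping rows of P onto rows of Q."""
    cP, cQ = P.mean(0), Q.mean(0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cQ - R @ cP

def se3_consensus(hyps, rot_tol=0.05, trans_tol=0.1):
    """Index of the hypothesis that most other hypotheses agree with on SE(3)."""
    votes = []
    for Ri, ti in hyps:
        n = 0
        for Rj, tj in hyps:
            # geodesic rotation distance + Euclidean translation distance
            c = np.clip((np.trace(Ri.T @ Rj) - 1) / 2, -1.0, 1.0)
            if np.arccos(c) < rot_tol and np.linalg.norm(ti - tj) < trans_tol:
                n += 1
        votes.append(n)
    return int(np.argmax(votes))

# Synthetic demo: 50 putative matches, 30% of them outliers.
rng = np.random.default_rng(0)
src = rng.normal(size=(50, 3))
a = np.deg2rad(30)                               # ground truth: 30 deg about z
R_gt = np.array([[np.cos(a), -np.sin(a), 0.0],
                 [np.sin(a),  np.cos(a), 0.0],
                 [0.0,        0.0,       1.0]])
t_gt = np.array([1.0, 2.0, 3.0])
dst = src @ R_gt.T + t_gt
dst[:15] = rng.normal(size=(15, 3))              # corrupt 15 matches

# Generate many independent hypotheses from random match triplets.
hyps = []
for _ in range(200):
    idx = rng.choice(len(src), 3, replace=False)
    hyps.append(kabsch(src[idx], dst[idx]))

R, t = hyps[se3_consensus(hyps)]
print("rotation recovered:", np.allclose(R, R_gt, atol=1e-6))
print("translation recovered:", np.allclose(t, t_gt, atol=1e-6))
```

Because all-inlier triplets reproduce the same transform while outlier-contaminated triplets scatter over SE(3), the consensus vote concentrates on the correct hypothesis; the paper's method additionally evaluates point cloud feature correlation, which this sketch omits.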
Funder
Israel Innovation Authority
Publisher
Springer Science and Business Media LLC