Author:
Zhang W., Wang S., Haala N.
Abstract
Mobile robots are being increasingly employed in various indoor scenarios. The fundamental prerequisite is that the robot can reconstruct an accurate and complete map of the observed environment and estimate the trajectory of its movement within this map. Current visual SLAM methods can perform this task reasonably well, but mostly in small spaces, such as a single room, and they are often only tested in well-textured environments. In real-world applications involving large indoor scenes, they lack robustness and fail to build a globally consistent map. To this end, we propose a novel system that robustly addresses the problems encountered by existing visual SLAM methods, such as weak texture and long-term drift. By combining information from a wheel odometer, the robot poses can be predicted smoothly in the absence of texture. Geometric cues are leveraged by aligning Truncated Signed Distance Function (TSDF) based submaps to minimize long-term drift. To reconstruct a more complete and accurate dense map, we refine the sensor depth maps by taking advantage of color information and the optimization result of global bundle adjustment. As a result, the system can provide precise trajectory estimation and a globally consistent map for downstream tasks. We validate the accuracy and robustness of the proposed method on both public and self-collected datasets and show the complementary nature of each module. Evaluation results based on high-precision ground truth show an improvement in the mean Absolute Trajectory Error (ATE) from 21 cm to 2 cm for the trajectory estimation, and the reconstructed map has a mean accuracy of 8 cm.
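As a rough illustration of the ATE metric reported above, the following is a minimal Python sketch (not code from the paper): it rigidly aligns time-associated estimated and ground-truth camera positions with a Kabsch/Umeyama-style SE(3) fit, then reports the mean translational error. The function name mean_ate and the array inputs are assumptions for illustration only.

    import numpy as np

    def mean_ate(est: np.ndarray, gt: np.ndarray) -> float:
        """est, gt: (N, 3) arrays of time-associated camera positions in metres."""
        mu_est, mu_gt = est.mean(axis=0), gt.mean(axis=0)
        # Cross-covariance of the centred point sets.
        H = (est - mu_est).T @ (gt - mu_gt)
        U, _, Vt = np.linalg.svd(H)
        # Rotation mapping the estimate onto the ground truth (reflection-safe).
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = mu_gt - R @ mu_est
        aligned = est @ R.T + t
        # Mean Euclidean distance between aligned estimate and ground truth.
        return float(np.linalg.norm(aligned - gt, axis=1).mean())

    # Usage: mean_ate(estimated_xyz, groundtruth_xyz) -> e.g. 0.02 for a 2 cm mean ATE.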
Cited by
2 articles.
1. SLAM for Indoor Mapping of Wide Area Construction Environments;ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences;2024-06-10
2. BAMF-SLAM: Bundle Adjusted Multi-Fisheye Visual-Inertial SLAM Using Recurrent Field Transforms;2023 IEEE International Conference on Robotics and Automation (ICRA);2023-05-29