VID-SLAM: Robust Pose Estimation with RGBD-Inertial Input for Indoor Robotic Localization
Published: 2024-01-11
Volume: 13
Issue: 2
Page: 318
ISSN: 2079-9292
Container-title: Electronics
Language: en
Authors:
Shan Dan 1,2, Su Jinhe 3, Wang Xiaofeng 1, Liu Yujun 3, Zhou Taojian 3, Wu Zebiao 3
Affiliations:
1. School of Mechatronical Engineering, Beijing Institute of Technology, Beijing 100081, China
2. School of Electrical and Control Engineering, Shenyang Jianzhu University, Shenyang 110168, China
3. Computer Engineering College, Jimei University, Xiamen 360121, China
Abstract
This study proposes a tightly coupled multi-sensor Simultaneous Localization and Mapping (SLAM) framework that integrates RGB-D and inertial measurements to achieve highly accurate six-degree-of-freedom (6-DOF) metric localization in a variety of environments. By jointly considering geometric consistency, Inertial Measurement Unit (IMU) constraints, and visual re-projection errors, we present visual-inertial-depth odometry (VIDO), an efficient state-estimation back-end that minimises the cascading losses of all factors. Existing visual-inertial odometry systems rely on visual feature-based constraints to eliminate the translational displacement and angular drift produced by IMU noise. To complement these constraints, we introduce the iterative closest point (ICP) error of adjacent frames and update the state vectors of observed frames by minimising the estimation errors of all sensors. Moreover, a closed-loop module further optimises the global pose graph to correct long-term drift. For the experiments, we collected an RGB-D-inertial data set for a comprehensive evaluation of VID-SLAM; it contains RGB-D image pairs, IMU measurements, and two types of ground-truth data. The experimental results show that VID-SLAM achieves state-of-the-art positioning accuracy and outperforms mainstream vSLAM solutions, including ElasticFusion, ORB-SLAM2, and VINS-Mono.
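The back-end described above minimises a combined cost over visual re-projection, IMU, and ICP residuals. The following is a minimal sketch of that idea; all function names, signatures, and the simple point-to-point ICP and state-difference IMU residuals are illustrative assumptions, not the paper's actual formulation (which uses full preintegration and iterative optimisation).

```python
import numpy as np

def reprojection_residual(landmark_cam, observed_uv, fx, fy, cx, cy):
    """Pinhole re-projection error for one landmark given in camera coordinates."""
    x, y, z = landmark_cam
    projected = np.array([fx * x / z + cx, fy * y / z + cy])
    return projected - observed_uv

def imu_residual(predicted_state, preintegrated_state):
    """Difference between the estimated motion and the IMU-preintegrated motion
    (stand-in for a full preintegration residual)."""
    return np.asarray(predicted_state) - np.asarray(preintegrated_state)

def icp_residual(points_prev, points_curr, R, t):
    """Point-to-point ICP error between matched depth points of adjacent frames,
    after transforming the previous frame's points by the relative pose (R, t)."""
    transformed = points_prev @ R.T + t
    return (transformed - points_curr).ravel()

def total_cost(residual_blocks, weights):
    """Weighted sum of squared residuals: the scalar a tightly coupled
    back-end would minimise over all state vectors."""
    return sum(w * float(r @ r) for r, w in zip(residual_blocks, weights))
```

A solver such as Gauss-Newton or Levenberg-Marquardt would then iterate on the state vectors (poses, velocities, biases, landmarks) to drive `total_cost` down; the weights play the role of the inverse measurement covariances.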
Funders:
National Natural Science Foundation of China
Shenyang Science and Technology Project
Educational Department of Liaoning Provincial Basic Research Project
Cited by: 2 articles