Author:
Yao Erliang, Zhang Hexin, Song Haitao, Zhang Guoliang
Abstract
Purpose
To realize stable and precise localization in dynamic environments, the authors propose a fast and robust visual odometry (VO) approach aided by a low-cost inertial measurement unit (IMU).
Design/methodology/approach
The proposed VO combines the direct method with the indirect method to track features and optimize the camera pose. The positions of the tracked pixels are initialized with IMU information and then refined by minimizing photometric errors. Because the indirect method has a small convergence radius, dynamic pixels are rejected. The camera pose is subsequently optimized by minimizing reprojection errors, and frames containing little dynamic information are selected as keyframes. Finally, local bundle adjustment refines the keyframe poses and the positions of the 3-D points.
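The rejection and pose-optimization steps above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes a 1-D translation-only pose model, a simple pinhole projection, and invented names (`reject_dynamic`, `refine_pose`); correspondences whose reprojection error under the IMU-predicted pose exceeds a threshold are treated as dynamic pixels and discarded before the pose is re-estimated from the remaining matches.

```python
def project(pt, tx, fx=500.0, cx=320.0):
    # Pinhole projection of the x image coordinate after translating
    # the camera by tx along the x-axis.
    X, _, Z = pt
    return fx * (X + tx) / Z + cx

def reject_dynamic(points, observed, tx_pred, thresh=5.0):
    # Drop correspondences whose reprojection error under the
    # IMU-predicted pose tx_pred exceeds thresh (dynamic pixels).
    return [(p, u) for p, u in zip(points, observed)
            if abs(project(p, tx_pred) - u) < thresh]

def refine_pose(matches, fx=500.0, cx=320.0):
    # Least-squares pose update minimizing reprojection error; the
    # residual is linear in tx here, so one Gauss-Newton step is exact.
    num = den = 0.0
    for (X, _, Z), u in matches:
        a = fx / Z                   # Jacobian d(residual)/d(tx)
        b = u - (fx * X / Z + cx)    # observation minus zero-tx prediction
        num += a * b
        den += a * a
    return num / den

# Three static landmarks plus one dynamic landmark whose observation is
# shifted by 20 px because the object itself moved between frames.
points   = [(1.0, 0.0, 5.0), (-1.0, 0.0, 4.0), (0.5, 0.0, 2.0), (2.0, 0.0, 5.0)]
true_tx  = 0.2
observed = [project(p, true_tx) for p in points[:3]]
observed.append(project(points[3], true_tx) + 20.0)

matches = reject_dynamic(points, observed, tx_pred=0.19)  # IMU-predicted pose
tx_hat  = refine_pose(matches)
print(len(matches), round(tx_hat, 6))  # -> 3 0.2
```

The full method optimizes a 6-DoF pose over both photometric and reprojection terms; the sketch keeps only the outlier-gating-then-refine structure.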
Findings
The proposed VO approach is evaluated experimentally in dynamic environments with various motion types; the results suggest that it achieves more accurate and stable localization than the conventional approach. Moreover, the proposed approach works well in environments with motion blur.
Originality/value
The proposed approach fuses the indirect method and the direct method with the IMU information, which improves the localization in dynamic environments significantly.
Subject
Industrial and Manufacturing Engineering, Computer Science Applications, Control and Systems Engineering
Cited by: 6 articles.