Abstract
Visual simultaneous localization and mapping (SLAM) is a core enabling technology for unmanned systems. Most current visual SLAM methods rely on the static-environment assumption, so dynamic objects in the camera’s field of view seriously degrade their performance. In view of this, an RGB-D SLAM approach based on probability observations and clustering optimization for highly dynamic environments is proposed, which effectively eliminates the influence of dynamic objects and accurately estimates the ego-motion of an RGB-D camera. The method contains a dual static map point detection strategy carried out simultaneously on the current and previous frames. First, to enhance tracking robustness in highly dynamic environments, the probabilities of map points being static, computed from both reprojection deviation and intensity deviation, are used to weight the cost function for pose estimation. Meanwhile, taking the previous frames as a reference, a static velocity probability based on sparse scene flow is acquired to preliminarily recognize static map points and further improve tracking accuracy. Then, an improved map point optimization strategy based on K-means clustering is designed, which exploits the clustering algorithm to refine the static map point labels while mitigating its inherent drawbacks. Finally, experimental results on the TUM dataset and in real scenes, compared with state-of-the-art visual SLAM methods, show that the proposed method achieves extremely robust and accurate camera pose estimation in highly dynamic environments.
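The abstract names two concrete mechanisms: a pose-estimation cost function weighted by per-point static probabilities, and a K-means-based refinement of static map point labels. The sketch below is a minimal, illustrative rendering of those two ideas in Python with NumPy only; the function names, the cluster count k, the relabeling threshold of 0.5, and the random placeholder probabilities are assumptions for illustration, not the authors' implementation (where the probabilities come from reprojection and intensity deviations against previous frames).

```python
# Hedged sketch (not the paper's code): weight each map point's residual by its
# estimated static probability, then refine static/dynamic labels cluster-wise.
import numpy as np

def weighted_pose_cost(residuals, p_static):
    """Sum of squared reprojection/intensity residuals, each weighted by the
    probability that the corresponding map point is static (in [0, 1])."""
    return np.sum(p_static * residuals ** 2)

def kmeans_refine_labels(points_3d, p_static, k=8, iters=20, thresh=0.5):
    """Cluster map points with plain K-means, then relabel each whole cluster
    as static/dynamic by the mean static probability of its members."""
    rng = np.random.default_rng(0)
    centers = points_3d[rng.choice(len(points_3d), k, replace=False)]
    for _ in range(iters):
        # Assign every point to its nearest cluster center.
        d = np.linalg.norm(points_3d[:, None, :] - centers[None, :, :], axis=2)
        assign = d.argmin(axis=1)
        # Recompute centers from their assigned points.
        for c in range(k):
            members = points_3d[assign == c]
            if len(members):
                centers[c] = members.mean(axis=0)
    refined = np.zeros(len(points_3d), dtype=bool)
    for c in range(k):
        mask = assign == c
        if mask.any():
            refined[mask] = p_static[mask].mean() > thresh
    return refined

# Toy usage with random placeholder data (100 map points).
pts = np.random.rand(100, 3)
p = np.random.rand(100)          # placeholder static probabilities
res = np.random.rand(100)        # placeholder residuals
print(weighted_pose_cost(res, p), kmeans_refine_labels(pts, p).sum())
```

Relabeling an entire cluster at once illustrates the intuition of the clustering step: points belonging to the same rigid object tend to share one motion state, so cluster-level labels are less noisy than per-point decisions.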
Funder
Key-Area Research and Development Program of Guangdong Province under Grants
Zhuhai Industry University Research Cooperation Project
Subject
Applied Mathematics, Instrumentation, Engineering (miscellaneous)
Cited by
1 article.