Easy Rocap: A Low-Cost and Easy-to-Use Motion Capture System for Drones
Authors:
Wang Haoyu 1,2,3, Chen Chi 1,2,3, He Yong 1,2,3, Sun Shangzhe 1,2,3, Li Liuchun 4, Xu Yuhang 1,2,3, Yang Bisheng 1,2,3
Affiliation:
1. State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430072, China
2. Engineering Research Centre for Spatio-Temporal Data Acquisition and Smart Application (STSA), Ministry of Education in China, Wuhan 430072, China
3. Institute of Artificial Intelligence in Geomatics, Wuhan University, Wuhan 430072, China
4. Institute of Artificial Intelligence, School of Computer Science, Wuhan University, Wuhan 430072, China
Abstract
Fast and accurate pose estimation is essential for the local motion control of robots such as drones. At present, camera-based motion capture (Mocap) systems are the most widely used for robots; however, they are easily affected by lighting noise and camera occlusion, and common commercial Mocap systems are expensive. To address these challenges, we propose Easy Rocap, a low-cost, open-source robot motion capture system that quickly and robustly captures the accurate position and orientation of a robot. Firstly, a real-time object detector is trained, and an object-filtering algorithm based on class and confidence is designed to eliminate false detections. Secondly, multiple-object tracking (MOT) is applied to maintain trajectory continuity, and the epipolar constraint is applied to establish multi-view correspondences. Finally, the calibrated multi-view cameras are used to compute the 3D coordinates of the markers and estimate the 3D pose of the target robot. The system consumes real-time multi-camera data streams, making it easy to integrate into a robot system. In simulation experiments, the average position estimation error of the method is less than 0.008 m, and the average orientation error is less than 0.65 degrees. In real-scene experiments, we compared the localization results of our method with those of a state-of-the-art LiDAR-Inertial Simultaneous Localization and Mapping (SLAM) algorithm. The results show that the SLAM trajectory drifts during turns, whereas our method avoids such drift and accumulated error, yielding a more stable and accurate trajectory. In addition, the pose estimation of our system runs at up to 30 Hz.
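The last two steps of the pipeline above, checking cross-view correspondences with the epipolar constraint and computing 3D marker coordinates from calibrated multi-view cameras, can be sketched with standard multi-view geometry. The sketch below is illustrative only, not the authors' implementation: it assumes normalized camera coordinates, known 3x4 projection matrices from calibration, and a fundamental/essential matrix `F` between two views, and uses the usual Direct Linear Transform (DLT) triangulation.

```python
import numpy as np

def epipolar_residual(F, x1, x2):
    """Algebraic epipolar residual |x2^T F x1| for a candidate
    cross-view match; near zero for a correct correspondence."""
    h1 = np.append(x1, 1.0)  # homogeneous coordinates, view 1
    h2 = np.append(x2, 1.0)  # homogeneous coordinates, view 2
    return abs(h2 @ F @ h1)

def triangulate_dlt(proj_mats, points_2d):
    """Triangulate one marker's 3D position from its 2D detections
    in several calibrated views via the Direct Linear Transform.

    proj_mats: list of 3x4 projection matrices P = K [R | t]
    points_2d: list of (u, v) image coordinates, one per view
    """
    rows = []
    for P, (u, v) in zip(proj_mats, points_2d):
        # Each view contributes two linear constraints on the
        # homogeneous point X: u*(P[2].X) = P[0].X, v*(P[2].X) = P[1].X
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.asarray(rows)
    # Least-squares solution: right singular vector of the
    # smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # dehomogenize
```

With two or more views whose matches pass the epipolar check, `triangulate_dlt` recovers each marker's 3D coordinate; the robot's 6-DoF pose can then be estimated from the known marker layout.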
Funder
National Natural Science Foundation of China
National Key Research and Development Program of China
Natural Science Foundation of Hubei, China
Key Research and Development Program of Hubei Province
National Key Research and Development Program: Key Special Projects for International Cooperation in Science and Technology Innovation between Governments
Fundamental Research Funds for the Central Universities
European Union's Horizon 2020 Research and Innovation Program
Cited by: 1 article.