A Low-Cost and Robust Multi-Sensor Data Fusion Scheme for Heterogeneous Multi-Robot Cooperative Positioning in Indoor Environments

Author:

Cai Zhi 1,2, Liu Jiahang 1, Chi Weijian 1 (ORCID), Zhang Bo 3

Affiliation:

1. College of Astronautics, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China

2. School of Aeronautic Engineering, Nanjing Vocational University of Industry Technology, Nanjing 210023, China

3. School of Network and Communication, Nanjing Vocational College of Information Technology, Nanjing 210023, China

Abstract

Recent developments in multi-robot collaborative systems have raised the requirements for multi-sensor fusion localization. Current positioning methods focus mainly on fusing a carrier's own sensor information; fully exploiting the information shared among multiple robots to achieve high-precision positioning remains a major challenge. The difficulty of high-precision collaborative positioning is further exacerbated by the combined effects of the poor performance and heterogeneity of the sensors used in commercial robots, complex computations, and the accumulation of environment-induced errors. To address this challenge, we propose a low-cost and robust multi-sensor data fusion scheme for heterogeneous multi-robot collaborative navigation in indoor environments, which integrates data from inertial measurement units (IMUs), laser rangefinders, cameras, and other sensors. Based on Discrete Kalman Filter (DKF) and Extended Kalman Filter (EKF) principles, a three-step joint filtering model improves the state estimation, and the visual data are processed with the YOLO deep learning object detection algorithm before updating the integrated filter. The proposed integration is tested at multiple levels in an open indoor environment following various formation paths. The results show that the three-dimensional root mean square error (RMSE) of indoor cooperative localization is 11.3 mm, the maximum error is below 21.4 mm, and motion errors in occluded environments are suppressed. The proposed fusion scheme satisfies the localization accuracy requirements for efficient, coordinated motion of autonomous mobile robots.
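To illustrate the EKF building block the abstract refers to, the sketch below shows one predict/update cycle for a single robot: a constant-velocity motion model (driven in practice by IMU data) corrected by a nonlinear range measurement, such as a laser rangefinder reading to a known anchor or cooperating robot. This is a minimal hypothetical example, not the paper's three-step joint filter; the state layout, motion model, and anchor-range measurement are assumptions made for illustration.

```python
import numpy as np

def ekf_step(x, P, z, dt, anchor, Q, R):
    """One EKF predict/update cycle (illustrative sketch).

    x: state [px, py, vx, vy]; P: state covariance;
    z: measured range to a known anchor; Q, R: process/measurement noise.
    """
    # --- Predict with a constant-velocity motion model ---
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    x = F @ x
    P = F @ P @ F.T + Q

    # --- Update with a nonlinear range measurement h(x) = ||p - anchor|| ---
    dx, dy = x[0] - anchor[0], x[1] - anchor[1]
    r = np.hypot(dx, dy)                      # predicted range
    H = np.array([[dx / r, dy / r, 0.0, 0.0]])  # Jacobian of h at x
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x = x + (K @ np.array([z - r])).ravel()   # correct state
    P = (np.eye(4) - K @ H) @ P               # correct covariance
    return x, P
```

In a cooperative setting, each robot would run such a local filter, with inter-robot range or vision-based relative measurements (e.g. YOLO detections of neighboring robots) entering as additional update steps of the joint filter.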

Funder

National Natural Science Foundation of China

Open foundation of Key Laboratory of Maritime Intelligent Cyberspace Technology, Ministry of Education

Publisher

MDPI AG

Subject

General Earth and Planetary Sciences

