Implicit Neural Mapping for a Data Closed-Loop Unmanned Aerial Vehicle Pose-Estimation Algorithm in a Vision-Only Landing System
Authors:
Liu Xiaoxiong 1, Li Changze 1, Xu Xinlong 1, Yang Nan 1, Qin Bin 1
Affiliation:
1. School of Automation, Northwestern Polytechnical University, Xi’an 710129, China
Abstract
Owing to the low cost, interference resistance, and concealment of vision sensors, vision-based landing systems have received considerable research attention. However, because of their limited accuracy, vision sensors are typically used only as auxiliary components in visual landing systems. To address the inaccurate position estimation of vision-only sensors during landing, a novel data closed-loop pose-estimation algorithm with an implicit neural map is proposed. First, we propose a method to estimate the UAV pose from the runway's line features, using a flexible coarse-to-fine runway-line-detection method. Then, we propose a mapping and localization method based on the neural radiance field (NeRF), which provides a continuous scene representation and effectively corrects the initial pose estimate. Finally, we develop a closed-loop data-annotation system based on a high-fidelity implicit map, which significantly improves annotation efficiency. Experimental results show that the proposed algorithm performs well across a variety of scenarios and achieves state-of-the-art pose-estimation accuracy.
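The NeRF-based localization step described in the abstract refines a coarse pose by minimizing the photometric error between the observed image and a rendering from the implicit map (in the spirit of iNeRF). The toy sketch below illustrates that idea under heavy simplification, and is not the paper's implementation: a smooth analytic intensity field stands in for a trained NeRF, the "pose" is a 2D translation only, and gradients are taken by finite differences rather than autodiff. All function names here are illustrative.

```python
import numpy as np

def implicit_map(pts):
    # Toy stand-in for a trained NeRF: a smooth analytic intensity field.
    return np.sin(pts[:, 0]) * np.cos(pts[:, 1])

def render(pose, grid):
    # "Render" by sampling the implicit map at grid points shifted by the pose.
    return implicit_map(grid + pose)

def refine_pose(pose, observed, grid, lr=0.5, iters=200, eps=1e-4):
    # iNeRF-style refinement: gradient descent on the photometric (MSE) error,
    # using forward finite differences instead of autodiff for simplicity.
    pose = pose.astype(float).copy()
    for _ in range(iters):
        base = np.mean((render(pose, grid) - observed) ** 2)
        grad = np.zeros(2)
        for d in range(2):
            step = np.zeros(2)
            step[d] = eps
            grad[d] = (np.mean((render(pose + step, grid) - observed) ** 2) - base) / eps
        pose -= lr * grad
    return pose

# Sample a small image patch at the (unknown) true pose.
grid = np.stack(np.meshgrid(np.linspace(0, 1, 20),
                            np.linspace(0, 1, 20)), -1).reshape(-1, 2)
true_pose = np.array([0.30, -0.20])
observed = render(true_pose, grid)

# A coarse initial estimate (e.g. from runway line features) is refined
# against the implicit map.
coarse = np.array([0.45, -0.05])
refined = refine_pose(coarse, observed, grid)
```

In the full method the render step would query the trained NeRF along camera rays and the optimizer would update a 6-DoF pose; the structure of the loop, however, is the same: render, compare photometrically, and step the pose downhill.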
Funder
National Natural Science Foundation of China; Aeronautical Science Foundation of China
Subject
Artificial Intelligence, Computer Science Applications, Aerospace Engineering, Information Systems, Control and Systems Engineering
Cited by: 2 articles.