High-Altitude Precision Landing by Smartphone Video Guidance Sensor and Sensor Fusion
Authors:
Joao Leonardo Silva Cotta 1, Hector Gutierrez 2, Ivan R. Bertaska 3, John P. Inness 3, John Rakoczy 3
Affiliation:
1. Department of Aerospace Engineering, Physics and Space Sciences, Florida Institute of Technology, Melbourne, FL 32901, USA
2. Department of Mechanical and Civil Engineering, Florida Institute of Technology, Melbourne, FL 32901, USA
3. Control Systems Design and Analysis Branch, NASA Marshall Space Flight Center, Huntsville, AL 35812, USA
Abstract
This paper describes the deployment, integration, and demonstration of the Smartphone Video Guidance Sensor (SVGS) as a novel technology for autonomous 6-DOF proximity maneuvers and high-altitude precision landing of UAVs via sensor fusion. The proposed approach uses a vision-based photogrammetric position and attitude sensor (SVGS) to support the precise automated landing of a UAV from an initial altitude above 100 m to the ground, guided by an array of landing beacons. SVGS information is fused with other on-board sensors at the flight control unit to estimate the UAV’s position and attitude during landing relative to a ground coordinate system defined by the landing beacons. While SVGS can provide mm-level absolute positioning accuracy depending on range and beacon dimensions, its proper operation requires a line of sight between the camera and the beacon, and readings can be disturbed by environmental lighting conditions and reflections. SVGS readings can therefore be intermittent, and their update rate is not deterministic, since SVGS runs on an Android device. Fusing SVGS with on-board sensors enables an accurate and reliable update of the position and attitude estimates during landing, providing improved performance compared with state-of-the-art automated landing technology based on an infrared beacon, but its implementation must address the challenges mentioned above. The proposed technique also shows significant advantages compared with state-of-the-art sensors for high-altitude landing, such as those based on LiDAR.
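The fusion of an intermittent, non-deterministic-rate position sensor with high-rate on-board prediction can be sketched with a standard Kalman filter in which the update step runs only when a measurement arrives. The following is a minimal 1D illustration under assumed values (process/measurement noise, update probability, descent profile are all hypothetical), not the flight implementation described in the paper:

```python
import numpy as np

# Illustrative 1D constant-velocity Kalman filter: the predict step runs at a
# fixed 50 Hz, while SVGS-like position measurements arrive intermittently
# (emulating dropouts from lighting disturbances or loss of line of sight).
# All numerical values are hypothetical, chosen only for demonstration.

def predict(x, P, F, Q):
    """Propagate state and covariance one time step."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z, H, R):
    """Incorporate a position measurement when one is available."""
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

dt = 0.02                              # 50 Hz prediction rate
F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity motion model
Q = 1e-4 * np.eye(2)                   # process noise (assumed)
H = np.array([[1.0, 0.0]])             # sensor measures position only
R = np.array([[1e-3]])                 # measurement noise (assumed)

x = np.array([0.0, 0.0])               # state: [altitude (m), velocity (m/s)]
P = np.eye(2)                          # initial covariance

rng = np.random.default_rng(0)
true_pos, true_vel = 100.0, -2.0       # descent from 100 m at 2 m/s
for k in range(500):                   # 10 s of simulated descent
    t = k * dt
    x, P = predict(x, P, F, Q)
    # Measurements arrive on only ~40% of cycles, at unpredictable times.
    if rng.random() < 0.4:
        z = np.array([true_pos + true_vel * t + rng.normal(0.0, 0.03)])
        x, P = update(x, P, z, H, R)
```

Despite the irregular measurement stream, the estimate tracks the descending altitude closely and recovers the (unmeasured) vertical velocity, which is the core benefit of fusing intermittent SVGS readings with a continuous on-board prediction.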
Funder
NASA’s Marshall Space Flight Center Dual-Use Technology Development