Tightly Coupled Visual–Inertial Fusion for Attitude Estimation of Spacecraft
Published: 2024-08-20
Issue: 16
Volume: 16
Page: 3063
ISSN: 2072-4292
Container-title: Remote Sensing
Language: en
Short-container-title: Remote Sensing
Author:
Yi Jinhui 1,2,3, Ma Yuebo 1,2, Long Hongfeng 1,2, Zhu Zijian 1,2, Zhao Rujin 1,2,3
Affiliation:
1. Institute of Optics and Electronics of Chinese Academy of Sciences, Chengdu 610209, China
2. Key Laboratory of Science and Technology on Space Optoelectronic Precision Measurement, Chinese Academy of Sciences, Chengdu 610209, China
3. School of Electronic, Electrical and Communication Engineering, University of Chinese Academy of Sciences, Beijing 100049, China
Abstract
The star sensor provides the highest accuracy in spacecraft attitude measurement. However, it is vulnerable to disturbances such as high-dynamic motion, stray light, and other in-orbit environmental factors, which can significantly degrade attitude accuracy or even cause abnormal output, potentially leaving the spacecraft disoriented. It is therefore usually coupled with a high-frequency gyroscope to compensate for this limitation. Nevertheless, long-term attitude estimation with a gyroscope alone drifts because of sensor bias. We propose an optimization-based tightly coupled scheme that improves attitude estimation accuracy under dynamic conditions and strengthens the star sensor's robustness in lost-in-space situations. The approach begins with visual–inertial measurement preprocessing and estimator initialization; attitude and gyroscope-bias estimates are then refined by jointly minimizing visual and inertial constraints. Additionally, a keyframe-based sliding window is employed to mitigate potential failures of the visual measurements. Numerical tests show that, under identical dynamic conditions, the proposed method improves the accuracy of the yaw, pitch, and roll angles by 50% compared with the star sensor alone.
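The sketch below illustrates, in a minimal and hypothetical form, the kind of tightly coupled fusion the abstract describes: keyframe attitudes and a constant gyroscope bias are estimated jointly by minimizing star-sensor (visual) and gyroscope (inertial) residuals over a fixed window. It is not the authors' implementation; the simulation setup, noise levels, window size, and use of SciPy's Rotation and least_squares utilities are all assumptions made for illustration.

"""Minimal sketch of optimization-based star-sensor/gyroscope fusion.

Hypothetical illustration only: keyframe attitudes and a gyro bias are
estimated jointly by minimizing visual (star-sensor) and inertial (gyro)
residuals over a fixed window. All names and parameters are assumptions.
"""
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation as R

rng = np.random.default_rng(0)

# --- assumed simulation setup ------------------------------------------------
N = 10                                        # keyframes in the window
dt = 0.1                                      # seconds between keyframes
omega_true = np.array([0.02, -0.01, 0.03])    # rad/s, constant body rate
bias_true = np.array([1e-3, -2e-3, 5e-4])     # rad/s, gyroscope bias
sigma_star = 1e-4                             # rad, star-sensor noise
sigma_gyro = 1e-4                             # rad/s, gyro rate noise

# True attitudes and simulated measurements
R_true = [R.identity()]
for _ in range(N - 1):
    R_true.append(R_true[-1] * R.from_rotvec(omega_true * dt))
star_meas = [Rk * R.from_rotvec(rng.normal(0, sigma_star, 3)) for Rk in R_true]
gyro_meas = [omega_true + bias_true + rng.normal(0, sigma_gyro, 3)
             for _ in range(N - 1)]

def residuals(x):
    """Stacked visual and inertial residuals for the whole window."""
    rotvecs = x[:3 * N].reshape(N, 3)
    bias = x[3 * N:]
    atts = [R.from_rotvec(v) for v in rotvecs]
    res = []
    # Visual constraints: estimated attitude vs. star-sensor measurement.
    for Rk, Zk in zip(atts, star_meas):
        res.append((Zk.inv() * Rk).as_rotvec() / sigma_star)
    # Inertial constraints: relative rotation vs. bias-corrected gyro integration.
    for k in range(N - 1):
        R_rel_pred = R.from_rotvec((gyro_meas[k] - bias) * dt)
        R_rel_est = atts[k].inv() * atts[k + 1]
        res.append((R_rel_pred.inv() * R_rel_est).as_rotvec() / (sigma_gyro * dt))
    return np.concatenate(res)

# Initialize from the noisy star measurements and zero bias, then optimize.
x0 = np.concatenate([np.concatenate([Z.as_rotvec() for Z in star_meas]),
                     np.zeros(3)])
sol = least_squares(residuals, x0)
print("estimated gyro bias [rad/s]:", sol.x[3 * N:])
print("true gyro bias      [rad/s]:", bias_true)

In a full sliding-window estimator, the window would advance as new keyframes arrive, with older states dropped or marginalized; that is the mechanism that keeps the optimization bounded while still bridging intervals in which the star-sensor measurements are degraded or unavailable.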
Funder
Outstanding Youth Science and Technology Talents Program of Sichuan; West Light of Chinese Academy of Sciences; Sichuan Province Science and Technology Support Program