Affiliation:
1. Université Bourgogne, 21000 Dijon, France
2. ICB UMR CNRS 6303, Université Bourgogne, 21000 Dijon, France
Abstract
This paper presents a visual compass method based on global features, specifically spherical moments. A primary challenge for photometric methods that employ global features is the variation in the image caused by the appearance and disappearance of regions within the camera’s field of view as it moves. A further challenge is modeling the effect of translational motion on the values of global features, since this effect depends on scene depths, particularly for non-planar scenes. To address these issues, this paper combines image masks, which mitigate abrupt changes in global feature values, with neural networks, which handle the modeling of translational motion. By applying masks at various locations within the image, multiple rotation estimates, one per selected region, can be obtained. Our contribution is a fast method for applying numerous masks to the image with real-time inference speed, making it suitable for embedded robot applications. Extensive experiments have been conducted on both real-world and synthetic datasets generated with Blender. The results validate the accuracy, robustness, and real-time performance of the proposed method in comparison with a state-of-the-art method.
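To make the masking idea concrete, the following is a minimal sketch, not the authors' implementation, of computing masked spherical moments on an equirectangular panorama: each mask selects a region of the sphere and yields its own moment vector, which a downstream estimator could map to a rotation. All function and variable names (unit_sphere_grid, spherical_moment, masked_features) are hypothetical, and the smooth cosine masks are only an illustrative choice.

```python
# Sketch: masked spherical moments on an equirectangular image.
# Assumptions: rows index the polar angle theta in [0, pi],
# columns index the azimuth phi in [0, 2*pi).
import numpy as np

def unit_sphere_grid(h, w):
    """Unit-sphere coordinates and solid-angle weights for an h x w panorama."""
    theta = (np.arange(h) + 0.5) * np.pi / h          # polar angle
    phi = (np.arange(w) + 0.5) * 2.0 * np.pi / w      # azimuth
    T, P = np.meshgrid(theta, phi, indexing="ij")
    x = np.sin(T) * np.cos(P)
    y = np.sin(T) * np.sin(P)
    z = np.cos(T)
    dS = np.sin(T) * (np.pi / h) * (2.0 * np.pi / w)  # solid-angle element
    return np.stack([x, y, z]), dS

def spherical_moment(img, mask, coords, dS, p, q, r):
    """m_pqr = sum of I * x^p * y^q * z^r * dS over the masked region."""
    x, y, z = coords
    return np.sum(img * mask * (x**p) * (y**q) * (z**r) * dS)

def masked_features(img, masks, coords, dS):
    """First-order moment vector (m100, m010, m001) for each mask."""
    return np.array([
        [spherical_moment(img, m, coords, dS, 1, 0, 0),
         spherical_moment(img, m, coords, dS, 0, 1, 0),
         spherical_moment(img, m, coords, dS, 0, 0, 1)]
        for m in masks
    ])

if __name__ == "__main__":
    h, w = 128, 256
    img = np.random.rand(h, w).astype(np.float32)     # stand-in panorama
    coords, dS = unit_sphere_grid(h, w)
    # Several smooth masks centred at different azimuths; each one produces
    # its own feature vector, hence its own rotation estimate downstream.
    phi = (np.arange(w) + 0.5) * 2.0 * np.pi / w
    masks = [0.5 * (1.0 + np.cos(phi - c))[None, :] * np.ones((h, 1))
             for c in np.linspace(0.0, 2.0 * np.pi, 4, endpoint=False)]
    feats = masked_features(img, masks, coords, dS)
    print(feats.shape)  # (4, 3): one moment vector per mask
```

In the paper's setting, the per-mask features would feed the learned model that compensates for translation, and the per-region rotation estimates would then be combined into a single compass output.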