Abstract
An approach is proposed for recovering affine correspondences (ACs) from orientation- and scale-covariant features, e.g., SIFT, by exploiting pre-estimated epipolar geometry. The method calculates the affine parameters consistent with the epipolar geometry from the point coordinates and the scales and rotations obtained by the feature detector. The proposed closed-form solver returns a single solution and is extremely fast, running in 0.5 μs on average. Possible applications include estimating a homography from a single upgraded correspondence and estimating the surface normal for each correspondence found in a pre-calibrated image pair (e.g., a stereo rig). As the second contribution, we propose a minimal solver that estimates the relative pose of a vehicle-mounted camera from a single SIFT correspondence together with the corresponding surface normal obtained from, e.g., upgraded ACs. The proposed algorithms are tested both on synthetic data and on a number of publicly available real-world datasets. Using the upgraded features and the proposed solvers leads to a significant speed-up in homography, multi-homography, and relative pose estimation, with accuracy better than or comparable to state-of-the-art methods.
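The abstract's "homography from a single upgraded correspondence" rests on the standard relation between an AC and a homography: the 2x2 affine matrix A of an AC equals the Jacobian of the homography's projection at the point. The sketch below only verifies that relation numerically; it is not the paper's solver (which additionally exploits the SIFT scale/rotation and the pre-estimated epipolar geometry), and the matrix H, point x1, and tolerances are synthetic values chosen for illustration.

```python
# Minimal sketch, assuming a synthetic homography H and point x1:
# checks that the 2x2 affine matrix of an AC equals the Jacobian of
# the homography's projection at the point.
import numpy as np

def project(H, x):
    """Apply homography H to a 2D point x."""
    p = H @ np.array([x[0], x[1], 1.0])
    return p[:2] / p[2]

def homography_jacobian(H, x):
    """First-order (affine) part of homography H at 2D point x,
    obtained by differentiating the projective division."""
    x_h = np.array([x[0], x[1], 1.0])
    s = H[2] @ x_h                      # projective scale at x
    u, v = (H[0] @ x_h) / s, (H[1] @ x_h) / s
    return np.array([
        [H[0, 0] - H[2, 0] * u, H[0, 1] - H[2, 1] * u],
        [H[1, 0] - H[2, 0] * v, H[1, 1] - H[2, 1] * v],
    ]) / s

# Synthetic example values (hypothetical, for illustration only).
H = np.array([[1.10, 0.02,  5.0],
              [0.03, 0.95, -2.0],
              [1e-4, 2e-4,  1.0]])
x1 = np.array([100.0, 80.0])

A = homography_jacobian(H, x1)          # affine matrix of the AC at x1
x2 = project(H, x1)

# A maps small displacements around x1 to displacements around x2.
d = np.array([0.1, -0.05])
print(np.allclose(project(H, x1 + d) - x2, A @ d, atol=1e-4))  # True
```

In this view, an "upgraded" SIFT correspondence supplies A, and the four entries of A together with the two point coordinates constrain the homography, which is why a single such correspondence suffices once the epipolar geometry is known.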
Funder
Swiss Federal Institute of Technology Zurich
Publisher
Springer Science and Business Media LLC
Subject
Artificial Intelligence, Computer Vision and Pattern Recognition, Software
Cited by
3 articles.
1. Progressive Keypoint Localization and Refinement in Image Matching;Lecture Notes in Computer Science;2024
2. Adaptive Feature Calibration Siamese Network for Rotating Object Tracking;2023 International Conference on Image Processing, Computer Vision and Machine Learning (ICICML);2023-11-03
3. Geometric Mapping and Pose Estimation Techniques for Improving Safety and Efficiency in Offshore Crane Operations;2023 Latin American Robotics Symposium (LARS), 2023 Brazilian Symposium on Robotics (SBR), and 2023 Workshop on Robotics in Education (WRE);2023-10-09