Abstract
Event cameras produce asynchronous, discrete outputs because each pixel responds independently to changes in brightness. This asynchronous and discrete nature of event data facilitates the tracking of prolonged feature trajectories, but it also requires feature tracking techniques to be adapted to process this type of data efficiently. To address this challenge, we propose a hybrid data-driven feature tracking method that uses data from both event cameras and frame-based cameras to track features asynchronously. The method consists of patch initialization, patch optimization, and patch association modules. In the patch initialization module, FAST corners are detected in frame images, providing points that respond to local brightness changes. The patch association module introduces a nearest-neighbor (NN) algorithm to filter new feature points effectively. The patch optimization module assesses optimization quality to monitor tracking quality. We evaluate the tracking accuracy and robustness of our method on public and self-collected datasets, focusing on average tracking error and feature age. Compared with the event-based Kanade–Lucas–Tomasi tracker, our method reduces the average tracking error by 1.3–29.2% and increases feature age by 9.6–32.1%, while improving computational efficiency by 1.2–7.6%. Thus, the proposed feature tracking method exploits the complementary characteristics of event cameras and traditional cameras to deliver a robust and efficient tracking system.
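The following is a minimal illustrative sketch, not the authors' implementation, of the two frame-side steps named in the abstract: FAST corner detection for patch initialization and a nearest-neighbor filter that rejects candidate features too close to existing tracks. The function name `init_patch_centers` and the parameters `min_dist_px` and `fast_thresh` are assumptions introduced only for this example.

```python
# Sketch: FAST-corner patch initialization with nearest-neighbor filtering.
# Assumed helper, not part of the paper's released code.
import cv2
import numpy as np
from scipy.spatial import cKDTree


def init_patch_centers(gray_frame, tracked_pts, min_dist_px=10, fast_thresh=20):
    """Detect FAST corners and keep only those far from already-tracked points."""
    fast = cv2.FastFeatureDetector_create(threshold=fast_thresh)
    keypoints = fast.detect(gray_frame, None)
    candidates = np.array([kp.pt for kp in keypoints], dtype=np.float32).reshape(-1, 2)
    if candidates.shape[0] == 0 or len(tracked_pts) == 0:
        return candidates
    # Nearest-neighbor filter: drop candidates within min_dist_px of an existing track.
    dists, _ = cKDTree(np.asarray(tracked_pts, dtype=np.float32)).query(candidates, k=1)
    return candidates[dists > min_dist_px]
```

In this sketch, new patches would be initialized only at corners that do not duplicate currently tracked features, which mirrors the abstract's description of the NN-based patch association step at a high level.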
Funder
Key Technologies Research and Development Program
Jiangsu Provincial Key Research and Development Program
National Natural Science Foundation of China
Publisher
Springer Science and Business Media LLC