Affiliation:
1. School of Electronic Information, Wuhan University, Wuhan 430072, China
Abstract
An event camera generates an event stream based on changes in brightness, retaining only the characteristics of moving objects, and thus avoids the high power consumption of high-frame-rate cameras in high-speed eye-tracking tasks. However, the asynchronous, incremental nature of event-camera output has not been fully exploited, and suitable event-based datasets are lacking. By combining the temporal-encoding and state-preserving properties of a spiking neural network (SNN) with an event camera, a near-range eye-tracking algorithm is proposed, together with a novel event-based dataset for validation and evaluation. Experimental results show that the proposed solution outperforms artificial neural network (ANN) algorithms, while its computational time is only 12.5% of that of traditional SNN algorithms. Furthermore, the proposed algorithm allows the time resolution to be adjusted adaptively, reaching a maximum of 0.081 ms, which enhances tracking stability while maintaining accuracy.
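To make the event-driven SNN pairing concrete, the following is a minimal sketch, not the paper's method: a single leaky integrate-and-fire (LIF) neuron driven directly by event-camera events of the form (timestamp, x, y, polarity). The time constant, threshold, and polarity weights are assumptions chosen only for illustration.

```python
# Illustrative sketch only: one LIF neuron driven by event-camera events.
# Parameter values (tau, threshold, polarity weights) are assumptions,
# not the settings used in the paper.

import math

def lif_response(events, tau=0.005, threshold=1.0):
    """Leak the membrane potential between events, accumulate event
    polarities, and emit a spike time whenever the threshold is crossed."""
    v = 0.0           # membrane potential
    last_t = None     # timestamp of the previous event (seconds)
    spikes = []       # output spike times

    for t, x, y, polarity in events:
        if last_t is not None:
            v *= math.exp(-(t - last_t) / tau)  # exponential leak between events
        v += 1.0 if polarity else -0.5          # assumed ON/OFF event weights
        if v >= threshold:
            spikes.append(t)
            v = 0.0                             # reset after spiking
        last_t = t
    return spikes

# Example: a short burst of ON events at ~0.1 ms spacing triggers a spike,
# so the temporal resolution follows the event timestamps themselves.
stream = [(i * 1e-4, 10, 20, 1) for i in range(5)]
print(lif_response(stream))
```

Because the state update is driven by event timestamps rather than fixed frames, this kind of neuron naturally preserves state between sparse inputs, which is the property the abstract attributes to the SNN.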
Funder
National Natural Science Foundation of China