Affiliation:
1. Rochester Institute of Technology, Rochester, New York, USA
Abstract
Algorithms for the estimation of gaze direction from mobile and video-based eye trackers typically involve tracking a feature of the eye that moves through the eye camera image in a way that covaries with the shifting gaze direction, such as the center or boundaries of the pupil. Tracking these features using traditional computer vision techniques can be difficult due to partial occlusion and environmental reflections. Although recent efforts to use machine learning (ML) for pupil tracking have demonstrated superior results when evaluated using standard measures of segmentation performance, little is known of how these networks may affect the quality of the final gaze estimate. This work provides an objective assessment of the impact of several contemporary ML-based methods for eye feature tracking when the subsequent gaze estimate is produced using either feature-based or model-based methods. Metrics include the accuracy and precision of the gaze estimate, as well as drop-out rate.
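The abstract evaluates gaze estimates by accuracy, precision, and drop-out rate. As a rough illustration of how such metrics are conventionally defined (mean angular offset, RMS of sample-to-sample angular differences, and fraction of samples with no estimate), here is a minimal sketch; the function names and array layout are assumptions for illustration, not the paper's actual evaluation code:

```python
import numpy as np

def angular_error_deg(v1, v2):
    """Angle in degrees between corresponding 3D unit vectors."""
    dot = np.clip(np.sum(v1 * v2, axis=-1), -1.0, 1.0)
    return np.degrees(np.arccos(dot))

def gaze_quality_metrics(gaze, target, valid):
    """Hypothetical helper: gaze and target are (N, 3) unit vectors,
    valid is an (N,) boolean mask of samples where the feature tracker
    produced an estimate.

    Returns (accuracy, precision, drop_out):
      accuracy  - mean angular offset from the target, in degrees
      precision - RMS of successive sample-to-sample angular
                  differences, in degrees
      drop_out  - fraction of samples with no gaze estimate
    """
    g, t = gaze[valid], target[valid]
    accuracy = angular_error_deg(g, t).mean()
    inter_sample = angular_error_deg(g[:-1], g[1:])
    precision = np.sqrt(np.mean(inter_sample ** 2))
    drop_out = 1.0 - valid.mean()
    return accuracy, precision, drop_out
```

A steady gaze offset from the target by 1° would yield an accuracy of 1°, near-zero precision error, and a drop-out rate equal to the share of untracked frames.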
Funder
National Science Foundation
National Eye Institute of the National Institutes of Health
Publisher
Association for Computing Machinery (ACM)
Cited by: 1 article.