Model-Based 3D Gaze Estimation Using a TOF Camera
Author:
Shen Kuanxin 1, Li Yingshun 2, Guo Zhannan 2, Gao Jintao 1, Wu Yingjian 1
Affiliation:
1. School of Chemical Process Automation, Shenyang University of Technology, Liaoyang 111003, China
2. School of Control Science and Engineering, Dalian University of Technology, Dalian 116024, China
Abstract
Among the numerous gaze-estimation methods currently available, appearance-based methods predominantly use RGB images as input and employ convolutional neural networks (CNNs) to regress gaze angles or gaze points from facial images, while model-based methods require high-resolution images to recover a clear geometric model of the eyeball. Both approaches face significant challenges in outdoor environments and practical application scenarios. This paper proposes a model-based gaze-estimation algorithm using a low-resolution 3D time-of-flight (TOF) camera. We use infrared images instead of RGB images as input to reduce the impact of varying illumination on gaze estimation. A trained YOLOv8 neural network model detects eye landmarks in the captured facial images, and, combined with the depth map from the TOF camera, we calculate the 3D coordinates of the canthus points of a single eye of the subject. On this basis, we fit a 3D geometric model of the eyeball to determine the subject's gaze angle. Experimental validation showed that our method achieved root mean square errors of 6.03° and 4.83° in the horizontal and vertical directions, respectively, for the subject's gaze angle. We also tested the proposed method in a real car driving environment, achieving stable detection of the driver's gaze toward various locations inside the car, such as the dashboard, driver mirror, and in-vehicle screen.
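The core geometric steps described in the abstract (back-projecting a detected 2D landmark to 3D using the TOF depth map, then deriving horizontal and vertical gaze angles from a 3D gaze direction) can be sketched as below. This is a minimal illustration under the standard pinhole camera model, not the authors' implementation; the intrinsic parameters (`fx`, `fy`, `cx`, `cy`) and the function names are assumptions for illustration.

```python
import numpy as np

def backproject(u, v, depth, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with TOF depth into 3D camera coordinates
    using the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

def gaze_angles_deg(eye_center, pupil_center):
    """Horizontal (yaw) and vertical (pitch) gaze angles, in degrees, of the
    ray from the fitted eyeball center to the 3D pupil center.
    Assumes camera coordinates: x right, y down, z forward."""
    g = pupil_center - eye_center
    g = g / np.linalg.norm(g)
    yaw = np.degrees(np.arctan2(g[0], g[2]))   # left/right deviation from optical axis
    pitch = np.degrees(np.arcsin(-g[1]))       # up/down; negated since image y points down
    return yaw, pitch
```

A landmark at the principal point back-projects straight onto the optical axis, and a gaze vector pointing along that axis yields 0° in both directions, which provides a quick sanity check of the conventions.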
Funder
Liaoning Provincial Department of Education Project