Eye Tracking Interaction on Unmodified Mobile VR Headsets Using the Selfie Camera

Author:

Panagiotis Drakopoulos¹, George-Alex Koulieris², Katerina Mania¹

Affiliation:

1. Technical University of Crete, Chania - Crete, Greece

2. Durham University, Durham, United Kingdom

Abstract

Input methods for interaction in smartphone-based virtual and mixed reality (VR/MR) currently rely on uncomfortable head tracking that controls a pointer on the screen. User fixations are a fast and natural input method for VR/MR interaction. Previously, eye tracking in mobile VR suffered from low accuracy, long processing times, and the need for hardware add-ons such as anti-reflective lens coatings and infrared emitters. We present a mobile VR eye tracking methodology that uses only the eye images captured by the front-facing (selfie) camera through the headset’s lens, without any hardware modifications. Our system first enhances the low-contrast, poorly lit eye images by applying a pipeline of customised low-level image enhancements that suppress obtrusive lens reflections. We then propose an iris region-of-interest detection algorithm that runs only once, increasing iris tracking speed by reducing the iris search space on mobile devices. We iteratively fit a customised geometric model to the iris to refine its coordinates. We display a thin bezel of light at the top edge of the screen for constant illumination. A confidence metric estimates the probability of successful iris detection. Calibration and a linear gaze mapping between the estimated iris centroid and physical pixels on the screen result in low-latency, real-time iris tracking. A formal study confirmed that our system’s accuracy is similar to that of eye trackers in commercial VR headsets in the central part of the headset’s field of view. In a VR game, task completion with gaze was as fast as with head-tracked interaction, without the need for consecutive head motions. In a VR panorama viewer, users could successfully switch between panoramas using gaze.
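The one-shot iris region-of-interest step described in the abstract can be illustrated with a coarse darkest-window search: the iris/pupil region is typically the darkest area of an enhanced eye image, so an integral-image sweep locates a candidate ROI in a single cheap pass. This is a minimal illustrative stand-in assuming a grayscale NumPy image, not the paper's actual detector:

```python
import numpy as np

def find_iris_roi(gray, win=40):
    """Return the (row, col) top-left corner of the darkest win x win window,
    as a cheap stand-in for a one-shot iris region-of-interest detector.
    A 2-D integral image makes every window sum an O(1) lookup."""
    # Pad with a leading row/column of zeros so the 4-corner identity works.
    integral = np.pad(gray.astype(np.int64).cumsum(0).cumsum(1),
                      ((1, 0), (1, 0)))
    h, w = gray.shape
    # Sum of every win x win window via the standard integral-image identity.
    sums = (integral[win:h + 1, win:w + 1]
            - integral[:h - win + 1, win:w + 1]
            - integral[win:h + 1, :w - win + 1]
            + integral[:h - win + 1, :w - win + 1])
    r, c = np.unravel_index(np.argmin(sums), sums.shape)
    return int(r), int(c)

# Synthetic check: a dark 40x40 patch on a bright background is found exactly.
img = np.full((200, 200), 200, dtype=np.uint8)
img[80:120, 60:100] = 10
print(find_iris_roi(img, win=40))  # -> (80, 60)
```

Once the ROI is known, later frames only need to search that sub-window, which is the speed-up the abstract attributes to running the detection once.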
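The calibration and linear gaze mapping step can be sketched as a least-squares affine fit between iris centroids recorded while the user fixates known on-screen targets and those targets' pixel coordinates. The target layout and the NumPy-based fit below are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def fit_linear_gaze_map(iris_xy, screen_xy):
    """Fit an affine map screen = A @ [ix, iy, 1] from calibration samples.

    iris_xy:   (N, 2) iris-centroid coordinates captured while the user
               fixates known calibration targets.
    screen_xy: (N, 2) corresponding target positions in screen pixels.
    Returns a (2, 3) affine matrix (illustrative only, not the paper's code).
    """
    n = len(iris_xy)
    design = np.hstack([np.asarray(iris_xy, float), np.ones((n, 1))])  # (N, 3)
    coeffs, *_ = np.linalg.lstsq(design, np.asarray(screen_xy, float),
                                 rcond=None)
    return coeffs.T  # (2, 3)

def map_gaze(affine, iris_point):
    """Map one estimated iris centroid to a screen-pixel gaze estimate."""
    ix, iy = iris_point
    return affine @ np.array([ix, iy, 1.0])

# Synthetic check: recover a known affine map from five calibration points.
true_A = np.array([[12.0, 0.5, 100.0],
                   [0.3, 11.0, 60.0]])
iris = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [0.5, 0.5]], dtype=float)
screen = iris @ true_A[:, :2].T + true_A[:, 2]
A = fit_linear_gaze_map(iris, screen)
print(np.allclose(A, true_A))  # -> True (exact recovery on noiseless data)
```

With real, noisy centroids the fit is overdetermined and the least-squares solution averages out per-sample error, which is why even a short calibration with a handful of targets can yield a stable mapping.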

Publisher

Association for Computing Machinery (ACM)

Subject

Experimental and Cognitive Psychology, General Computer Science, Theoretical Computer Science

Cited by 17 articles.

1. Exploring loyalty drivers for smartphone and mobile carriers;Humanities and Social Sciences Communications;2024-06-29

2. A review on personal calibration issues for video-oculographic-based gaze tracking;Frontiers in Psychology;2024-03-20

3. Improving Inclusion of Virtual Reality Through Enhancing Interactions in Low-Fidelity VR;2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW);2024-03-16

4. Eye-tracking on virtual reality: a survey;Virtual Reality;2024-02-05

5. Best low-cost methods for real-time detection of the eye and gaze tracking;i-com;2024-01-08