Abstract
Research has shown that sensor data generated by a user during a VR experience is closely related to the user's behavior or state, meaning that the VR user can be quantitatively understood and modeled. Eye-tracking as a sensor signal has been studied in prior research, but its usefulness in a VR context has been less examined, and most extant studies have dealt with eye-tracking within a single environment. Our goal is to expand the understanding of the relationship between eye-tracking data and user modeling in VR. In this paper, we examined the role and influence of eye-tracking data in predicting the level of cybersickness and the type of locomotion. We developed and applied the same deep learning model architecture to multi-sensory data collected from two different studies (cybersickness and locomotion) with a total of 50 participants. The experimental results highlight not only the high applicability of our model to sensor data in a VR context, but also the significant relevance of eye-tracking data as a potential supplement for improving the model's performance, and the importance of eye-tracking data in learning processes overall. We conclude by discussing the relevance of these results to potential future studies on this topic.
Funder
Institute of Information & Communications Technology Planning & Evaluation
National Research Foundation
Publisher
Public Library of Science (PLoS)