Affiliation:
1. VISUS, University of Stuttgart, Germany
Abstract
Augmented Reality (AR) provides new ways for situated visualization and human‐computer interaction in physical environments. Current evaluation procedures for AR applications rely primarily on questionnaires and interviews, providing qualitative means to assess usability and task solution strategies. Eye tracking extends these existing evaluation methodologies by providing indicators for visual attention to virtual and real elements in the environment. However, the analysis of viewing behavior, especially the comparison of multiple participants, is difficult to achieve in AR. Specifically, the definition of areas of interest (AOIs), which is often a prerequisite for such analysis, is cumbersome and tedious with existing approaches. To address this issue, we present a new visualization approach to define AOIs, label fixations, and investigate the resulting annotated scanpaths. Our approach utilizes automatic annotation of gaze on virtual objects and an image‐based approach that also considers spatial context for the manual annotation of objects in the real world. Our results show that, with our approach, eye tracking data from AR scenes can be annotated and analyzed flexibly with respect to data aspects and annotation strategies.
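The automatic annotation of gaze on virtual objects mentioned in the abstract can be illustrated by casting the gaze ray against the AR scene's virtual geometry and labeling each sample with the first object hit. The sketch below is not the authors' implementation; the `VirtualObject` class, the bounding-sphere simplification, and all names are hypothetical, chosen only to show the idea in a self-contained way:

```python
import math
from dataclasses import dataclass

@dataclass
class VirtualObject:
    """Hypothetical stand-in for a virtual AR object, approximated
    by a bounding sphere in world coordinates."""
    name: str
    center: tuple   # (x, y, z)
    radius: float

def label_gaze(origin, direction, objects):
    """Return the name of the nearest virtual object hit by the gaze
    ray, or None if the ray misses all bounding spheres."""
    best_name, best_t = None, math.inf
    for obj in objects:
        # Ray-sphere intersection: solve |origin + t*direction - center|^2 = r^2
        oc = tuple(o - c for o, c in zip(origin, obj.center))
        a = sum(d * d for d in direction)
        b = 2 * sum(d * x for d, x in zip(direction, oc))
        c = sum(x * x for x in oc) - obj.radius ** 2
        disc = b * b - 4 * a * c
        if disc < 0:
            continue  # gaze ray misses this object
        t = (-b - math.sqrt(disc)) / (2 * a)  # nearer intersection
        if 0 <= t < best_t:
            best_name, best_t = obj.name, t
    return best_name
```

In a real AR engine this ray cast would typically be delegated to the scene graph (e.g., a physics ray cast against mesh colliders), and only gaze samples that miss all virtual geometry would be passed on to the image-based manual annotation of real-world objects.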
Funder
Deutsche Forschungsgemeinschaft
Subject
Computer Graphics and Computer-Aided Design
Cited by
4 articles.
1. NMF-Based Analysis of Mobile Eye-Tracking Data;Proceedings of the 2024 Symposium on Eye Tracking Research and Applications;2024-06-04
2. Investigating the Gap: Gaze and Movement Analysis in Immersive Environments;Proceedings of the 2024 Symposium on Eye Tracking Research and Applications;2024-06-04
3. Eyes on the Task: Gaze Analysis of Situated Visualization for Collaborative Tasks;2024 IEEE Conference Virtual Reality and 3D User Interfaces (VR);2024-03-16
4. Exploring Trajectory Data in Augmented Reality: A Comparative Study of Interaction Modalities;2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR);2023-10-16