Using a Head Pointer or Eye Gaze: The Effect of Gaze on Spatial AR Remote Collaboration for Physical Tasks

Authors:

Wang Peng (1,2), Bai Xiaoliang (1,2), Billinghurst Mark (1,3,2), Zhang Shusheng (1), He Weiping (1), Han Dechuan (1), Wang Yue (4), Min Haitao (1), Lan Weiqi (1), Han Shu (1)

Affiliation:

1. Cyber-Physical Interaction Laboratory, Northwestern Polytechnical University, Xi’an 710072, China

2. These authors contributed equally to this paper

3. Empathic Computing Laboratory, University of South Australia, Mawson Lakes 5001, Australia

4. School of Advanced Manufacturing Engineering, Chongqing University of Posts and Telecommunications, Chongqing 400065, China

Abstract

This paper investigates the effect of using augmented reality (AR) annotations and two different gaze visualizations, head pointer (HP) and eye gaze (EG), in an AR system for remote collaboration on physical tasks. First, we developed a spatial AR remote collaboration platform that supports sharing the remote expert's HP or EG cues. The prototype system was then evaluated in a user study comparing three conditions for sharing non-verbal cues: (1) a cursor pointer (CP), (2) HP and (3) EG, with respect to task performance, workload assessment and user experience. We found a clear difference in performance time across the three conditions, but no significant difference between the HP and EG conditions. For perceived collaboration quality, the HP and EG interfaces were rated significantly higher than the CP interface, whereas workload did not differ significantly across the three conditions. We used low-cost head tracking for the HP cue and found that it served as an effective referential pointer. This implies that, in some circumstances, HP could be a good proxy for EG in remote collaboration. Head pointing is more accessible and cheaper than dedicated eye-tracking hardware, and it paves the way for multi-modal interaction based on HP and gesture in AR remote collaboration.
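The HP cue described above comes from low-cost head tracking rather than an eye tracker. As an illustration only, not the paper's published implementation, the sketch below shows one common way such a head pointer can be anchored in a shared workspace: cast a ray along the tracked head's forward direction and intersect it with a planar work surface. The function name head_pointer_target, the NumPy representation of the head pose, and the planar-surface assumption are all illustrative choices, not details taken from the paper.

```python
import numpy as np


def head_pointer_target(head_position, head_rotation, plane_point, plane_normal):
    """Project the head's forward ray onto a planar work surface.

    Illustrative sketch: assumes the head pointer is the intersection of the
    head's forward ray with a plane representing the physical task surface.
    """
    # Forward direction of the head in world space (head-local +Z rotated into world frame).
    forward = head_rotation @ np.array([0.0, 0.0, 1.0])

    denom = np.dot(plane_normal, forward)
    if abs(denom) < 1e-6:
        return None  # Looking parallel to the surface; no pointer to draw.

    t = np.dot(plane_normal, plane_point - head_position) / denom
    if t < 0:
        return None  # Intersection lies behind the viewer.

    # World-space point where the HP cue would be rendered for the collaborator.
    return head_position + t * forward


# Example: head at the origin looking straight at a surface 2 m away along +Z.
if __name__ == "__main__":
    hit = head_pointer_target(
        head_position=np.zeros(3),
        head_rotation=np.eye(3),
        plane_point=np.array([0.0, 0.0, 2.0]),
        plane_normal=np.array([0.0, 0.0, -1.0]),
    )
    print(hit)  # -> [0. 0. 2.]
```

A cursor pointer (CP) or eye-gaze (EG) cue could be placed the same way, only with the ray origin and direction taken from a mouse pick or an eye tracker instead of the head pose.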

Funder

Civil Aircraft Special Project

Dongguan Science and Technology Equipment Project

National Key R&D Program of China

Natural Science Basic Research Plan in Shaanxi Province of China

111 Project

Publisher

Oxford University Press (OUP)

Subject

Human-Computer Interaction, Software


Cited by 25 articles.

1. A novel mixed reality remote collaboration system with adaptive generation of instructions. Computers & Industrial Engineering, 2024-08.

2. Characterizing head-gaze and hand affordances using AR for laparoscopy. Computers & Graphics, 2024-06.

3. Communication Cues for Remote Guidance. Computer-Supported Collaboration, 2024-05-24.

4. Industrial Applications, Current Challenges, and Future Directions. Computer-Supported Collaboration, 2024-05-24.

5. Delay Threshold for Social Interaction in Volumetric eXtended Reality Communication. ACM Transactions on Multimedia Computing, Communications, and Applications, 2024-04-25.
