Affiliation:
1. Center for Artificial Intelligence, Korea Institute of Science and Technology, 5 Hwarang-ro 14-gil, Seongbuk-gu, Seoul 02792, South Korea
2. School of Mechanical Engineering, Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 03722, South Korea
Abstract
Collaborating with people at physically remote locations saves time and money. Many remote collaboration systems have been studied and commercialized, but their capabilities have been confined to virtual objects and information. More recent studies have focused on collaboration in physical environments and with physical objects; however, these approaches have limitations, including shaky and unstable views (scenes), view dependence, low scalability, and poor content expression. In this paper, we propose a web-based extended reality (XR) collaboration system that alleviates these issues and enables effective, reproducible cooperation. The proposed system comprises three parts: interaction device webization, which expands the web browser’s device interfaces; unified XR representation, which describes content that is interoperable between virtual reality (VR) and augmented reality (AR); and unified coordinate creation, which enables the pose of physical objects to be expressed in world coordinates. With this system, a user in VR can intuitively instruct the manipulation of a physical object by manipulating a virtual object that represents it. Conversely, a user in AR can follow the instruction by observing the virtual object augmented onto the physical object. Moreover, because the pose of the physical object at the AR user’s worksite is reflected in the virtual object, the VR user can track the working progress and give feedback to the AR user. To improve remote collaboration, we surveyed XR collaboration studies and propose a new method for classifying XR collaborative applications based on virtual–real engagement and the ubiquitous computing continuum. We implemented a prototype and conducted a survey among submarine crews, most of whom were positively inclined to use our system, indicating that it would be helpful in improving their job performance. Furthermore, we suggest possible improvements to enhance each participant’s understanding of the other user’s context within the XR collaboration.
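To make the pose-sharing idea concrete, the following is a minimal sketch, not the authors’ implementation, of how a physical object’s pose, expressed in a unified world coordinate frame, might be relayed from the AR worksite to the VR user over a WebSocket in a web-based system. The message shape, type names, endpoint URL, and object identifiers are all illustrative assumptions.

```typescript
// Hypothetical message shape for sharing a tracked physical object's pose
// in a shared world coordinate frame (names are illustrative, not from the paper).
interface ObjectPoseMessage {
  objectId: string;                               // identifies the tracked physical object
  position: [number, number, number];             // metres, in the shared world frame
  orientation: [number, number, number, number];  // quaternion (x, y, z, w)
  timestamp: number;                              // ms since epoch, for ordering updates
}

// AR side: publish the tracked pose so the VR user's virtual twin can follow it.
function publishPose(socket: WebSocket, msg: ObjectPoseMessage): void {
  socket.send(JSON.stringify(msg));
}

// VR side: apply incoming poses to the corresponding virtual object.
function subscribePoses(
  socket: WebSocket,
  apply: (msg: ObjectPoseMessage) => void,
): void {
  socket.addEventListener("message", (event) => {
    const msg = JSON.parse(event.data as string) as ObjectPoseMessage;
    apply(msg);
  });
}

// Example wiring (the endpoint URL is a placeholder).
const socket = new WebSocket("wss://example.org/xr-collab");
subscribePoses(socket, (msg) => {
  // In a real VR scene graph, the handler would update the virtual object's transform here.
  console.log(`Pose update for ${msg.objectId}`, msg.position, msg.orientation);
});
```

In such a design, the same message stream can drive both directions of the collaboration: the AR client reports the physical object’s measured pose, while the VR client’s manipulations of the virtual twin can be sent back as target poses for the AR user to reproduce.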
Funder
Ministry of Trade, Industry and Energy
KIST
Publisher
Oxford University Press (OUP)
Subject
Computational Mathematics, Computer Graphics and Computer-Aided Design, Human-Computer Interaction, Engineering (miscellaneous), Modelling and Simulation, Computational Mechanics
References: 104 articles.
Cited by: 38 articles.