Viewpoint-Controllable Telepresence: A Robotic-Arm-Based Mixed-Reality Telecollaboration System
Authors:
Luo Le 1, Weng Dongdong 1, Hao Jie 1, Tu Ziqi 1, Jiang Haiyan 1
Affiliation:
1. Beijing Engineering Research Center of Mixed Reality and Advanced Display, Beijing Institute of Technology, Beijing 100081, China
Abstract
In mixed-reality (MR) telecollaboration, the local environment is presented to a remote user, who wears a virtual reality (VR) head-mounted display (HMD), via a video capture device. However, remote users often struggle to manipulate their viewpoints naturally and actively. In this paper, we propose a telepresence system with viewpoint control, in which a robotic arm equipped with a stereo camera is placed in the local environment. The system enables remote users to observe the local environment actively and flexibly: their head movements drive the robotic arm. In addition, to address the stereo camera's limited field of view and the robotic arm's limited range of movement, we propose a 3D reconstruction method combined with a stereo-video field-of-view enhancement technique, which guides remote users to stay within the arm's range of movement and gives them a wider perception of the local environment. Finally, we built an MR telecollaboration prototype and conducted two user studies to evaluate the overall system. User study A assessed interaction efficiency, system usability, workload, copresence, and user satisfaction from the remote user's perspective; the results show that our system improves interaction efficiency and delivers a better user experience than two traditional view-sharing techniques based on 360° video and on the local user's first-person view. User study B evaluated the prototype as a whole, from both the remote-user and the local-user sides, yielding directions and suggestions for the subsequent design and improvement of our MR telecollaboration system.
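The abstract describes mapping the remote user's head movements onto a stereo-camera-carrying robotic arm while keeping the user within the arm's reachable workspace. A minimal sketch of one way such a mapping could work is given below; the workspace extents, function names, and the simple clamp-plus-pass-through orientation mapping are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

# Hypothetical half-extents (metres) of the box the arm's end effector can
# reach, centred on the camera's home position; real limits depend on the arm.
WORKSPACE_HALF_EXTENTS = np.array([0.3, 0.3, 0.25])

def clamp_to_workspace(head_offset):
    """Clamp the HMD translation (relative to its calibration origin)
    to the box the end effector can actually reach."""
    return np.clip(head_offset, -WORKSPACE_HALF_EXTENTS, WORKSPACE_HALF_EXTENTS)

def head_to_camera_pose(head_offset, head_yaw_pitch_roll):
    """Map a 6-DoF head pose to a target pose for the arm-mounted stereo camera.

    Translation is clamped to the reachable workspace; orientation is passed
    through unchanged so the camera mirrors the user's gaze direction. The
    returned flag signals that the user has left the arm's range, which the
    system could use to trigger guidance cues.
    """
    head_offset = np.asarray(head_offset, dtype=float)
    target_position = clamp_to_workspace(head_offset)
    out_of_range = not np.allclose(target_position, head_offset)
    return target_position, np.asarray(head_yaw_pitch_roll, dtype=float), out_of_range

# Example: the user leans 0.5 m forward on y, beyond the 0.3 m limit.
pos, orient, warn = head_to_camera_pose([0.1, 0.5, -0.1], [15.0, -5.0, 0.0])
```

In this example the y-component is clamped to 0.3 m and the out-of-range flag is set, which is where a guidance technique such as the paper's 3D-reconstruction-based cueing would take over.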
Funder
National Natural Science Foundation of China; Beijing Municipal Science & Technology Commission; Administrative Commission of Zhongguancun Science Park
Subject
Electrical and Electronic Engineering, Biochemistry, Instrumentation, Atomic and Molecular Physics and Optics, Analytical Chemistry
References: 64 articles.
Cited by
3 articles.