Affiliation:
1. Department of Mechatronics Engineering, Graduate School of Science and Technology, Meijo University, 501-1 Shiogamaguchi, Nagoya 468-8502, Japan
Abstract
Recently, research has been conducted on mixed reality (MR), which provides immersive visualization and interaction experiences, and on mapping human motions directly onto a robot in an MR space to achieve a high level of immersion. However, even when the robot itself is mapped into the MR space, its surrounding environment is often not mapped sufficiently; this makes it difficult to comfortably perform tasks that require precise manipulation of objects that are hard to see from the human perspective. We therefore propose a system that allows users to operate a robot in real space by mapping the task environment around the robot into the MR space and performing the operations within that space.