A Modular Haptic Agent System with Encountered-Type Active Interaction
Published: 2023-04-30
Volume: 12
Issue: 9
Page: 2069
ISSN: 2079-9292
Container-title: Electronics
Short-container-title: Electronics
Language: en
Author:
Dongye Xiaonuo 1, Weng Dongdong 1, Jiang Haiyan 1, Feng Lulu 1
Affiliation:
1. Beijing Engineering Research Center of Mixed Reality and Advanced Display, School of Optics and Photonics, Beijing Institute of Technology, Beijing 100081, China
Abstract
Virtual agents are artificial intelligence systems that can interact with users in virtual reality (VR), providing companionship and entertainment. Virtual pets have become the most popular virtual agents because of their many benefits. However, haptic interaction with virtual pets poses two challenges: rapidly constructing varied haptic proxies, and designing agent-initiated active interaction. In this paper, we propose a modular haptic agent (MHA) prototype system that enables tactile simulation and encountered-type haptic interaction with common virtual pet agents through a modular design method and a haptic mapping method. In the MHA system, haptic interaction is actively initiated by the agents according to the user’s intention, which makes the virtual agents appear more autonomous and provides a better human–agent interaction experience. Finally, we conduct three user studies demonstrating that the MHA system offers advantages in realism, interactivity, attraction, and improving user emotion. Overall, MHA is a system that can build multiple companion agents, provides active interaction, and has the potential to quickly build diverse haptic agents for an intelligent and comfortable virtual world.
Funder
The Key-Area Research and Development Program of Guangdong Province; the National Natural Science Foundation of China
Subject
Electrical and Electronic Engineering, Computer Networks and Communications, Hardware and Architecture, Signal Processing, Control and Systems Engineering