Author:
Xu Jiayi, Yang Lei, Guo Meng
Abstract
Introduction
Virtual patient (VP) simulations have been widely used for healthcare training, education, and assessment. However, few VP systems have integrated emotion sensing and analyzed how a user's emotions may influence the overall training experience. This article presents a VP that can recognize and respond to 5 human emotions (anger, disgust, fear, joy, and sadness), as well as 2 facial expressions (smiling and eye contact).
Methods
The VP was developed by combining the capabilities of a facial recognition system, a tone analyzer, a cloud-based artificial intelligence chatbot, and interactive 3-dimensional avatars created in a high-fidelity game engine (Unity). The system was tested with healthcare professionals at Changzhou Traditional Chinese Medicine Hospital.
Results
A total of 65 participants (38 female and 27 male) aged 23 to 57 years (mean = 38.35, SD = 11.48) completed the survey, and 19 participants were interviewed. Most participants perceived the VP as useful for improving their communication skills, particularly their nonverbal communication skills. They also reported that incorporating users' affective states as an additional interaction channel increased their engagement with the VP and helped them build a connection with it.
Conclusions
The emotionally responsive VP appeared to be functionally complete and usable. However, some technical limitations need to be addressed before the system can be formally implemented in real-world clinical practice. Future development will focus on improving the accuracy of the speech recognition system, adopting more sophisticated emotion-sensing software, and developing a natural user interface.
Publisher
Ovid Technologies (Wolters Kluwer Health)
Subject
Modeling and Simulation, Education, Medicine (miscellaneous), Epidemiology
Cited by
2 articles.