Abstract
Digital agents with human-like characteristics have become ubiquitous in our society and are increasingly relevant in commercial applications. While some of them closely resemble humans in appearance (e.g., digital humans), they still lack many of the subtle social cues that are important for interacting with humans. Among these cues are so-called microexpressions: facial expressions that are short, subtle, and involuntary. We investigate to what extent microexpressions in digital humans influence people's perceptions and decision-making, in order to inform digital human design practices. Our two experiments applied four types of microexpressions, varying emotion type (happiness and anger) and intensity (normal and extreme). This paper is among the first to design and evaluate microexpressions with different intensity levels in digital humans. In particular, we leverage the possibilities of digitally (re)designing humans and human perception, which are feasible only in a digital environment, where microexpressions beyond real human beings' physical capabilities can be explored.
Funder
University of Liechtenstein
Publisher
Springer Science and Business Media LLC
Subject
Management of Technology and Innovation, Marketing, Computer Science Applications, Economics and Econometrics, Business and International Management
Cited by
7 articles.