Robots’ “Woohoo” and “Argh” Can Enhance Users’ Emotional and Social Perceptions: An Exploratory Study on Non-lexical Vocalizations and Non-linguistic Sounds

Authors:

Liu Xiaozhen1, Dong Jiayuan1, Jeon Myounghoon1

Affiliation:

1. Virginia Polytechnic Institute and State University, USA

Abstract

As robots become more pervasive in everyday life, their social aspects have attracted researchers' attention. Because emotions play a crucial role in social interactions, research has explored conveying emotions via speech. Our study investigated the synchronization of multimodal interaction in human-robot interaction (HRI). We conducted a within-subjects exploratory study with 40 participants to examine the effects of non-speech sounds (natural voice, synthesized voice, musical sound, and no sound) and basic emotions (anger, fear, happiness, sadness, and surprise) on user perception, paired with the emotional body gestures of an anthropomorphic robot (Pepper). While listening to a fairytale alongside each participant, the humanoid robot responded to the story with recorded emotional non-speech sounds and gestures. Participants recognized emotions significantly more accurately from the natural voice than from the other sounds. The confusion matrix showed that happiness and sadness had the highest recognition accuracy, in line with previous research. The natural voice also induced higher trust, naturalness, and preference than the other sounds. Interestingly, the musical sound mostly received lower perception ratings, even compared to no sound. We discuss the results together with design guidelines for emotional cues from social robots and directions for future research.

Publisher

Association for Computing Machinery (ACM)

Subject

Artificial Intelligence, Human-Computer Interaction

