Non-Facial and Non-Verbal Affective Expression for Appearance-Constrained Robots Used in Victim Management*

Authors:

Bethel, Cindy L.; Murphy, Robin R.

Abstract

Non-facial and non-verbal methods of affective expression are essential for social interaction in appearance-constrained robots such as those used in search and rescue, law enforcement, and military applications. This research identified five main methods of non-facial and non-verbal affective expression (body movements, postures, orientation, color, and sound). Based on an extensive review of the literature, prescriptive design recommendations were developed for the appropriate non-facial and non-verbal affective expression methods for three proximity zones of interest (intimate, personal, and social). These design recommendations serve as guidelines to retroactively add affective expression through software with minimal or no physical modification to a robot. A large-scale, complex human-robot interaction study was conducted to validate these design recommendations using 128 participants and four methods of evaluation. The study was conducted in a high-fidelity, confined-space simulated disaster site, with all robot interactions performed in the dark. Statistically significant results indicated that participants felt the robots that exhibited affective expressions were more calming, friendly, and attentive, which improved the social human-robot interactions.
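The abstract notes that the recommended expression methods can be retrofitted purely in software, tuned per proximity zone. The following is a minimal illustrative sketch of how such zone-dependent expression settings might be organized; the zone thresholds, parameter names, and values are hypothetical placeholders and are not taken from the paper.

```python
from dataclasses import dataclass

# Hypothetical sketch only: the paper's actual design recommendations are
# not reproduced here. It shows one way zone-dependent, non-facial affect
# settings could be layered onto an existing robot in software alone.

@dataclass
class AffectDisplay:
    body_speed: float   # relative speed of body movements (0.0-1.0)
    posture: str        # e.g. "lowered", "neutral", "raised"
    orientation: str    # how the robot orients toward the person
    led_color: str      # color cue rendered on existing lighting
    sound: str          # non-verbal audio cue

# Placeholder parameter values; zone boundaries below loosely follow
# common proxemics conventions (intimate < ~0.46 m, personal < ~1.2 m).
ZONE_DISPLAYS = {
    "intimate": AffectDisplay(0.2, "lowered", "face the person", "soft amber", "slow breathing tone"),
    "personal": AffectDisplay(0.4, "neutral", "face the person", "warm white", "gentle chirps"),
    "social":   AffectDisplay(0.7, "raised",  "approach head-on", "cool white", "status beeps"),
}

def select_display(distance_m: float) -> AffectDisplay:
    """Pick an affect display based on distance to the person (assumed thresholds)."""
    if distance_m < 0.46:
        return ZONE_DISPLAYS["intimate"]
    if distance_m < 1.2:
        return ZONE_DISPLAYS["personal"]
    return ZONE_DISPLAYS["social"]
```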

Publisher

Walter de Gruyter GmbH

Subject

Behavioral Neuroscience, Artificial Intelligence, Cognitive Neuroscience, Developmental Neuroscience, Human-Computer Interaction

References (43 articles).

1. Breazeal, C.L.: Designing Sociable Robots. Intelligent Robots and Autonomous Agents. MIT Press, Cambridge, Mass. (2002)

2. Cañamero, L.D., Fredslund, J.: How Does It Feel? Emotional Interaction with a Humanoid Lego Robot. In: K. Dautenhahn (ed.) AAAI 2000 Fall Symposium — Socially Intelligent Agents: The Human in the Loop, vol. Technical Report FS-00-04, pp. 23–28. AAAI, Menlo Park, CA (2000)

3. Fong, T., Nourbakhsh, I., Dautenhahn, K.: A Survey of Socially Interactive Robots. Robotics and Autonomous Systems 42(3–4), 143 (2003)

4. Kirby, R., Forlizzi, J., Simmons, R.: Affective Social Robots. Robotics and Autonomous Systems 58, 322–332 (2010)

5. Mizoguchi, H., Sato, T., Takagi, K., Nakao, M., Hatamura, Y.: Realization of Expressive Mobile Robot. In: 1997 IEEE International Conference on Robotics and Automation, vol. 1, pp. 581–586 (1997)

Cited by 2 articles.

1. At First Light: Expressive Lights in Support of Drone-Initiated Communication;Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems;2023-04-19

2. Using Emotions to Complement Multi-Modal Human-Robot Interaction in Urban Search and Rescue Scenarios;Proceedings of the 2020 International Conference on Multimodal Interaction;2020-10-21
