Abstract
In recent years, research on humanoid robots that can change users' opinions has been conducted extensively. In particular, two robots have been found to improve their persuasiveness by cooperating with each other in a sophisticated manner. Previous studies evaluated changes in opinion when robots displayed consensus building; however, users did not participate in those conversations, and the optimal strategy may depend on users' prior opinions. In this study, we therefore developed a system that adaptively changes the conversation between robots based on the user's opinion. We investigate the effect on opinion change when the robots' discussion converges to the same position as the user and when it converges to a different position. We conducted two experiments with human participants in which a user and virtual robotic agents conversed via button inputs in a crowdsourced setting. The results showed that users who were confident in their opinions became more confident when the robot agents' opinions converged to the same position, and less confident when the robot agents' opinions converged to a different position. These findings contribute significantly to persuasion research using multiple robots and to the development of advanced dialogue coordination between robots.
Publisher
Springer Science and Business Media LLC