Authors:
Upadhyaya Nitish, Galizzi Matteo M.
Abstract
People are increasingly interacting with forms of artificial intelligence (AI). It is crucial to understand whether accepted evidence for human-human reciprocity holds true for human-bot interactions. In a pre-registered online experiment (N = 539), we first replicate recent studies, finding that the identity of a player's counterpart in a one-shot binary Trust Game has a significant effect on the rate of reciprocity, with bot counterparts receiving lower returned amounts than human counterparts. We then explore whether individual differences in a player's personality traits (in particular Agreeableness, Extraversion, Honesty-Humility, and Openness) moderate the effect of the identity of the player's counterpart on the rate of reciprocity. In line with the literature on human-human interactions, participants exhibiting higher levels of Honesty-Humility, and to a lesser extent Agreeableness, are found to reciprocate more, regardless of the identity of their counterpart. No personality trait, however, moderates the effect of interacting with a bot. Finally, we consider whether general attitudes to AI affect reciprocity, but find no significant relationship.
Cited by: 2 articles.