Abstract
When encountering social robots, potential users often face a dilemma between privacy and utility. That is, high utility often comes at the cost of lenient privacy settings, allowing the robot to store personal data and to connect to the internet permanently, which brings associated data security risks. However, to date, it remains unclear how this dilemma affects attitudes and behavioral intentions towards the respective robot. To shed light on the influence of a social robot’s privacy settings on robot-related attitudes and behavioral intentions, we conducted two online experiments with a total sample of N = 320 German university students. In Experiment 1, we hypothesized that strict privacy settings, compared to lenient privacy settings of a social robot, would result in more favorable attitudes and behavioral intentions towards the robot. For Experiment 2, we expected more favorable attitudes and behavioral intentions when participants could independently choose the robot’s privacy settings rather than evaluate preset ones. However, these two manipulations seemed to influence attitudes towards the robot in diverging domains: while strict privacy settings increased trust, decreased subjective ambivalence, and increased the willingness to self-disclose compared to lenient privacy settings, the choice of privacy settings seemed to primarily impact robot likeability, contact intentions, and the depth of potential self-disclosure. Strict, compared to lenient, privacy settings might reduce the risk associated with robot contact and thereby also reduce risk-related attitudes and increase trust-dependent behavioral intentions. However, if allowed to choose, people make the robot ‘their own’ by making a privacy–utility tradeoff. This tradeoff is likely a compromise between full privacy and full utility and thus does not reduce the risks of robot contact as much as strict privacy settings do.
Future experiments should replicate these results using real-life human–robot interaction and different scenarios to further investigate the psychological mechanisms causing such divergences.
Funder
Bundesministerium für Bildung und Forschung
Universität Bielefeld
Publisher
Springer Science and Business Media LLC
Subject
General Computer Science, Human-Computer Interaction, Philosophy, Electrical and Electronic Engineering, Control and Systems Engineering, Social Psychology
References (70 articles)
1. Fosch-Villaronga E, Lutz C, Tamò-Larrieux A (2020) Gathering expert opinions for social robots’ ethical, legal, and societal concerns: findings from four international workshops. Int J Soc Robot 12:441–458. https://doi.org/10.1007/s12369-019-00605-z
2. Gupta SK (2015) Six recent trends in robotics and their implications. IEEE Spectrum. https://spectrum.ieee.org/six-recent-trends-in-robotics-and-their-implications. Accessed 24 June 2022
3. van den Berg B (2016) Mind the air gap. In: Gutwirth S, Leenes R, De Hert P (eds) Data protection on the move. Law, governance and technology series, vol 24. Springer, Dordrecht
4. Hassan T, Kopp S (2020) Towards an interaction-centered and dynamically constructed episodic memory for social robots. In: Companion of the 2020 ACM/IEEE international conference on human-robot interaction, pp 233–235. https://doi.org/10.1145/3371382.3378329
5. Horstmann B, Diekmann N, Buschmeier H, Hassan T (2020) Towards designing privacy-compliant social robots for use in private households: a use case based identification of privacy implications and potential technical measures for mitigation. In: Proceedings of the 29th IEEE international conference on robot and human interactive communication (RO-MAN), pp 869–876. https://doi.org/10.1109/RO-MAN47096.2020.9223556
Cited by
2 articles.