Affiliation:
1. Lancaster University, UK
Abstract
Extant literature has proposed an important role for trust in moderating people’s willingness to disclose personal information, but there is scant HCI literature that deeply explores the relationship between privacy and trust in apparent privacy paradox circumstances. Attending to this gap, this article reports a qualitative study examining how people account for continuing to use services that conflict with their stated privacy preferences, and how trust features in these accounts. Our findings undermine the notion that individuals engage in strategic thinking about privacy, raising important questions regarding the explanatory power of the well-known privacy calculus model and its proposed relationship between privacy and trust. Finding evidence of hopeful trust in participants’ accounts, we argue that trust allows people to morally account for their “paradoxical” information disclosure behavior. We propose that effecting greater alignment between people’s privacy attitudes and privacy behavior—or “un-paradoxing privacy”—will require greater regulatory assurances of privacy.
Funder
EPSRC
Ethics Approval
Lancaster University’s Faculty of Science and Technology Ethics Committee
Publisher
Association for Computing Machinery (ACM)
Subject
Human-Computer Interaction