Affiliation:
1. University of California, Santa Barbara, USA
Abstract
This chapter begins with a case study of Strava, a fitness app that inadvertently exposed sensitive military information even while protecting individual users' information privacy. The case study is analyzed as an example of how recent advances in algorithmic group-inference technologies threaten privacy, both for individuals and for groups. The chapter then argues that while individual privacy from big data analytics is well understood, group privacy is not. Results of an experiment designed to better understand group privacy are presented. Findings show that group and individual privacy are psychologically distinct and uniquely affect people's evaluations, use, and tolerance of a fictitious fitness app. The chapter concludes with a discussion of the ethics of group-inference technologies and offers recommendations for fitness app designers.
References (51 articles)
1. Altman, I. (1975). The environment and social behavior: Privacy, personal space, territory, and crowding. Brooks/Cole Publishing.
2. Barocas, S. (2014). Big data’s end run around anonymity and consent. In Privacy, big data and the public good: Frameworks for engagement.
3. BBC News. (2018, January 29). Fitness app Strava lights up staff at military bases. Retrieved from https://www.bbc.com/news/technology-42853072
4. Bélanger, F. (2011). Privacy in the digital age: A review of information privacy research in information systems. Management Information Systems Quarterly.
Cited by 1 article.