Affiliation:
1. Ethics of Artificial Intelligence, Institut für Kognitionswissenschaft, Universität Osnabrück, Osnabrück, Germany
Abstract
Big data and artificial intelligence pose a new challenge for data protection, as these techniques allow predictions to be made about third parties based on the anonymous data of many people. Predicted information can include purchasing power, gender, age, health, sexual orientation, and ethnicity. The basis for such applications of “predictive analytics” is the comparison of the behavioral data (e.g. usage, tracking, or activity data) of the individual in question with the potentially anonymously processed data of many others, using machine learning models or simpler statistical methods. The article starts by noting that predictive analytics has a significant potential for abuse, which manifests itself in social inequality, discrimination, and exclusion. This potential is not regulated by current data protection law in the EU; indeed, the use of anonymized mass data takes place in a largely unregulated space. Under the term “predictive privacy,” a data protection approach is presented that counters the risks of abuse of predictive analytics. A person's predictive privacy is violated when personal information about them is predicted without their knowledge and against their will on the basis of the data of many other people. Predictive privacy is then formulated as a protected good, and improvements to data protection with regard to the regulation of predictive analytics are proposed. Finally, the article points out that the goal of data protection in the context of predictive analytics is the regulation of “prediction power,” a new manifestation of the informational power asymmetry between platform companies and society.
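The inference pattern the abstract describes can be illustrated with a minimal sketch (not taken from the article): a model is trained on the behavioral data and disclosed attributes of many people, and is then used to predict the same attribute for a third party who never disclosed it. The dataset, feature counts, and attribute in the example below are entirely hypothetical.

# Minimal sketch of "predictive analytics" as described in the abstract.
# All data, feature names, and the predicted attribute are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Behavioral data of many users (e.g. usage or activity counts), together with
# an attribute those users disclosed (e.g. a binary health indicator).
n_users, n_features = 1000, 20
X_training_users = rng.poisson(lam=3.0, size=(n_users, n_features))
y_disclosed_attribute = (
    X_training_users[:, :5].sum(axis=1) + rng.normal(size=n_users) > 15
).astype(int)

# The model learns only aggregate patterns; the training rows could even be anonymized.
model = LogisticRegression(max_iter=1000).fit(X_training_users, y_disclosed_attribute)

# A third party whose behavioral data is available but who never disclosed the attribute.
x_third_party = rng.poisson(lam=3.0, size=(1, n_features))

# The prediction is derived from other people's data, not from anything this person
# revealed about the attribute itself -- the situation "predictive privacy" targets.
predicted = model.predict(x_third_party)[0]
probability = model.predict_proba(x_third_party)[0, 1]
print(f"predicted attribute: {predicted} (p={probability:.2f})")

The point of the sketch is that the person affected contributes no ground-truth label of their own; the predictive power comes entirely from the aggregated data of others, which is why the abstract locates the regulatory gap outside the scope of consent-based, individual data protection.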
Subject
Library and Information Sciences, Information Systems and Management, Computer Science Applications, Communication, Information Systems