Affiliation:
1. Princeton University, Princeton, NJ, USA
2. MIT, Cambridge, MA, USA
Abstract
We perform a socio-computational interrogation of the Google Search by Image algorithm, a main component of the Google search engine. We audit the algorithm by presenting it with more than 40,000 faces of all ages and more than four races, and by collecting and analyzing the assigned labels with the appropriate statistical tools. We find that the algorithm reproduces white male patriarchal structures, often simplifying, stereotyping, and discriminating against women and non-white individuals, while providing more positive descriptions of white men. Drawing on Bourdieu’s theory of cultural reproduction, we link these results to the attitudes of the algorithm’s designers and owners, and to the dataset the algorithm was trained on. We further underscore the problematic nature of the algorithm through the ethnographic practice of studying up: we show how the algorithm places individuals at the top of the tech industry within the socio-cultural reality that they shaped, often creating biased representations of them. We argue that social-theoretic frameworks such as these can contribute to improved algorithmic accountability and algorithmic impact assessment, and can add critical depth to studies of algorithmic bias and auditing. Based on the analysis, we discuss the scientific and design implications and suggest alternative ways to design just socio-algorithmic systems.
Subject
Law, Library and Information Sciences, Computer Science Applications, General Social Sciences
References: 80 articles.
1. Abebe R., Barocas S., Kleinberg J., Levy K., Raghavan M., Robinson D. G. (2020). Roles for computing in social change. In: Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency, Barcelona, Spain, 27–30 January 2020, pp. 252–260. https://doi.org/10.1145/3351095.3372871
2. The promises of computational ethnography: Improving transparency, replicability, and validity for realist approaches to ethnographic analysis
3. Big Data, Thick Mediation, and Representational Opacity
4. UMDFaces: An annotated face dataset for training deep networks
5. Barabas C., Doyle C., Rubinovitz J., Dinakar K. (2020). Studying up: Reorienting the study of algorithmic fairness around issues of power. In: Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency, Barcelona, Spain, 27–30 January 2020, pp. 167–176.
Cited by: 10 articles.