Algorithmic profiling as a source of hermeneutical injustice

Authors

Silvia Milano, Carina Prunkl

Abstract

It is well established that algorithms can be instruments of injustice. It is less frequently discussed, however, how current modes of AI deployment often make the very discovery of injustice difficult, if not impossible. In this article, we focus on the effects of algorithmic profiling on epistemic agency. We show how algorithmic profiling can give rise to epistemic injustice through the depletion of the epistemic resources that are needed to interpret and evaluate certain experiences. In doing so, we not only demonstrate how the philosophical framework of epistemic injustice can help pinpoint potential systematic harms from algorithmic profiling, but also identify a novel source of hermeneutical injustice that has to date received little attention in the relevant literature: what we call epistemic fragmentation. As we detail in this paper, epistemic fragmentation is a structural characteristic of algorithmically mediated environments that isolate individuals, making it more difficult to develop, take up and apply new epistemic resources, and hence to identify and conceptualise emerging harms in these environments. We thus trace the occurrence of hermeneutical injustice back to the fragmentation of the epistemic experiences of individuals, who are left more vulnerable by the inability to share, compare and learn from shared experiences.

Funders

Wellcome Trust

Sloan Foundation

Department of Health and Social Care

Luminate Group

Publisher

Springer Science and Business Media LLC

Subject

Philosophy

