Affiliation:
1. Stanford University, Stanford, CA, USA
Abstract
Like many modes of rationalized governance, algorithms depend on rendering people as cases: discrete entities defined by regularized, atemporal attributes. This enables the computation behind the behavioral predictions organizations increasingly use to allocate benefits and burdens. Yet it elides another foundational way of understanding people: as actors in the unfolding narratives of their lives. This has epistemic implications because each cultural form entails a distinct information infrastructure. In this article, I argue that construing people as cases carries consequences for moral reasoning as well because different moral standards require different information. While rendering people as cases affords adjudications of comparative justice, parsing noncomparative justice often necessitates narrative. This explains why people frequently reach for stories that sit beyond the representations of individuals found in records and databases. With this argument, I contribute to the sociology of categorization/classification and draw broader conclusions about modern systems of bureaucratic, computational, and quantitative governance.
Subject
Sociology and Political Science
Cited by
3 articles.