Abstract
The third stage of the prediction supply chain is to find patterns. Comparisons dominate data science, often without any mechanism to draw attention to the assumptions behind what determines similarity. Pattern matching is rarely discussed as an extension of social orders, despite the history of political oppression based on rigid population divisions. Ethical data science is aware that comparisons can imply hierarchical patterns that may generate stigma. Furthermore, not everyone fits neatly into a category. Computers can count above two, yet they often insist on historic binary choices that do little more than enforce societal norms and limit autonomy. This chapter examines the material force of categories, alternatives beyond the binary, and the promise of standardized consistency. Understanding the values that drive pattern matching can lead to a more accurate intellectual order in data science.
Publisher
Oxford University Press, New York