Ethical Redress of Racial Inequities in AI: Lessons from Decoupling Machine Learning from Optimization in Medical Appointment Scheduling
Published: 2022-10-20
Issue: 4
Volume: 35
Page:
ISSN: 2210-5433
Container-title: Philosophy & Technology
Language: en
Short-container-title: Philos. Technol.
Author: Shanklin Robert, Samorani Michele, Harris Shannon, Santoro Michael A.
Abstract
An Artificial Intelligence algorithm trained on data that reflect racial biases may yield racially biased outputs, even if the algorithm on its own is unbiased. For example, algorithms used to schedule medical appointments in the USA predict that Black patients are at a higher risk of no-show than non-Black patients. Though technically accurate given existing data, that prediction results in Black patients being overwhelmingly scheduled in appointment slots that cause longer wait times than those given to non-Black patients. This perpetuates racial inequity, in this case lesser access to medical care. This gives rise to one type of Accuracy-Fairness trade-off: preserve the efficiency offered by using AI to schedule appointments, or discard that efficiency in order to avoid perpetuating ethno-racial disparities. Similar trade-offs arise in a range of AI applications, including others in medicine as well as in education, judicial systems, and public security. This article presents a framework for addressing such trade-offs in which the Machine Learning and Optimization components of the algorithm are decoupled. Applied to medical appointment scheduling, our framework articulates four approaches that intervene in different ways on different components of the algorithm. Each yields specific results, in one case preserving accuracy comparable to the current state of the art while eliminating the disparity.
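The framework the abstract describes is algorithmic: a Machine Learning stage that predicts no-show risk is decoupled from an Optimization stage that assigns appointment slots, so interventions can target either stage independently. The Python sketch below is purely illustrative of that two-stage structure under stated assumptions; the names (Patient, predict_no_show, assign_slots) and the scoring and slot-assignment rules are invented for exposition and are not the authors' implementation or data.

from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Patient:
    pid: str
    features: Dict[str, float]
    no_show_prob: float = 0.0


def predict_no_show(patients: List[Patient]) -> None:
    # Stage 1 (Machine Learning): estimate each patient's probability of no-show.
    # A trained classifier would be used in practice; this stub scores by a single
    # illustrative feature so the example runs end to end.
    for p in patients:
        p.no_show_prob = min(0.9, 0.1 + 0.05 * p.features.get("prior_no_shows", 0.0))


def assign_slots(patients: List[Patient], n_slots: int) -> List[List[Patient]]:
    # Stage 2 (Optimization): assign patients to slots. Ordering by predicted risk
    # mirrors how risk-driven scheduling can concentrate high-risk patients in less
    # desirable slots; because the stages are decoupled, a fairness intervention
    # could change this ordering rule or the objective without touching Stage 1.
    ordered = sorted(patients, key=lambda p: p.no_show_prob)
    slots: List[List[Patient]] = [[] for _ in range(n_slots)]
    for i, p in enumerate(ordered):
        slots[i % n_slots].append(p)
    return slots


if __name__ == "__main__":
    cohort = [
        Patient("A", {"prior_no_shows": 0.0}),
        Patient("B", {"prior_no_shows": 4.0}),
    ]
    predict_no_show(cohort)
    for slot_index, slot in enumerate(assign_slots(cohort, n_slots=2)):
        print(slot_index, [p.pid for p in slot])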
Publisher
Springer Science and Business Media LLC
Subject
History and Philosophy of Science, Philosophy
References
90 articles.
Cited by
7 articles.