Abstract
To monitor and prevent bias in AI systems, we can use a wide range of (statistical) fairness measures. However, it is mathematically impossible to optimize all of these measures at the same time. In addition, optimizing a fairness measure often greatly reduces the accuracy of the system (Kozodoi et al., Eur J Oper Res 297:1083–1094, 2022). As a result, we need a substantive theory that informs us how to make these decisions and for what reasons. I show that by using Rawls' notion of justice as fairness, we can create a basis for navigating fairness measures and the accuracy trade-off. In particular, this leads to a principled choice that focuses both on the most vulnerable groups and on the type of fairness measure that has the biggest impact on those groups. This also helps to close part of the gap, observed by Kuppler et al. (Distributive justice and fairness metrics in automated decision-making: How much overlap is there? arXiv preprint arXiv:2105.01441, 2021), between philosophical accounts of distributive justice and the fairness literature, and to operationalise the value of fairness.
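The Rawlsian reading sketched in the abstract (attend first to the worst-off group, and to the fairness measure on which that group fares worst) can be loosely illustrated with the minimal Python sketch below. The function names group_metrics and worst_off_group and the synthetic data are hypothetical illustrations, not the paper's implementation.

```python
# Illustrative sketch only: compare group-conditional fairness metrics and
# pick out the worst-off group, in a Rawlsian maximin spirit.
import numpy as np

def group_metrics(y_true, y_pred, group):
    """Per-group selection rate and true-positive rate (hypothetical metric choice)."""
    metrics = {}
    for g in np.unique(group):
        mask = group == g
        positives = mask & (y_true == 1)
        metrics[g] = {
            "selection_rate": y_pred[mask].mean(),
            "tpr": y_pred[positives].mean() if positives.any() else float("nan"),
        }
    return metrics

def worst_off_group(metrics, key):
    """Group with the lowest value on a given metric: the Rawlsian focal point."""
    return min(metrics, key=lambda g: metrics[g][key])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y_true = rng.integers(0, 2, 1000)
    group = rng.integers(0, 2, 1000)  # two demographic groups, 0 and 1
    # A deliberately biased predictor: group 1 is selected less often.
    y_pred = (rng.random(1000) < np.where(group == 0, 0.6, 0.4)).astype(int)

    m = group_metrics(y_true, y_pred, group)
    print(m)
    print("Worst-off group by TPR:", worst_off_group(m, "tpr"))
```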
Publisher
Springer Science and Business Media LLC
Subject
General Earth and Planetary Sciences
References (30 articles)
1. Arneson, R.: Equality of opportunity. In: Zalta, E.N. (ed.) The Stanford Encyclopedia of Philosophy, Summer 2015 edn. Metaphysics Research Lab, Stanford University (2015)
2. Barsotti, F., Koçer, R.G.: MinMax fairness: from Rawlsian Theory of Justice to solution for algorithmic bias. AI & Society, 1–14 (2022)
3. Carey, A.N., Wu, X.: The statistical fairness field guide: perspectives from social and formal sciences. AI Ethics, 1–23 (2022)
4. Corbett-Davies, S., Pierson, E., Feller, A., Goel, S., Huq, A.: Algorithmic decision making and the cost of fairness. In: Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 797–806 (2017)
5. Deng, W.H., Nagireddy, M., Lee, M.S.A., Singh, J., Wu, Z.S., Holstein, K., Zhu, H.: Exploring how machine learning practitioners (try to) use fairness toolkits. In: 2022 ACM Conference on Fairness, Accountability, and Transparency, pp. 473–484 (2022)
Cited by
5 articles.