Affiliation:
1. Marketing Unit, Harvard Business School, Harvard University, Boston, MA 02163
Abstract
Significance
Decision makers now use algorithmic personalization for resource allocation decisions in many domains (e.g., medical treatments, hiring decisions, product recommendations, or dynamic pricing). An inherent risk of personalization is disproportionate targeting of individuals from certain protected groups. Existing solutions that firms use to avoid this bias often do not eliminate the bias and may even exacerbate it. We propose BEAT (bias-eliminating adapted trees) to ensure balanced allocation of resources across individuals, guaranteeing both group and individual fairness, while still leveraging the value of personalization. We validate our method using simulations as well as an online experiment with N = 3,146 participants. BEAT is easy to implement in practice, has desirable scalability properties, and is applicable to many personalization problems.
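To make the core idea concrete, below is a minimal, self-contained Python sketch of a balance-penalized tree split: reward candidate splits that separate outcomes (a stand-in for treatment-effect heterogeneity) while penalizing splits that also separate a protected group. This is an illustration, not the authors' BEAT implementation; the split_score function, the penalty weight lam, and the toy data are all assumptions.

# Illustrative sketch only, NOT the BEAT implementation.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: x is a targeting feature, s a binary protected attribute
# correlated with x, and y the outcome a personalization policy targets.
n = 1000
x = rng.normal(size=n)
s = (x + rng.normal(size=n) > 0).astype(int)
y = 2.0 * x + rng.normal(size=n)

def split_score(threshold, lam=5.0):
    # Heterogeneity: squared gap in mean outcome across the split.
    # Imbalance: squared gap in protected-group share across the split.
    left, right = x <= threshold, x > threshold
    if left.sum() < 30 or right.sum() < 30:
        return -np.inf  # guard against tiny child nodes
    heterogeneity = (y[left].mean() - y[right].mean()) ** 2
    imbalance = (s[left].mean() - s[right].mean()) ** 2
    return heterogeneity - lam * imbalance

# Pick the candidate threshold with the best penalized score. A larger
# lam trades personalization value for balance on the protected attribute.
candidates = np.quantile(x, np.linspace(0.1, 0.9, 17))
best = max(candidates, key=split_score)
print(f"chosen threshold: {best:.2f}, score: {split_score(best):.3f}")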
Publisher
Proceedings of the National Academy of Sciences
Cited by: 7 articles.