FairAW – Additive weighting without discrimination

Authors:

Sandro Radovanović¹, Andrija Petrović², Zorica Dodevska³, Boris Delibašić¹

Affiliations:

1. Faculty of Organizational Sciences, University of Belgrade, Serbia

2. Singidunum University, Serbia

3. The Institute for Artificial Intelligence Research and Development of Serbia, Serbia

Abstract

With growing awareness of the societal impact of decision-making, fairness has become an important issue. In many real-world situations, decision-makers can unintentionally discriminate against a certain group of individuals based on inherited or acquired attributes, such as gender, age, race, or religion. In this paper, we introduce a post-processing technique, called fair additive weighting (FairAW), for achieving group and individual fairness in multi-criteria decision-making methods. The methodology changes the score of an alternative by imposing fair criteria weights, which is achieved by minimizing the differences in individuals' scores subject to a fairness constraint. The proposed methodology can be applied in any multi-criteria decision-making method that uses additive weighting to evaluate the scores of individuals. We tested the method on both synthetic and real-world data and compared it to the Disparate Impact Remover and FA*IR methods, which are commonly used to achieve fair scoring of individuals. The results show that FairAW achieves group fairness in terms of statistical parity while retaining individual fairness. Additionally, our approach obtained the best equality in scoring between the discriminated and privileged groups.
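The idea sketched in the abstract — scoring alternatives by a weighted sum of criteria, then adjusting the criteria weights so that scores change as little as possible while a statistical-parity constraint is met — can be illustrated as follows. The decision matrix, the original weights, and the exact optimization formulation below are illustrative assumptions for this sketch, not the authors' implementation:

```python
import numpy as np
from scipy.optimize import minimize

# Toy decision matrix: 6 alternatives (rows) scored on 3 criteria (columns),
# already normalized to [0, 1]. Data and weights are illustrative only.
X = np.array([
    [0.9, 0.8, 0.2],
    [0.8, 0.9, 0.3],
    [0.7, 0.7, 0.1],
    [0.4, 0.5, 0.9],
    [0.5, 0.4, 0.8],
    [0.3, 0.6, 0.9],
])
# Binary sensitive attribute: 1 = privileged group, 0 = discriminated group.
group = np.array([1, 1, 1, 0, 0, 0])
w0 = np.array([0.5, 0.3, 0.2])  # decision-maker's original criteria weights

def saw_scores(X, w):
    """Simple additive weighting: score is the weighted sum of criteria values."""
    return X @ w

def parity_gap(scores):
    """Statistical parity gap: mean score of privileged minus discriminated group."""
    return scores[group == 1].mean() - scores[group == 0].mean()

base = saw_scores(X, w0)

# Find fair weights: keep the scores as close as possible to the original ones,
# subject to a zero statistical parity gap and weights forming a convex combination.
res = minimize(
    fun=lambda w: np.sum((saw_scores(X, w) - base) ** 2),
    x0=w0,
    method="SLSQP",
    bounds=[(0.0, 1.0)] * X.shape[1],
    constraints=[
        {"type": "eq", "fun": lambda w: parity_gap(saw_scores(X, w))},
        {"type": "eq", "fun": lambda w: w.sum() - 1.0},
    ],
)
w_fair = res.x
print("original gap:", round(parity_gap(base), 4))   # positive: privileged group favored
print("fair gap:", round(parity_gap(saw_scores(X, w_fair)), 4))
```

This sketch only enforces group fairness (statistical parity); the paper's method additionally addresses individual fairness, which is omitted here for brevity.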

Publisher

IOS Press

Subject

Artificial Intelligence, Computer Vision and Pattern Recognition, Theoretical Computer Science

