Awareness of Racial and Ethnic Bias and Potential Solutions to Address Bias With Use of Health Care Algorithms

Authors:

Anjali Jain1, Jasmin R. Brooks2, Cleothia C. Alford1, Christine S. Chang1, Nora M. Mueller1,3, Craig A. Umscheid1, Arlene S. Bierman4,5

Affiliations:

1. Evidence-based Practice Center Division, Center for Evidence and Practice Improvement, Agency for Healthcare Research and Quality, Rockville, Maryland

2. Department of Psychology, University of Houston, Houston, Texas

3. Division of Practice Improvement, Center for Evidence and Practice Improvement, Agency for Healthcare Research and Quality, Rockville, Maryland

4. Office of the Director, Agency for Healthcare Research and Quality, Rockville, Maryland

5. Center for Evidence and Practice Improvement, Agency for Healthcare Research and Quality, Rockville, Maryland

Abstract

Importance: Algorithms are commonly incorporated into health care decision tools used by health systems and payers and thus affect quality of care, access, and health outcomes. Some algorithms include a patient’s race or ethnicity among their inputs and can lead clinicians and decision-makers to make choices that vary by race and potentially affect inequities.

Objective: To inform an evidence review on the use of race- and ethnicity-based algorithms in health care by gathering public and stakeholder perspectives about the repercussions of and efforts to address algorithm-related bias.

Design, Setting, and Participants: Qualitative methods were used to analyze responses. Responses were initially open coded and then consolidated to create a codebook, with themes and subthemes identified and finalized by consensus. This qualitative study was conducted from May 4, 2021, through December 7, 2022. Forty-two organization representatives (eg, clinical professional societies, universities, government agencies, payers, and health technology organizations) and individuals responded to the request for information.

Main Outcomes and Measures: Identification of algorithms with the potential for race- and ethnicity-based biases and qualitative themes.

Results: Forty-two respondents identified 18 algorithms currently in use with the potential for bias, including, for example, the Simple Calculated Osteoporosis Risk Estimation risk prediction tool and the risk calculator for vaginal birth after cesarean section. The 7 qualitative themes, with 31 subthemes, included the following: (1) algorithms are in widespread use and have significant repercussions, (2) bias can result from algorithms whether or not they explicitly include race, (3) clinicians and patients are often unaware of the use of algorithms and potential for bias, (4) race is a social construct used as a proxy for clinical variables, (5) there is a lack of standardization in how race and social determinants of health are collected and defined, (6) bias can be introduced at all stages of algorithm development, and (7) algorithms should be discussed as part of shared decision-making between the patient and clinician.

Conclusions and Relevance: This qualitative study found that participants perceived widespread and increasing use of algorithms in health care and lack of oversight, potentially exacerbating racial and ethnic inequities. Increasing awareness for clinicians and patients and standardized, transparent approaches for algorithm development and implementation may be needed to address racial and ethnic biases related to algorithms.
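To make the mechanism discussed in the abstract concrete, the sketch below shows how a risk calculator that includes a race indicator among its inputs can return different predictions for otherwise identical patients. The model and all coefficients are hypothetical and invented for illustration only; they do not reproduce the Simple Calculated Osteoporosis Risk Estimation tool, the vaginal birth after cesarean calculator, or any other published algorithm.

import math

def predicted_probability(age: float, prior_event: bool, race_indicator: int) -> float:
    """Hypothetical logistic-regression risk score (illustrative only).

    The coefficients below are made up for demonstration. The point is the
    mechanism the study describes: a nonzero coefficient on a race indicator
    shifts the predicted probability for otherwise identical patients.
    """
    intercept = 1.0
    beta_age = -0.05     # made-up coefficient
    beta_prior = 0.9     # made-up coefficient
    beta_race = -0.7     # made-up coefficient on the race indicator

    logit = (intercept
             + beta_age * age
             + beta_prior * (1 if prior_event else 0)
             + beta_race * race_indicator)
    return 1.0 / (1.0 + math.exp(-logit))

# Two patients identical in every clinical input differ only in the race
# indicator, so the calculator returns different predicted probabilities.
print(predicted_probability(30, True, race_indicator=0))  # ~0.60
print(predicted_probability(30, True, race_indicator=1))  # ~0.43

The example illustrates only explicit inclusion of race; as the study's themes note, bias can also arise without an explicit race input, for example through proxy variables or unrepresentative training data.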

Publisher

American Medical Association (AMA)

Subject

General Earth and Planetary Sciences, General Environmental Science

References

30 articles.
