Inclusion in clinical research: cross-sectional study assessing potential barriers to informed consent in randomized controlled trials published in top general and internal medical journals

Authors:

Pranić, Shelly Melissa (1); Baždarić, Ksenija (2); Pérez-Neri, Iván (3); Estêvão, Maria Dulce (4); Mishra, Vinayak (5); McGriff, Joanne A. (6); Pulumati, Anika (7)

Affiliations:

1. University of Split School of Medicine

2. University of Rijeka Faculty of Health Studies

3. National Institute of Neurology and Neurosurgery Manuel Velasco Suárez

4. Escola Superior de Saúde da Universidade do Algarve

5. University of Liverpool

6. Emory University

7. University of Missouri-Kansas City School of Medicine

Abstract

Objective: Racial and ethnic minority groups are underrepresented in clinical research. Racially diverse individuals who speak languages other than English, or who have limited English proficiency, may be excluded from randomized clinical trials (RCTs) through eligibility criteria. This study assessed English language requirements for enrollment in registered and published RCTs.

Design: In a cross-sectional design, on May 4, 2022 we searched the top 10 first-quartile general and internal medicine journals (2017 rankings) for RCTs with at least one US site that compared drug interventions for heart disease, stroke, cancer, asthma, influenza and pneumonia, diabetes, HIV/AIDS, and COVID-19 with standard or usual care or placebo, and that had ClinicalTrials.gov registration and available protocols. We assessed whether English or another language was required for trial enrollment in the eligibility criteria of protocols and ClinicalTrials.gov records. Two reviewers achieved good agreement on independent selection for inclusion (κ = 0.85; 95% CI, 0.75-0.95) and on both the identification of language requirements and data extraction (κ = 0.98; 95% CI, 0.87-1.00) in a sample of 50 RCTs. The primary outcome was the frequency of RCTs with English language requirements in the eligibility criteria of protocols and ClinicalTrials.gov records, by disease and funder type (industry-funded trials had at least one industry funder; non-industry-funded trials had none). Secondary outcomes were the readability of eligibility criteria in ClinicalTrials.gov records and the reporting of race as a demographic variable. Readability was assessed with the Flesch-Kincaid grade (FKG) level (grades 0 to 18 [college graduate]) and the Gunning-Fog (GF) index (grades 0 to 20 [college graduate]), where lower grades correspond to easier readability. Mann-Whitney tests compared readability, with a 2-tailed P value of less than 0.05 considered significant.

Results: We identified 39 of 5995 RCTs, published in Annals of Internal Medicine (n = 2), JAMA (n = 14), JAMA Internal Medicine (n = 3), Lancet (n = 11), PLoS Medicine (n = 1), and New England Journal of Medicine (n = 8). Trials most often studied COVID-19 (n = 18/39, 46%) and were industry-funded (n = 23/39, 59%). Eligibility criteria in the publications and ClinicalTrials.gov records made no explicit statements about English or any other language being required for enrollment. The lack of explicit statements about required languages was also common in protocols of both industry-funded (n = 17/39, 44%) and non-industry-funded (n = 8/39, 21%) trials. Eligibility criteria in the protocols of 3 of 39 (8%) non-industry-funded RCTs restricted participation to English-speaking participants. Ten trials (26%) mentioned providing non-English languages (industry-funded and non-industry-funded, each n = 5/39, 13%). Participant race was reported in 37 (95%) articles and ClinicalTrials.gov records and comprised American Indian (median [interquartile range (IQR)], 1 [0-6]), Asian (14 [5-69]), Black (44 [36-100]), Latinx (45 [5-117]), Native Hawaiian (0 [0-1]), and White (229 [106-207]) participants. Seventeen of 39 RCTs (44%) had at least one difference in the reporting of race between the article and ClinicalTrials.gov. Eligibility criteria in protocols had a median (IQR) FKG of 11.5 (10.7-13.0) and GF of 13.0 (11.7-14.5); in ClinicalTrials.gov records, the median (IQR) FKG was 13.0 (11.0-14.0) and GF was 13.7 (11.7-14.7). In protocols, readability did not differ by funder type (FKG: non-industry 12.1 [11.4-13.3] vs. industry 11.0 [10.3-12.6], P = 0.092; GF: non-industry 13.4 [12.2-14.7] vs. industry 12.9 [11.6-14.5], P = 0.567). In ClinicalTrials.gov records, readability also did not differ by funder type (FKG: non-industry 12.9 [11.7-13.9] vs. industry 13.5 [10.7-14.6], P = 0.575; GF: non-industry 14.5 [11.7-15.1] vs. industry 13.4 [12.2-15.7], P = 0.338).

Conclusions: Explicit reporting of required languages in RCT eligibility criteria was rare, and the readability of eligibility criteria was poor. Ethics committees and funders should require explicit reporting of languages and highly readable information for participants. Responsibility rests with ethics committees, funders, and trialists to design inclusive trials and strive toward health equity.
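Note on the readability metrics: the FKG and GF grades reported above follow the standard published formulas. As a rough illustration only, the minimal Python sketch below computes both grades for a piece of text using those formulas and a heuristic syllable counter; the exact software the authors used is not stated in the abstract, and the sample eligibility sentence is hypothetical.

import re

def count_syllables(word):
    # Heuristic: count vowel groups; drop one for a silent trailing 'e'.
    word = word.lower()
    groups = re.findall(r"[aeiouy]+", word)
    count = len(groups)
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def readability(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z'-]+", text)
    w = max(len(words), 1)
    s = max(len(sentences), 1)
    syllables = sum(count_syllables(word) for word in words)
    complex_words = sum(1 for word in words if count_syllables(word) >= 3)
    # Flesch-Kincaid grade level: 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
    fkg = 0.39 * (w / s) + 11.8 * (syllables / w) - 15.59
    # Gunning-Fog index: 0.4*((words/sentences) + 100*(complex words/words))
    gf = 0.4 * ((w / s) + 100 * (complex_words / w))
    return round(fkg, 1), round(gf, 1)

if __name__ == "__main__":
    criteria = ("Participants must be able to read and understand English "
                "and provide written informed consent before randomization.")
    print(readability(criteria))

Grades in the 11-13 range, as found for the eligibility criteria in this study, correspond to text that requires roughly a high-school senior to college-level reading ability.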

Publisher

Research Square Platform LLC
