Reproducibility of Published Meta-Analyses on Clinical-Psychological Interventions

Authors:

Rubén López-Nicolás¹, Daniel Lakens², Jose A. López-López¹, Maria Rubio-Aparicio³, Alejandro Sandoval-Lentisco¹, Carmen López-Ibáñez¹, Desirée Blázquez-Rincón¹, Julio Sánchez-Meca¹

Affiliations:

1. Department of Basic Psychology and Methodology, University of Murcia, Murcia, Spain

2. Department of Industrial Engineering and Innovation Sciences, Eindhoven University of Technology, Eindhoven, The Netherlands

3. Department of Health Psychology, University of Alicante, San Vicente del Raspeig, Spain

Abstract

Meta-analysis is one of the most useful research approaches, and its relevance rests on its credibility. Reproducibility of scientific results can be considered the minimal threshold of that credibility. We assessed the reproducibility of a sample of meta-analyses published between 2000 and 2020. From a random sample of 100 articles reporting results of meta-analyses of interventions in clinical psychology, 217 meta-analyses were selected. We first tried to retrieve the original data by recovering a data file, recoding the data from document files, or requesting it from the original authors. Second, through a multistage workflow, we tried to reproduce the main results of each meta-analysis. The original data were retrieved for 67% (146/217) of the meta-analyses. Although this rate improved over the years, in only 5% of these cases was it possible to retrieve a data file ready for reuse. At the first stage, 52 of these 146 meta-analyses showed a discrepancy larger than 5% in the main results. For 10 meta-analyses, the discrepancy was resolved after fixing a coding error in our data-retrieval process, and 15 others were judged approximately reproduced in a qualitative assessment. For the remaining meta-analyses (18%; 27/146), an in-depth review identified issues such as reporting inconsistencies, missing data, and transcription errors. Nevertheless, the numerical discrepancies were mostly minor and had little or no impact on the conclusions. Overall, one of the biggest threats to the reproducibility of meta-analysis is limited data availability and current data-sharing practices.

Funders

Ministerio de Ciencia e Innovación

Ministerio de Universidades

Fundación Séneca

Publisher

SAGE Publications

Subject

General Psychology

