Replicability of simulation studies for the investigation of statistical methods: the RepliSims project

Author:

Luijken K. (1,2; ORCID), Lohmann A. (1), Alter U. (3), Claramunt Gonzalez J. (4), Clouth F. J. (5,6), Fossum J. L. (7,8), Hesen L. (1), Huizing A. H. J. (9), Ketelaar J. (1), Montoya A. K. (7; ORCID), Nab L. (1), Nijman R. C. C. (1), Penning de Vries B. B. L. (1,10), Tibbe T. D. (7), Wang Y. A. (11), Groenwold R. H. H. (1,10)

Affiliation:

1. Department of Clinical Epidemiology, Leiden University Medical Centre, Leiden, The Netherlands

2. Department of Epidemiology, Julius Center for Health Sciences and Primary Care, University Medical Center Utrecht, University Utrecht, Utrecht, The Netherlands

3. Department of Psychology, York University, Toronto, Ontario, Canada

4. Methodology and Statistics Unit, Institute of Psychology, Leiden University, Leiden, The Netherlands

5. Department of Methodology and Statistics, Tilburg University, Tilburg, The Netherlands

6. Netherlands Comprehensive Cancer Organisation (IKNL), Utrecht, The Netherlands

7. Department of Psychology, University of California, Los Angeles, CA, USA

8. Department of Psychology, Seattle Pacific University, Seattle, WA, USA

9. TNO (Netherlands Organization for Applied Scientific Research), Expertise Group Child Health, Leiden, The Netherlands

10. Department of Biomedical Data Sciences, Leiden University Medical Centre, Leiden, The Netherlands

11. Department of Psychology, University of Toronto, Toronto, Ontario, Canada

Abstract

Results of simulation studies evaluating the performance of statistical methods can have a major impact on the way empirical research is conducted. However, there is so far little evidence on the replicability of such simulation studies. Eight highly cited statistical simulation studies were selected, and their replicability was assessed by teams of replicators with formal training in quantitative methodology. The teams used the information in the original publications to write simulation code with the aim of replicating the results. The primary outcome was the feasibility of replication based on the information reported in the original publications and supplementary materials. Replicability varied greatly: some original studies provided detailed information leading to almost perfect replication of results, whereas other studies did not provide enough information to implement any of the reported simulations. Factors facilitating replication included availability of code, detailed reporting or visualization of data-generating procedures and methods, and replicator expertise. Replicability of statistical simulation studies was mainly impeded by lack of information and by the limited sustainability of information sources. We encourage researchers publishing simulation studies to transparently report all relevant implementation details, either in the research paper itself or in easily accessible supplementary material, and to make their simulation code publicly available using permanent links.

Funder

German Academic Scholarship Foundation

National Science Foundation

National Science Foundation Graduate Research Fellowship

Leiden University Medical Center

ZonMw

Publisher

The Royal Society

Subject

Multidisciplinary
