Computational Reproducibility in Finance: Evidence from 1,000 Tests

Authors:

Christophe Pérignon (1), Olivier Akmansoy (2), Christophe Hurlin (3), Anna Dreber (4), Felix Holzmeister (5), Jürgen Huber (5), Magnus Johannesson (6), Michael Kirchler (5), Albert J. Menkveld (7), Michael Razen (5), Utz Weitzel (8)

Affiliations:

1. HEC Paris, France, and cascad, France

2. CNRS, France, and cascad, France

3. University of Orléans, France, and cascad, France

4. Stockholm School of Economics, Sweden, and University of Innsbruck, Austria

5. University of Innsbruck, Austria

6. Stockholm School of Economics, Sweden

7. Vrije Universiteit Amsterdam, Netherlands, and Tinbergen Institute, Netherlands

8. Vrije Universiteit Amsterdam, Netherlands, Radboud University, Netherlands, and Tinbergen Institute, Netherlands

Abstract

We analyze the computational reproducibility of more than 1,000 empirical answers to 6 research questions in finance provided by 168 research teams. Running the researchers’ code on the same raw data regenerates exactly the same results only 52% of the time. Reproducibility is higher for researchers with better coding skills and those exerting more effort. It is lower for more technical research questions, more complex code, and results lying in the tails of the distribution. Researchers exhibit overconfidence when assessing the reproducibility of their own research. We provide guidelines for finance researchers and discuss implementable reproducibility policies for academic journals.

Publisher

Oxford University Press (OUP)

Cited by 3 articles.

1. Heterogeneity in effect size estimates; Proceedings of the National Academy of Sciences; 2024-07-30

2. A framework for evaluating reproducibility and replicability in economics; Economic Inquiry; 2024-07

3. Nonstandard Errors; The Journal of Finance; 2024-04-17
