Quantifying and addressing the prevalence and bias of study designs in the environmental and social sciences
-
Published: 2020-12
Issue: 1
Volume: 11
Page:
-
ISSN: 2041-1723
-
Container-title: Nature Communications
-
Language: en
-
Short-container-title: Nat Commun
Author:
Christie Alec P., Abecasis David, Adjeroud Mehdi, Alonso Juan C., Amano Tatsuya, Anton Alvaro, Baldigo Barry P., Barrientos Rafael, Bicknell Jake E., Buhl Deborah A., Cebrian Just, Ceia Ricardo S., Cibils-Martina Luciana, Clarke Sarah, Claudet Joachim, Craig Michael D., Davoult Dominique, De Backer Annelies, Donovan Mary K., Eddy Tyler D., França Filipe M., Gardner Jonathan P. A., Harris Bradley P., Huusko Ari, Jones Ian L., Kelaher Brendan P., Kotiaho Janne S., López-Baucells Adrià, Major Heather L., Mäki-Petäys Aki, Martín Beatriz, Martín Carlos A., Martin Philip A., Mateos-Molina Daniel, McConnaughey Robert A., Meroni Michele, Meyer Christoph F. J., Mills Kade, Montefalcone Monica, Noreika Norbertas, Palacín Carlos, Pande Anjali, Pitcher C. Roland, Ponce Carlos, Rinella Matt, Rocha Ricardo, Ruiz-Delgado María C., Schmitter-Soto Juan J., Shaffer Jill A., Sharma Shailesh, Sher Anna A., Stagnol Doriane, Stanley Thomas R., Stokesbury Kevin D. E., Torres Aurora, Tully Oliver, Vehanen Teppo, Watts Corinne, Zhao Qingyuan, Sutherland William J.
Abstract
Building trust in science and evidence-based decision-making depends heavily on the credibility of studies and their findings. Researchers employ many different study designs that vary in their risk of bias to evaluate the true effect of interventions or impacts. Here, we empirically quantify, on a large scale, the prevalence of different study designs and the magnitude of bias in their estimates. Randomised designs and controlled observational designs with pre-intervention sampling were used by just 23% of intervention studies in biodiversity conservation, and 36% of intervention studies in social science. We demonstrate, through pairwise within-study comparisons across 49 environmental datasets, that these types of designs usually give less biased estimates than simpler observational designs. We propose a model-based approach to combine study estimates that may suffer from different levels of study design bias, discuss the implications for evidence synthesis, and consider how to facilitate the use of more credible study designs.
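The abstract refers to a model-based approach for combining study estimates that may carry different levels of design-related bias. As a rough illustration of the general idea only (not the authors' published model), the sketch below pools hypothetical effect estimates by inverse-variance weighting while estimating an additive bias term for each non-randomised design class relative to randomised studies; all numbers, design labels, and variable names are invented for illustration.

```python
# Illustrative sketch only: NOT the authors' published method.
# Assumes each non-randomised design class adds an unknown additive bias
# relative to randomised studies ("RCT"), estimated jointly with the pooled effect.
import numpy as np

# Hypothetical per-study data: effect estimate, standard error, design label.
estimates  = np.array([0.40, 0.35, 0.55, 0.60, 0.20, 0.25])
std_errors = np.array([0.10, 0.12, 0.15, 0.14, 0.20, 0.18])
designs    = ["RCT", "RCT", "BACI", "BACI", "CI", "CI"]

design_levels = ["RCT", "BACI", "CI"]   # "RCT" taken as the (assumed unbiased) reference

# Design matrix: intercept (pooled effect) plus one bias column per non-reference design.
X = np.column_stack(
    [np.ones(len(estimates))]
    + [np.array([d == lvl for d in designs], dtype=float) for lvl in design_levels[1:]]
)

# Precision-weighted least squares (inverse-variance weights): a simple
# fixed-effect-style analogue of weighting studies by their sampling uncertainty.
W = np.diag(1.0 / std_errors**2)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ estimates)

print("Pooled effect (reference design):", round(beta[0], 3))
for lvl, b in zip(design_levels[1:], beta[1:]):
    print(f"Estimated bias of {lvl} relative to RCT:", round(b, 3))
```

In practice a hierarchical or random-effects formulation would also allow between-study heterogeneity and uncertainty in the design-level bias terms; the fixed-effect version above is kept minimal to show the weighting idea.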
Publisher
Springer Science and Business Media LLC
Subject
General Physics and Astronomy, General Biochemistry, Genetics and Molecular Biology, General Chemistry
References
78 articles.
Cited by
65 articles.