Author:
Crane Melanie, Bohn-Goldbaum Erika, Grunseit Anne, Bauman Adrian
Abstract
Background
Natural experiments are increasingly valued as a way to assess the health impact of health and non-health interventions when planned, controlled experimental research designs are infeasible or inappropriate. This study investigated the value of natural experiments by exploring how they have been used in practice, focusing on obesity prevention research as one complex programme area in which natural experiment studies are applied.
Methods
A literature search of obesity prevention research published from January 1997 to December 2017 identified 46 population health studies that self-described as natural experiments.
Results
The majority of studies identified were published in the last 5 years, indicating relatively recent uptake of such opportunities. Most were evaluations of policy impacts (n = 19), such as the effects of changes to food labelling, food advertising or taxation on diet and obesity outcomes, or of built environment interventions (n = 17), such as the impact of built infrastructure on physical activity or access to healthy food. Research designs included quasi-experimental, pre-experimental and non-experimental methods. Few studies applied designs that support stronger causal inference, such as multiple pre/post measures, time series designs or comparison of change against an unexposed group. In general, researchers employed techniques to enhance study utility, but the use of more rigorous designs was often limited by ethical considerations and/or the particular context of the intervention.
Conclusion
Greater recognition of the utility and versatility of natural experiments in generating evidence for complex health issues such as obesity prevention is needed. This review suggests that natural experiments may be underutilised as an approach to evaluating intervention effects, particularly health outcomes, when unexpected opportunities to gather evidence arise.
Funder
National Health and Medical Research Council of Australia
Publisher
Springer Science and Business Media LLC