Abstract
Background
Understanding implementation fidelity, or adherence to the intervention as intended, is essential to interpreting the results of evaluations. In this paper, we propose a longitudinal, explanatory approach to implementation fidelity through a realist evaluation lens. We apply this approach to a mixed-method assessment of implementation fidelity to an electronic decision support system intervention to improve the quality of antenatal care in Nepal.

Methods
The tablet-based electronic decision support system was implemented in 19 primary care facilities in Nepal. As part of the project's process evaluation, we used four data sources – monitoring visit checklists, fieldnotes, software backend data, and longitudinal case studies in four facilities – to examine three components of fidelity: use at the point of care, use for all antenatal visits, and quality of data entry. Quantitative data were analysed descriptively. Qualitative data were analysed thematically using template analysis, first to examine descriptive findings across the three fidelity components and later to develop and reflect on causal mechanisms. Findings were synthesised, drawing on Normalization Process Theory, to understand the processes driving the different patterns of fidelity observed.

Results
Fidelity to point-of-care use declined over time: healthcare providers often entered data after antenatal visits had ended because they understood the intervention as primarily a recordkeeping tool rather than a decision support tool. Even in facilities with higher fidelity to point-of-care use, the software's decision-support prompts were largely ignored. Low antenatal client caseloads and fieldworkers' suggestion to practise back-entering data from previous antenatal visits further undermined understanding of the intervention's purpose for decision support.

Conclusions
Our assessment explains how and why the observed patterns of implementation fidelity occurred, yielding a more nuanced understanding of the project evaluation's null result that moves beyond a simple distinction between intervention failure and implementation failure. Our findings demonstrate the importance of discussing intervention theory in terms fieldworkers and participants understand, so as not to undermine fidelity.
Publisher: Cold Spring Harbor Laboratory