Affiliations:
1. University of Alabama at Birmingham
2. Virginia Commonwealth University
3. University of Oklahoma
Abstract
Background: Empirical research is inconsistent regarding the relationship between the number of implementation strategies and the implementation of evidence-based interventions. One potential explanation for inconsistent relationships is an assumption that different types of strategies will have a similar impact on different implementation outcomes. Likewise, relatively little research has considered whether greater (or fewer) numbers of implementation strategies may be more (or less) effective under certain conditions, despite general recognition of the role that implementation strategies can play in overcoming contextual barriers to implementation. The purpose of this paper was to address these gaps by answering three related questions: 1) What is the relationship between the number of implementation strategies and implementation outcomes?; 2) Does the relationship between implementation strategies and implementation outcomes differ for clinic-focused and patient-focused strategies?; and 3) To what extent does the organizational climate strengthen or attenuate the relationship between the number of implementation strategies and implementation outcomes?
Methods: Based on administrative and survey data from 15 U.S. rheumatology clinics that were implementing an evidence-based decision aid for patients with lupus, we used random intercept mixed-effects regression models to examine the association between the total number of implementation strategies (and, separately, clinic-focused vs. patient-focused strategies) and clinic staff members' perceptions of decision-aid acceptability, appropriateness, and feasibility. Extensions of these models examined whether these relationships were moderated by each clinic's change readiness and learning climates.
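As a hedged illustration of the modeling approach described above, the sketch below fits a random-intercept mixed-effects model, plus a moderation variant with an interaction term, using Python's statsmodels. The data frame and column names (acceptability, n_strategies, learning_climate, clinic_id) are hypothetical placeholders, not the study's actual variables.

```python
# Minimal sketch of the random-intercept mixed-effects models described above.
# Assumptions: a pandas DataFrame `df` with one row per staff respondent and
# hypothetical columns: acceptability (outcome), n_strategies (count of
# implementation strategies), learning_climate (clinic-level moderator),
# and clinic_id (grouping variable for the random intercept).
import pandas as pd
import statsmodels.formula.api as smf

def fit_models(df: pd.DataFrame):
    # Association between the number of strategies and an outcome, with a
    # random intercept for clinic to account for clustering of staff.
    main_model = smf.mixedlm(
        "acceptability ~ n_strategies", data=df, groups=df["clinic_id"]
    ).fit()

    # Moderation by organizational climate, tested via an interaction
    # between the strategy count and the climate measure.
    moderation_model = smf.mixedlm(
        "acceptability ~ n_strategies * learning_climate",
        data=df,
        groups=df["clinic_id"],
    ).fit()
    return main_model, moderation_model
```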
Results: Our analysis suggests that, in aggregate, more strategies do not necessarily result in more positive perceptions of decision-aid acceptability, appropriateness, or feasibility. Additional analyses, however, suggest that the effect of increasing the number of implementation strategies differs depending on the audience at which the strategies are directed (clinic staff vs. patients). Our moderation analysis also suggests that organizational climate accentuates the relationship between implementation strategies and outcomes in some cases and attenuates it in others.
Conclusions: Collectively, these findings highlight the difficulty of making simple, standardized recommendations (e.g., "increase the number of implementation strategies" or "clinics should strengthen their readiness or learning climate"). Under some circumstances, increasing the number of implementation strategies may, in fact, have detrimental effects on implementation outcomes.
Trial registration: ClinicalTrials.gov ID: NCT03735238
Publisher
Research Square Platform LLC