Affiliation:
1. Bradford Institute for Health Research
2. The University of Sheffield
3. Yorkshire and Humber Academic Health Science Network
Abstract
Background
Implementation evaluation should focus on implementation success, guided by theories and frameworks, rather than on intervention effectiveness. While implementation evaluations tend to rely on surveys or interviews, it is important to consider alternative methods to avoid adding to healthcare professionals’ burden. This paper presents a cross-sectional rapid evaluation of a handheld medical device designed for remote examinations, which was piloted in Northern England. By using downloaded device data and administrative records mapped to domains from the implementation outcomes framework, this evaluation offers a pragmatic example of assessing implementation success guided by a framework and using readily available data.
Methods
The pilot design was pragmatic: sites volunteered, decided which services to use the device in, and launched the device on a rolling basis. The evaluation lasted one year. Data were downloaded from the devices, and administrative records for the pilot were accessed. Variables were mapped to the implementation outcomes framework and reviewed by the device manufacturer and pilot team.
Results
N = 352 care episodes were recorded using the device with 223 patients. Of the 19 sites signed up to the pilot, 5 launched and delivered 10 of the 35 proposed projects: site and project adoption rates of 26% and 29%, respectively. Twenty-six of the 71 trained clinicians used the device: a penetration rate of 37%. Six sites signed up to an extension period; three had launched and three had not within the original timelines, indicating some sustainability. Feasibility was high, with few device error messages. Fidelity was low for two of the eight available device examinations, which were not used as planned. Device and staffing costs were high, but potential cost savings were attributable to fewer in-person appointments.
Conclusions
By using device and administrative data, this evaluation minimised the burden on busy healthcare staff while remaining guided by an evaluation framework. Six of the eight implementation outcomes were measured, including sustainability and costs. The findings give insight into implementation challenges, particularly around adoption and penetration. For future research, it is recommended to engage with staff to prioritise outcome measurements and to focus on the interpretation and robustness of indicators.
Publisher
Research Square Platform LLC