Abstract
Objectives
The skill of the debriefer is known to be the strongest independent predictor of the quality of simulation encounters, yet educators often feel underprepared for this role. The aim of this review was to identify frameworks used for debriefing team-based simulations and the measures used to assess debriefing quality.

Methods
We systematically searched the PubMed, CINAHL, MEDLINE and Embase databases for simulation studies that evaluated a debriefing framework. Two reviewers evaluated study quality and extracted information on study methods, debriefing framework, outcome measures and debriefing quality.

Results
A total of 676 papers published between January 2003 and December 2017 were identified using the search protocol. Following screening of abstracts, 37 full-text articles were assessed for eligibility; 26 studies met the inclusion criteria for quality appraisal, and 18 achieved a sufficiently high quality score for inclusion in the evidence synthesis. A debriefing framework was used in all studies, mostly tailored to the individual study. The impact of the debrief was measured using satisfaction surveys (n=11) and/or participant performance (n=18). Three themes emerged from the data synthesis: selection and training of facilitators, the debrief model, and debrief assessment. There was little commonality across studies in terms of participants, experience of faculty, or measures used.

Conclusions
A range of debriefing frameworks was used in these studies. Some key aspects of debriefing for team-based simulation, such as facilitator training, the inclusion of a reaction phase, and the impact of learner characteristics on debrief outcomes, have little or no supporting evidence and offer opportunities for future research, particularly with interprofessional groups.
Funder
Higher Education Academy, UK
Subject
Health Informatics, Education, Modelling and Simulation
Cited by
21 articles.