Author:
Plackett Ruth, Kassianos Angelos P., Mylan Sophie, Kambouri Maria, Raine Rosalind, Sheringham Jessica
Abstract
Background
Use of virtual patient educational tools could fill the current gap in the teaching of clinical reasoning skills. However, there is a limited understanding of their effectiveness. The aim of this study was to synthesise the evidence to understand the effectiveness of virtual patient tools aimed at improving undergraduate medical students’ clinical reasoning skills.
Methods
We searched MEDLINE, EMBASE, CINAHL, ERIC, Scopus, Web of Science and PsycINFO from 1990 to January 2022 to identify all experimental articles testing the effectiveness of virtual patient educational tools on medical students’ clinical reasoning skills. Quality of the articles was assessed using an adapted form of the MERSQI and the Newcastle–Ottawa Scale. A narrative synthesis summarised intervention features, how virtual patient tools were evaluated, and their reported effectiveness.
Results
The search revealed 8,186 articles, with 19 articles meeting the inclusion criteria. Average study quality was moderate (M = 6.5, SD = 2.7), with nearly half not reporting any measurement of validity or reliability for their clinical reasoning outcome measure (8/19, 42%). Eleven articles found a positive effect of virtual patient tools on reasoning (11/19, 58%). Four reported no significant effect and four reported mixed effects (4/19, 21%). Several domains of clinical reasoning were evaluated. Data gathering, ideas about diagnosis and patient management were more often found to improve after virtual patient use (34/47 analyses, 72%) than application of knowledge, flexibility in thinking and problem-solving (3/7 analyses, 43%).
Conclusions
There was some evidence that virtual patient educational tools can improve undergraduate medical students’ clinical reasoning skills, so they could effectively complement current teaching, especially where opportunities for face-to-face teaching or other methods are limited. Evaluations that measured more case-specific clinical reasoning domains, such as data gathering, showed more consistent improvement than general measures like problem-solving. Case-specific measures might be more sensitive to change given the context-dependent nature of clinical reasoning. Consistent use of validated clinical reasoning measures is needed to enable a meta-analysis to estimate effectiveness.
Funder
School for Public Health Research
Health Foundation
NIHR Policy Research Unit in Cancer Awareness, Screening and Early Diagnosis
Publisher
Springer Science and Business Media LLC
Subject
Education, General Medicine