Author:
Tan Jean-Yin, Ma Irene W.Y., Hunt Julie A., Kwong Grace P.S., Farrell Robin, Bell Catriona, Read Emma K.
Abstract
The Objective Structured Clinical Examination (OSCE) is a valid, reliable assessment of veterinary students’ clinical skills that requires significant examiner training and scoring time. This article investigates the utility of video recording OSCEs by scoring them in real time with live examiners and afterward with video examiners from within and outside the learners’ home institution. Using checklists, learners (n=33) were assessed by one live examiner and five video examiners on three OSCE stations: suturing, arthrocentesis, and thoracocentesis. When stations were considered collectively, there was no difference in pass/fail outcomes between live and video examiners (χ2 = 0.37, p = .55). However, when stations were considered individually, station (χ2 = 16.64, p < .001) and the interaction between station and type of examiner (χ2 = 7.13, p = .03) demonstrated a significant effect on pass/fail outcome. Specifically, learners assessed on suturing by a video examiner had increased odds of passing that station compared with the arthrocentesis or thoracocentesis stations. Internal consistency was fair to moderate (0.34–0.45). Inter-rater reliability measures varied but were mostly moderate to strong (0.56–0.82). Video examiners spent longer assessing learners than live examiners (mean of 21 min/learner vs. 13 min/learner). Station-specific differences among video examiners may be due to intermittent visibility issues during video capture. Overall, video recording learner performances appears reliable and feasible, although time, cost, and technical issues may limit its routine use.
Publisher
University of Toronto Press Inc. (UTPress)
Subject
General Veterinary, Education, General Medicine
Cited by
11 articles.