Abstract
Background
Bristol Medical School has adopted a near-peer-led teaching approach to deliver Basic Life Support training to first-year undergraduate medical students. In sessions delivered to large cohorts, it proved challenging to identify early in the course which candidates were struggling with their learning. We developed and piloted a novel online performance scoring system to better track and highlight candidate progress.
Methods
During this pilot, a 10-point scale was used to evaluate candidate performance at six time-points during their training. The scores were collated and entered on an anonymised secure spreadsheet, which was conditionally formatted to provide a visual representation of each score. A one-way ANOVA was performed on the scores, and trends were analysed during each course to review candidate trajectory. Descriptive statistics were assessed. Values are presented as mean scores with standard deviation (x̄ ± SD).
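The trend analysis described above can be sketched in a few lines. This is a minimal stdlib-only illustration, not the authors' actual analysis pipeline (which used a spreadsheet): the candidate scores are hypothetical, and the one-way ANOVA F statistic is computed directly from its between-group and within-group sums of squares.

```python
from statistics import mean

def one_way_anova_f(groups):
    """One-way ANOVA F statistic across groups of scores.

    F = (SSB / (k - 1)) / (SSW / (N - k)), where SSB is the
    between-group and SSW the within-group sum of squares.
    """
    k = len(groups)                               # number of groups (timepoints)
    n = sum(len(g) for g in groups)               # total observations
    grand = sum(sum(g) for g in groups) / n       # grand mean
    ssb = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ssw = sum(sum((x - mean(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Hypothetical 10-point scores for five candidates at six timepoints
scores = [
    [4, 5, 4, 6, 5],   # timepoint 1
    [5, 6, 5, 6, 5],
    [6, 6, 6, 7, 6],
    [6, 7, 7, 7, 6],
    [7, 8, 7, 8, 7],
    [8, 8, 8, 9, 8],   # timepoint 6
]
f_stat = one_way_anova_f(scores)
print(f"F = {f_stat:.2f}")
```

In practice the F statistic would be compared against the F distribution with (k − 1, N − k) degrees of freedom to obtain the reported P value.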
Results
A significant linear trend was demonstrated (P < 0.001) for the progression of candidates over the course. The average session score increased from 4.61 ± 1.78 at the start to 7.92 ± 1.22 at the end of the final session. A threshold of less than 1 SD below the mean was used to identify struggling candidates at any of the six given timepoints. This threshold enabled efficient highlighting of struggling candidates in real time.
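The flagging threshold can be expressed as a simple rule: at each timepoint, any candidate scoring below (cohort mean − 1 SD) is highlighted. The sketch below assumes hypothetical anonymised candidate IDs and scores; the real system applied this rule via conditional formatting in a spreadsheet rather than code.

```python
from statistics import mean, stdev

def flag_struggling(timepoint_scores, threshold_sd=1.0):
    """Return candidate IDs scoring more than `threshold_sd` standard
    deviations below the cohort mean at a single timepoint."""
    values = list(timepoint_scores.values())
    cutoff = mean(values) - threshold_sd * stdev(values)
    return sorted(cid for cid, s in timepoint_scores.items() if s < cutoff)

# Hypothetical 10-point scores at one timepoint for an anonymised cohort
timepoint = {"C01": 5, "C02": 6, "C03": 2, "C04": 5, "C05": 6, "C06": 5}
print(flag_struggling(timepoint))  # → ['C03']
```

Because the cutoff is recomputed per timepoint from the cohort's own scores, the rule adapts as the whole group improves over the course.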
Conclusions
Although the system will be subject to further validation, our pilot has shown that the use of a simple 10-point scoring system, combined with a visual representation of performance, helps to identify struggling candidates earlier across large cohorts of students undertaking skills training such as Basic Life Support. This early identification enables effective and efficient remedial support.
Publisher
Springer Science and Business Media LLC
Subject
Education, General Medicine
Cited by
1 article.