Affiliation:
1. Assessment Consultant and Psychometrician, Construct Measures, an organization focused on assessment within regulated health professions, Toronto, Ontario, Canada
2. Department of Medicine, McMaster University, Hamilton, Ontario, Canada
3. Research and Analysis Department, Touchstone Institute, Toronto, Ontario, Canada
Abstract
Rationale: Objective Structured Clinical Examinations (OSCEs) are widely used for assessing clinical competence, especially in high‐stakes environments such as medical licensure. However, the reuse of OSCE cases across multiple administrations raises concerns about parameter stability, known as item parameter drift (IPD).

Aims & Objectives: This study aims to investigate IPD in reused OSCE cases while accounting for examiner scoring effects using a Many‐facet Rasch Measurement (MFRM) model.

Method: Data from 12 OSCE cases, reused over seven administrations of the Internationally Educated Nurse Competency Assessment Program (IENCAP), were analyzed using the MFRM model. Each case was treated as an item, and examiner scoring effects were accounted for in the analysis.

Results: Despite accounting for examiner effects, all cases exhibited some level of IPD, with an average absolute IPD of 0.21 logits. Three cases showed positive directional trends. IPD significantly affected score decisions in 1.19% of estimates, at an invariance violation of 0.58 logits.

Conclusion: These findings suggest that while OSCE cases demonstrate sufficient stability for reuse, continuous monitoring is essential to ensure the accuracy of score interpretations and decisions. The study provides an objective threshold for detecting concerning levels of IPD and underscores the importance of addressing examiner scoring effects in OSCE assessments. The MFRM model offers a robust framework for tracking and mitigating IPD, contributing to the validity and reliability of OSCEs in evaluating clinical competence.
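For readers unfamiliar with the MFRM model named above, a standard three-facet formulation (following Linacre's many-facet Rasch measurement; the specific parameterization used in this study is not stated in the abstract) models the log-odds of an examinee receiving rating category k rather than k−1 as:

```latex
\log\left(\frac{P_{nijk}}{P_{nij(k-1)}}\right) = B_n - D_i - C_j - F_k
```

where \(B_n\) is the ability of examinee \(n\), \(D_i\) the difficulty of OSCE case \(i\), \(C_j\) the severity of examiner \(j\), and \(F_k\) the threshold between rating categories \(k-1\) and \(k\). Estimating \(C_j\) as a separate facet is what allows case difficulty drift (changes in \(D_i\) across administrations) to be examined independently of examiner scoring effects.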