BACKGROUND
Competence-based medical education requires robust data to link competence with clinical experiences. The SARS-CoV-2 pandemic abruptly altered the standard trajectory of clinical exposure in medical training programs. Residency program directors were tasked with identifying and addressing the resultant gaps in each trainee’s experiences using existing tools.
OBJECTIVE
To demonstrate a feasible and efficient method to capture electronic health record (EHR) data that measures the volume and variety of pediatric resident clinical experiences from a continuity clinic; generate individual-, class-, and graduate-level benchmark data; and create a visualization for learners to quickly identify gaps in clinical experiences.
METHODS
This study was conducted in a large, urban pediatric residency program from 2016 to 2022. Through consensus, five pediatric faculty identified the diagnostic groups pediatric residents should see to be competent in outpatient pediatrics. Institutional business analysts used ICD-10 codes corresponding to each diagnostic group to extract EHR patient encounter data as an indicator of exposure to each diagnosis. The frequency (volume) and diagnosis types (variety) seen by active residents (classes of 2020-2022) were compared with class averages and with graduated-resident (classes of 2016-2019) averages. These data were converted to percentages and translated into a radar chart visualization that allowed residents to quickly compare their current clinical experiences with those of peers and graduates. Residents were surveyed on the utility of these data and the visualization for identifying training gaps.
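The percentage conversion described above can be sketched as follows. This is an illustrative example only, not the authors' actual pipeline; the diagnostic groups, counts, and function name are hypothetical, and the resulting percentages are what each radar-chart axis would display.

```python
# Hypothetical graduate-benchmark averages per diagnostic group
# (illustrative values, not study data).
GRADUATE_AVG = {"asthma": 40, "otitis media": 25, "ADHD": 15, "eczema": 10}

def benchmark_percentages(resident_counts, benchmark=GRADUATE_AVG):
    """Express a resident's encounter counts as a percentage of the
    benchmark average for each diagnostic group; groups the resident
    has never seen appear as 0.0, flagging a gap."""
    return {
        dx: round(100 * resident_counts.get(dx, 0) / avg, 1)
        for dx, avg in benchmark.items()
    }

# Example: a resident with no eczema encounters and few ADHD encounters
resident = {"asthma": 30, "otitis media": 25, "ADHD": 3}
print(benchmark_percentages(resident))
# → {'asthma': 75.0, 'otitis media': 100.0, 'ADHD': 20.0, 'eczema': 0.0}
```

Plotting these percentages on the radial axes of a polar chart (one axis per diagnostic group) yields the radar visualization, with the 100% ring marking the graduate benchmark.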
RESULTS
Patient encounter data about clinical experiences were extracted for 102 residents (N=52 graduates). Active residents (N=50) received data reports with radar charts twice per year: three reports for the classes of 2020 and 2021 and two for the class of 2022. Radar charts clearly demonstrated gaps in diagnosis exposure compared with classmates and graduates. Residents found the visualization useful for setting learning goals.
CONCLUSIONS
This pilot describes an innovative method of capturing and presenting data about resident clinical experiences, benchmarked against peers and graduates, to identify learning gaps that may result from disruptions or modifications in medical training. The methodology can be applied across specialties and institutions, and the aggregated data could inform competence-based medical education.