Evaluating the Multiple-Choice Questions Quality at the College of Medicine, University of Bisha, Saudi Arabia: A Three-Year Experience

Authors:

Eleragi A. M. S.1, Miskeen Elhadi1, Hussein Kamal1, Rezigalla Assad Ali1, Adam Masoud I. E.1, Al-Faifi Jaber Ahmed1, Alhalafi Abdullah1, Ameer Ahmed Y. Al1, Mohammed Osama A.1

Affiliation:

1. University of Bisha

Abstract


Background: Assessment is a central tool that drives and shapes students' learning. Multiple-choice questions (MCQs) are crucial in medical education assessment because they can evaluate knowledge across large cohorts. High-quality items help achieve the learning objectives and yield trustworthy results. This study aims to evaluate the quality of the MCQs used in the final examinations of the Principal of Diseases (PRD) course over three academic years at the College of Medicine, University of Bisha, Saudi Arabia.

Method: This cross-sectional, institution-based study was conducted at the College of Medicine, University of Bisha (UBCOM), Saudi Arabia (SA), using the final examinations of the PRD course for the academic years 2016–2019. Item analysis (IA) was applied to the PRD final theoretical examinations of 2016–2017, 2017–2018, and 2018–2019, which comprised 80, 70, and 60 MCQ items, respectively (210 in total). The IA targeted reliability (KR-20), difficulty index (DIF), discrimination index (DI), and distractor effectiveness (DE). The generated data were analyzed using SPSS (version 25.0), with statistical significance set at P < 0.05.

Results: The examinations included 210 items. Reliability (KR-20) ranged from 0.804 to 0.906. By DI, 56.7% of items were excellent, 20.9% good, 13.8% poor, and 8.6% defective. By DIF, 50.5% of items had acceptable difficulty, 37.6% were easy, and 11.9% were difficult. DE analysis revealed that 70.2% of distractors were functional, with a significant correlation between DI, DIF, and DE (P < 0.05).

Conclusion: The analyzed MCQs showed good discrimination and acceptable difficulty, making them generally of high quality. The study underscores the importance of continuous item analysis to maintain and improve the quality of assessment tools used in medical education.
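The metrics named in the abstract (KR-20 reliability, difficulty index, discrimination index) are standard psychometric formulas that can be computed directly from a scored response matrix. The sketch below illustrates those standard formulas on synthetic data; it is not the authors' analysis pipeline (the study used SPSS), and the response matrix, group fraction, and cut-offs shown are illustrative assumptions. Distractor effectiveness is omitted because it requires option-level (not just right/wrong) responses.

```python
import numpy as np

# Hypothetical scored response matrix: rows = students, columns = MCQ items,
# entries are 1 (correct) or 0 (incorrect). Synthetic data for illustration.
rng = np.random.default_rng(0)
responses = (rng.random((100, 10)) > 0.4).astype(int)

def kr20(x):
    """Kuder-Richardson formula 20 reliability for dichotomous items."""
    k = x.shape[1]
    p = x.mean(axis=0)                      # proportion correct per item
    q = 1 - p
    total_var = x.sum(axis=1).var(ddof=1)   # variance of students' total scores
    return (k / (k - 1)) * (1 - (p * q).sum() / total_var)

def difficulty(x):
    """Difficulty index (DIF): proportion of students answering each item correctly."""
    return x.mean(axis=0)

def discrimination(x, frac=0.27):
    """Discrimination index (DI): proportion correct in the top group minus the
    bottom group, using the conventional top/bottom 27% by total score."""
    order = np.argsort(x.sum(axis=1))
    n = max(1, int(round(frac * x.shape[0])))
    lower, upper = x[order[:n]], x[order[-n:]]
    return upper.mean(axis=0) - lower.mean(axis=0)

dif = difficulty(responses)
di = discrimination(responses)
print(f"KR-20 = {kr20(responses):.3f}")
# Cut-offs below are common conventions, assumed for illustration.
print("items with acceptable difficulty (0.3-0.7):", int(((dif >= 0.3) & (dif <= 0.7)).sum()))
print("items with excellent discrimination (>= 0.4):", int((di >= 0.4).sum()))
```

Because the synthetic responses are random, the computed KR-20 here will be near zero; on a real exam with internally consistent items it approaches the 0.804–0.906 range reported in the study.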

Publisher

Springer Science and Business Media LLC

References: 33 articles.

