Affiliation:
1. Gazi University, Gazi Faculty of Education
2. Hacettepe University
Abstract
This study compared item difficulty estimated from expert judgment with empirical item difficulty derived from response data. Because some high-stakes tests are not pretested for security reasons, and because teachers estimate item difficulty in classroom assessments, it is important to examine how accurately experts predict difficulty. We developed a 12-item test similar to the Turkish teacher certification exam, then estimated and compared item difficulty separately from the responses of 1,165 students and the judgments of 12 experts. The experts estimated difficulty well for items of moderate difficulty, but they tended to underestimate the difficulty of items classified as medium-easy, judging them easier than the response data showed.
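The comparison rests on classical item difficulty, the proportion of examinees who answer an item correctly (the p-value). The sketch below is not taken from the article; it only illustrates, under assumed data, how data-based difficulties could be computed from a 0/1 response matrix and set against expert estimates. The array shapes mirror the study (1,165 examinees, 12 items, 12 experts), but all values and variable names are placeholders.

```python
import numpy as np

# Illustrative sketch only: classical item difficulty (p-value) is the share of
# examinees answering each item correctly. All data below are random placeholders
# standing in for the study's real responses and expert judgments.
rng = np.random.default_rng(0)

responses = rng.integers(0, 2, size=(1165, 12))   # 0/1 answers: 1165 examinees x 12 items
expert_p = rng.uniform(0.2, 0.9, size=(12, 12))   # 12 experts' estimated p-values per item

empirical_p = responses.mean(axis=0)              # data-based difficulty per item
estimated_p = expert_p.mean(axis=0)               # panel mean of expert estimates

# Positive difference: experts expected a higher proportion correct than observed,
# i.e., they judged the item easier than it was (underestimated its difficulty).
difference = estimated_p - empirical_p
agreement = np.corrcoef(estimated_p, empirical_p)[0, 1]

print(np.round(empirical_p, 2), np.round(difference, 2), round(float(agreement), 2))
```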
Publisher
Eğitimde ve Psikolojide Ölçme ve Değerlendirme Dergisi
Subject
Developmental and Educational Psychology, Education