Author:
Elahe Moradi, Zargham Ghabanchi, Reza Pishghadam
Abstract
Given the significance of test fairness, this study investigated a reading comprehension test for evidence of differential item functioning (DIF) based on English as a Foreign Language (EFL) learners’ gender and mode of learning (conventional vs. distance learning). To this end, 514 EFL learners took a 30-item multiple-choice reading comprehension test. After establishing the unidimensionality and local independence of the data as prerequisites to DIF analysis, Rasch model-based DIF analyses were conducted across learners’ gender and mode of learning. The results revealed two gender-DIF items, both functioning differentially in favor of female respondents. DIF analysis with respect to mode of learning, by contrast, identified no DIF items, indicating that the test functioned in the same way for learners in conventional settings as for those who experienced distance and self-directed learning. Finally, the findings are discussed and implications are provided.
Publisher
Springer Science and Business Media LLC
Subject
Linguistics and Language, Language and Linguistics
Reference47 articles.
1. Alderson, J. C. (2000). Assessing reading. Cambridge: Cambridge University Press.
2. Allalouf, A., & Abramzon, A. (2008). Constructing better second language assessment based on. Language Assessment Quarterly, 5(2), 120–141.
3. American Educational Research Association, American Psychological Association, & National Council on Measurement in Education (AERA/APA/NCME). (2014). Standards for educational and psychological testing. Washington, DC: AERA.
4. Aryadoust, V., Goh, C., & Lee, O. K. (2011). An investigation of differential item functioning in the MELAB Listening Test. Language Assessment Quarterly, 8(4), 361–385.
5. Banerjee, J., & Papageorgiou, S. (2016). What’s in a topic? Exploring the interaction between test-taker age and item content in high-stakes testing. International Journal of Listening, 30(2), 8–24. https://doi.org/10.1080/10904018.2015.1056876.