Abstract
The purpose of this study was to examine and reduce differential item functioning (DIF) across gender and language groups in the VERA 8 tests. We used multigroup concurrent calibration under full and partial invariance assumptions based on the Rasch and two-parameter logistic (2PL) models, and classified students into proficiency levels based on their test scores and previously defined cut scores. The results indicated that some items showed gender- and language-specific DIF under the Rasch model, but no items with large misfit (suspected DIF) were detected under the 2PL model. When item parameters were estimated with the 2PL model under the partial invariance assumption (PI-2PL), only small or negligible misfit was found in the overall tests for both groups. We argue that the 2PL model should be preferred because both of its approaches produced less bias. However, particularly given the unequal sample sizes of German and non-German students, the non-German group showed the highest proportion of misfitting items. Although items with medium or small misfit had no significant effect on scores or performance classifications, items with large misfit changed the proportions of students at the highest and lowest performance levels.
Funder
BRIGITTE-SCHLIEBEN-LANGE-PROGRAMM
Cited by 1 article.