Affiliation:
1. University of Notre Dame, IN, USA
2. National Chiao Tung University, Hsinchu City, Taiwan
3. Educational Testing Service, Princeton, NJ, USA
4. University of Illinois at Urbana–Champaign, IL, USA
Abstract
Differential item functioning (DIF) analysis is an important step in the data analysis of large-scale testing programs. Nowadays, many such programs endorse matrix sampling designs, such as the balanced incomplete block (BIB) design, to reduce the load on examinees. These designs pose challenges to traditional DIF analysis methods. For example, because difficulty levels often vary across booklets, examinees with the same booklet score may differ in ability. Consequently, DIF procedures that match on total scores at the booklet level may misplace examinees and inflate measurement errors. Modifying traditional DIF procedures to better accommodate the BIB design therefore becomes important. This article introduces a modification of the current simultaneous item bias test (SIBTEST) procedure for DIF analysis when multiple booklets are used. More specifically, examinees are pooled across booklets, and matching is based on transformed booklet scores obtained after common-block equating/linking. Simulations are conducted to compare the performance of this new method, the equated pooled booklet method, against that of the current pooled booklet method in terms of both Type I error control and power. Four factors are considered in the simulation: DIF effect size, item difficulty, impact, and the length of the common block. Results show that the equated pooled booklet method generally improves power while keeping Type I error under control. The advantage of the new method is most pronounced when the traditional method struggles, for example, when the item is difficult or when impact is present.
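The abstract describes placing booklet scores on a common scale through the shared block, pooling examinees across booklets, and then matching on the transformed scores before applying SIBTEST. The sketch below is a minimal illustration of that idea under stated assumptions: the abstract does not specify the transformation, so a mean-sigma linear linking based on common-block statistics is assumed, and the DIF index is a simplified SIBTEST-style beta statistic without the regression correction used in the operational procedure. All variable names and the toy data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def link_scores(common_ref, common_target, target_scores):
    """Mean-sigma linear linking of one booklet's total scores onto the
    reference booklet's scale, using scores on the shared (common) block.
    Hypothetical helper: the abstract does not specify the linking formula."""
    slope = common_ref.std(ddof=1) / common_target.std(ddof=1)
    intercept = common_ref.mean() - slope * common_target.mean()
    return slope * target_scores + intercept

def beta_uni(matching, group, item_resp):
    """SIBTEST-style weighted difference in studied-item proportion correct
    between reference (group == 0) and focal (group == 1) examinees matched
    on rounded equated scores. Illustrative only: operational SIBTEST also
    applies a regression correction to the matched subtest means."""
    strata = np.round(matching).astype(int)
    beta = 0.0
    for s in np.unique(strata):
        mask = strata == s
        ref = item_resp[mask & (group == 0)]
        foc = item_resp[mask & (group == 1)]
        if ref.size and foc.size:
            beta += mask.mean() * (ref.mean() - foc.mean())
    return beta

# Toy example: two booklets share a 10-item common block; booklet 1's
# unique block is harder, so raw totals are not comparable across booklets.
n = 500
theta = rng.normal(size=2 * n)
group = rng.integers(0, 2, size=2 * n)   # 0 = reference, 1 = focal
booklet = np.repeat([0, 1], n)

common = rng.binomial(10, 1 / (1 + np.exp(-theta)))
unique = rng.binomial(20, 1 / (1 + np.exp(-(theta - 0.5 * booklet))))
raw_total = common + unique

# Equate booklet 1 totals to booklet 0's scale via the common block, then pool.
equated = raw_total.astype(float)
equated[booklet == 1] = link_scores(common[booklet == 0],
                                    common[booklet == 1],
                                    raw_total[booklet == 1])

studied = rng.binomial(1, 1 / (1 + np.exp(-theta)))   # studied item, no DIF built in

print("beta_uni on equated pooled scores:", beta_uni(equated, group, studied))
```

In the operational procedure a significance test based on the standard error of beta would follow; that step, and the regression correction, are omitted from this sketch.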
Subject
Psychology (miscellaneous), Social Sciences (miscellaneous)
Cited by
3 articles.