Abstract
Fair performance assessment requires consideration of the effects of rater severity on scoring. The many-facet Rasch model (MFRM), an item response theory model that incorporates rater severity parameters, has been widely used for this purpose. Although a typical MFRM assumes that rater severity does not change during the rating process, in actuality rater severity is known to change over time, a phenomenon called rater severity drift. To investigate this drift, several extensions of the MFRM have been proposed that incorporate time-specific rater severity parameters. However, these previous models estimate the severity parameters under the assumption of temporal independence. This introduces inefficiency into the parameter estimation because severities at adjacent time points tend to be temporally dependent in practice. To resolve this problem, we propose a Bayesian extension of the MFRM that incorporates time dependency for the rater severity parameters, based on a Markov modeling approach. The proposed model can improve the estimation accuracy of the time-specific rater severity parameters, resulting in improved estimation accuracy for the other rater parameters and improved model fit. We demonstrate the effectiveness of the proposed model through simulation experiments and application to actual data.
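The core idea of the abstract can be illustrated with a small simulation. The sketch below is a hypothetical, simplified illustration, not the paper's actual model: it assumes a dichotomous Rasch-style formulation (the MFRM in the paper handles polytomous rating categories and additional rater parameters), and it models rater severity drift as a first-order Markov random walk, so that each rater's severity at time t depends on their severity at time t-1. All variable names (`theta`, `severity`, `tau`, etc.) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_examinees, n_raters, n_times = 100, 5, 4
theta = rng.normal(0.0, 1.0, n_examinees)    # examinee ability
severity0 = rng.normal(0.0, 0.5, n_raters)   # initial rater severity
tau = 0.2                                    # drift scale (Markov step SD)

# Time-dependent severity via a first-order Markov (random-walk) process:
#   r[j, t] ~ Normal(r[j, t-1], tau^2)
# This is the kind of temporal dependency the proposed model exploits,
# in contrast to prior models that treat each r[j, t] as independent.
severity = np.empty((n_raters, n_times))
severity[:, 0] = severity0
for t in range(1, n_times):
    severity[:, t] = severity[:, t - 1] + rng.normal(0.0, tau, n_raters)

def rate(p, j, t):
    """Simulate a dichotomous rating of examinee p by rater j at time t:
    P(positive rating) = sigmoid(theta_p - r[j, t])."""
    logit = theta[p] - severity[j, t]
    prob = 1.0 / (1.0 + np.exp(-logit))
    return rng.binomial(1, prob)

# One rating per (examinee, rater) pair at each time point.
ratings = np.array([[rate(p, j, t) for t in range(n_times)]
                    for p in range(n_examinees) for j in range(n_raters)])
```

Because adjacent severities differ only by a Normal(0, tau^2) step, pooling information across neighboring time points (as a Bayesian Markov prior does) reduces the variance of each time-specific severity estimate relative to estimating them independently.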
Funder
Japan Society for the Promotion of Science
Publisher
Springer Science and Business Media LLC
Subject
General Psychology, Psychology (miscellaneous), Arts and Humanities (miscellaneous), Developmental and Educational Psychology, Experimental and Cognitive Psychology
Cited by
7 articles.