Use of Rasch Modeling and Focus Groups to Inform the Training of Teacher Evaluators
Published: 2021-04-30
Issue: 2
Volume: 13
Pages: 213-227
ISSN: 1834-0806
Container-title: International Journal of Multiple Research Approaches
Short-container-title: IJMRA
Author:
Lambert, Richard G. (1); Moore, C. Missy (2); Bottoms, Bryndle L. (1); Vestal, Amanda (1); Taylor, Heather (3)
Affiliation:
1. University of North Carolina at Charlotte, Charlotte, NC, USA
2. University of Georgia, Athens, GA, USA
3. Appalachian State University, Boone, NC, USA
Abstract
There are few empirical studies of teacher performance evaluation systems. Teachers are rightfully concerned about the degree to which evaluators’ idiosyncratic biases might undermine the process. Training evaluators thoroughly and monitoring the reliability, validity, fairness, and cultural sensitivity of their ratings are essential steps towards promoting strong performance evaluation systems. This study examined the process of evaluating early childhood teachers to inform evaluator training. The researchers sought to determine the degree to which the expectations of those who develop training materials and conduct evaluator trainings differ from the typical performance ratings given by evaluators in the field. Researchers used several methods to prompt a systematic examination of the evaluator training process across four sequential phases of investigation: (a) quantitative panel ratings of item difficulty, (b) panel discussion and consensus building (a qualitative phase), (c) examining expected versus empirical item difficulty (a quantitative phase), and (d) presenting the empirical difficulty levels to the panel for discussion (a qualitative phase). In this last phase, researchers presented results of Rasch modeling to the panel, along with levels of agreement between the empirical and expected difficulty levels. Panel members reported that the process of discussing their perceptions of expected item difficulty levels was valuable. They also reported that such discussion prompted them to reevaluate the training materials, the resource manuals, and other professional development resources. The study methods presented can be used to investigate and to improve other personnel evaluation systems.
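Note on the measurement model (a minimal sketch for orientation, not reproduced from the article): the empirical item difficulty levels referred to in the abstract come from Rasch modeling of evaluators' ratings. In the simplest dichotomous form of the model, with person measure \theta_n and item difficulty \delta_i, the probability that person n succeeds on item i is

\[
P(X_{ni} = 1 \mid \theta_n, \delta_i) = \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)}
\]

so items with larger \delta_i are empirically harder to receive high ratings on. Evaluation rubrics with multiple rating categories would more likely call for a polytomous (rating-scale or partial-credit) extension of this model; the notation above is standard Rasch notation, not the authors' own.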
Publisher
Dialectical Publishing
Cited by: 1 article