Abstract
<p style="text-align: justify;">Accurately assessing process and product skills in chemistry practicum activities requires appropriate measurement procedures, and it is crucial to identify the components that can introduce bias when measuring student abilities. This study aims to identify the components or criteria teachers use to assess student performance in practicum activities and to analyze the quality of the rubrics developed. The study involved three raters, 27 high school students, and nine assessment criteria. A quantitative descriptive approach was employed, with measurement based on many-facet Rasch model (MFRM) analysis. The MFRM analysis shows no significant measurement bias: the measurement facets fit the MFRM model, the reliability of all facets meets the criteria, and the rating scale functions appropriately. While all students can easily pass four of the nine items, the remaining five items can only be partially passed. The assessment criteria that require special attention are communication skills, tools and assembly, interpretation, cleanliness, and accuracy during practicum work. These criteria provide feedback to teachers and students to ensure successful practicum activities. The discussion section elaborates on the findings and their implications.</p>
Publisher
Eurasian Society of Educational Research