Authors:
Sbeglia, Gena C.; Nehm, Ross H.
Abstract
Background
Policy documents like Vision and Change and the Next Generation Science Standards emphasize the importance of using constructed-response assessments to measure student learning, but little work has examined the extent to which administration conditions (e.g., participation incentives, end-of-course timing) bias the inferences about learning drawn from such instruments. This study investigates potential biases in the measurement of evolution understanding (single time point) and learning (pre-post) using a constructed-response instrument.
Methods
The constructed-response ACORNS instrument (Assessment of COntextual Reasoning about Natural Selection) was administered at the beginning of the semester, during the final exam, and at the end of the semester to large samples of North American undergraduates (N = 488–1379, 68–96% participation rate). Three ACORNS scores were studied: the number of evolutionary core concepts (CC), the presence of evolutionary misconceptions (MIS), and the presence of normative scientific reasoning across contexts (MODC). Hierarchical logistic and linear models (HLMs) were used to study the impact of participation incentives (regular credit vs. extra credit) and end-of-course timing (final exam vs. post-test) on inferences about evolution understanding (single time point) and learning (pre-post) derived from the three ACORNS scores. The analyses also explored whether results were generalizable across race/ethnicity and gender.
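As a point of reference for the modeling approach named above, the following is a minimal sketch of how such hierarchical logistic and linear models can be specified in R with the lme4 package cited in reference 3. It illustrates the general technique, not the authors' published analysis code; the variable names (mis, cc, incentive, timing, student_id) and the data frame (acorns_scores) are hypothetical.

library(lme4)

# MIS is scored as present/absent per response, so a hierarchical
# logistic model with a random intercept for each student (students
# contribute multiple responses):
mis_model <- glmer(mis ~ incentive + timing + (1 | student_id),
                   data   = acorns_scores,  # hypothetical score table
                   family = binomial)

# CC is the number of core concepts per response, modeled here with a
# hierarchical linear model, matching the abstract's linear HLMs:
cc_model <- lmer(cc ~ incentive + timing + (1 | student_id),
                 data = acorns_scores)

summary(mis_model)
summary(cc_model)

A pre-post (learning) version of such a model would additionally include a time-point predictor and its interaction with the administration condition of interest.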
Results
Variation in participation incentives and end-of-course ACORNS administration timing did not meaningfully impact inferences about evolution understanding (i.e., interpretations of CC, MIS, and MODC magnitudes at a single time point); all comparisons were either nonsignificant or, when significant, had small effect sizes. Likewise, participation incentives and end-of-course timing did not meaningfully impact inferences about evolution learning (i.e., interpretations of CC, MIS, and MODC changes through time). These findings were consistent across race/ethnicity and gender groups.
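For context, "small" here refers to conventional effect-size benchmarks. Assuming, for illustration, that a standardized mean difference such as Cohen's d was used for the continuous CC comparisons (the abstract does not name the specific measure):

\[ d = \frac{\bar{x}_1 - \bar{x}_2}{s_p}, \qquad s_p = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}} \]

where groups 1 and 2 are the two administration conditions being compared; by Cohen's conventions, |d| near 0.2 is read as small, 0.5 as medium, and 0.8 as large.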
Conclusion
Inferences about evolution understanding and learning derived from ACORNS scores were in most cases robust to variations in participation incentives and end-of-course timing, suggesting that educators have some flexibility in when and how they deploy the ACORNS instrument.
Funder
This study was funded by the Howard Hughes Medical Institute Inclusive Excellence science education award.
Publisher
Springer Science and Business Media LLC
Subject
Education; Ecology, Evolution, Behavior and Systematics
References: 56 articles.
1. American Association for the Advancement of Science. Vision and change in undergraduate biology education: a call to action. Washington, DC: Directorate for Biological Sciences; 2011.
2. Andrews TM, Leonard MJ, Colgrove CA, Kalinowski ST. Active learning not associated with student learning in a random sample of college biology courses. CBE Life Sci Educ. 2011;10:394–405.
3. Bates D, Maechler M, Bolker B, Walker S, Christensen RHB, Singmann H, Dai B. lme4: Linear mixed-effects models using 'eigen' and S4. R package. 2021.
4. Beggrow EP, Ha M, Nehm RH, Pearl D, Boone WJ. Assessing scientific practices using machine-learning methods: How closely do they match clinical interview performance? J Sci Educ Technol. 2014;23:160–82.
5. Bishop BA, Anderson CW. Student conceptions of natural selection and its role in evolution. J Res Sci Teach. 1990;27:415–27.
Cited by: 1 article.