Affiliation:
1. The Chinese University of Hong Kong, Shatin, NT, Hong Kong, SAR
2. Johns Hopkins University, Baltimore, MD
Abstract
As evidence becomes increasingly important in educational policy, it is essential to understand how research design might contribute to reported effect sizes in experiments evaluating educational programs. A total of 645 studies from 12 recent reviews of evaluations of preschool, reading, mathematics, and science programs were examined. Effect sizes were roughly twice as large for published articles, small-scale trials, and experimenter-made measures as for unpublished documents, large-scale studies, and independent measures, respectively. Effect sizes were significantly higher in quasi-experiments than in randomized experiments. Excluding tutoring studies, there were no significant differences in effect sizes between elementary and middle/high school studies. Regression analyses found that the effects of each factor were maintained after controlling for all other factors. Explanations for the effects of methodological features on effect sizes are discussed, as are implications for evidence-based policy.
Publisher
American Educational Research Association (AERA)
Cited by
343 articles.