Affiliation:
1. Empirical Education Inc, 212 University Avenue, Suite 729, Berkeley, CA, 94704 USA
Abstract
In the current socio-political climate, there is extra urgency to evaluate whether program impacts in education are distributed fairly across important student groups. Both experimental and quasi-experimental designs (QEDs) can contribute to answering this question. This work demonstrates that QEDs that compare outcomes across higher-level implementation units, such as schools, are especially well suited to providing evidence on differential program effects across student groups. By differencing away site-level (macro) effects, such designs on average produce estimates of differential impact that are closer to experimental benchmark results than are estimates of average impact based on the same design. This work argues for the importance of routinely evaluating moderated impacts, describes the differencing procedure, and empirically tests the methodology with seven impact evaluations in education. The hope is to encourage broader use of this design type to more efficiently develop the evidence base for differential program effects, particularly for underserved students.
Subject
Strategy and Management, Sociology and Political Science, Education, Health (social science), Social Psychology, Business and International Management
Cited by
2 articles.