Affiliation:
1. Claremont Graduate University, Claremont, CA, USA
Abstract
This article explores the viability of online crowdsourcing for creating matched-comparison groups. This exploratory study compares survey results from a randomized control group with survey results from a matched-comparison group recruited through Amazon's Mechanical Turk (MTurk) crowdsourcing service to determine their comparability. Study findings indicate that online crowdsourcing, a process that provides access to many participants who complete specific tasks, is a potentially viable resource for evaluation designs in which access to comparison groups, large budgets, and/or time is limited. The article highlights the strengths and limitations of the online crowdsourcing approach and describes ways it could be used in evaluation practice.
Subject
Strategy and Management, Sociology and Political Science, Education, Health (Social Science), Social Psychology, Business and International Management
Cited by
27 articles.