Abstract
This study explored challenges associated with online crowdsourced data collection, particularly in longitudinal tasks with time-sensitive outcomes such as response latencies. The research identified two significant sources of bias: technical shortcomings, such as low and variable frame rates, and human factors that contribute to high attrition rates. The study also explored potential solutions to these problems, including enforcing hardware acceleration and defining study-specific frame rate thresholds, as well as pre-screening participants and monitoring hardware performance and task engagement throughout each experimental session. These findings provide valuable insight into improving the quality and reliability of data collected via online crowdsourced platforms and underscore the need for researchers to be cognizant of potential pitfalls in online research.