Affiliation:
1. Department of Psychology, University of Michigan
Abstract
People often rely on scientific findings to help them make decisions. However, when effect magnitudes go unreported, readers may be biased toward assuming that findings are practically significant. Across two online studies (Prolific; N = 800), we measured U.S. adults’ endorsements of expensive interventions described in media reports; between groups, the reported effects were small, large, or of unreported magnitude. Participants who viewed interventions with unreported effect magnitudes were more likely to endorse the interventions than those who viewed small effects and were just as likely to endorse them as those who viewed large effects, suggesting a practical significance bias. When effect magnitudes were reported, participants on average adjusted their evaluations accordingly. However, some individuals, such as those with low numeracy skills, remained more likely than others to act on small effects, even when explicitly prompted to first consider the meaningfulness of the effect.
Funder
U.S. Department of Education’s Institute of Education Sciences