Abstract
Online social media platforms constantly struggle with harmful content such as misinformation and violence, yet effectively moderating and prioritizing such content for billions of global users with different backgrounds and values remains a challenge. Through an international survey of 1,696 internet users in 8 countries, this empirical study examines how international users perceive harmful content online, and the similarities and differences in their perceptions. We found that across countries, perceived severity consistently grew exponentially as the harmful content became more severe, but which types of harmful content were perceived as more or less severe varied significantly. Our results challenge the status quo of platform content moderation, which applies a one-size-fits-all approach to govern international users, and provide guidance on how platforms may prioritize and customize their moderation of harmful content.
Publisher
Public Library of Science (PLoS)
Cited by 33 articles.