Affiliation:
1. Queensland University of Technology, Australia
Abstract
This article reports on a thematic content analysis of 486 newsroom posts published between 2016 and 2021 by five prominent digital platforms (Facebook, Tinder, YouTube, TikTok, and Twitter). We aimed to understand how these platforms frame and define the issues of harm and safety, and to identify the interventions they publicly report introducing to address these issues. We found that platforms respond to and draw upon external controversies and media panics to selectively construct matters of concern related to safety and harm. They then reactively propose solutions that serve as justification for further investment in and scaling up of automated, data-intensive surveillance and verification technologies. We examine four key themes in the data: locating harm with bad actors and discrete content objects (Theme 1), framing surveillance and policing as solutions to harm (Theme 2), policing “borderline” content through suppression strategies (Theme 3), and performing diversity and inclusion (Theme 4).
Funder
Australian Research Council Centre of Excellence for Automated Decision-Making and Society
Subject
Computer Science Applications, Communication, Cultural Studies
Cited by
12 articles.