Affiliations:
1. Information School, University of Washington, Seattle, WA, USA
2. Center for an Informed Public, University of Washington, Seattle, WA, USA
Abstract
Recent work has demonstrated how content moderation practices on social media may unfairly affect marginalized individuals, for example by censoring women's bodies and misidentifying reclaimed terms as hate speech. This study documents and explores the direct experiences of marginalized creators who have been impacted by discriminatory content moderation on Instagram. Collaborating with our participants for over a year, we contribute five co-constructed narratives of discriminatory content moderation from advocates in trauma-informed care, LGBTQ+ sex education, anti-racism education, and beauty and body politics. In sharing these detailed personal accounts, we not only shed light on our participants' experiences of being unfairly blocked, banned, or deleted, but also delve deeper into the lasting impacts of those experiences on their livelihoods and mental health. Reflecting on their stories, we observe that content moderation on social media is deeply entangled with situated experiences of offline discrimination. Accordingly, we document how each participant experiences moderation through the lens of their often intersectional identities. Using participatory research methods, we collectively strategize ways to learn from these individual accounts and resist discriminatory content moderation, and we imagine possibilities for repair and accountability.
Publisher
Association for Computing Machinery (ACM)