Experiences of moderation, moderators, and moderating by online users who engage with self-harm and suicide content

Authors:

Zoë Haime, Laura Kennedy, Lydia Grace, Lucy Biddle

Abstract

Online mental health spaces require effective content moderation for safety. Whilst policies acknowledge the need for proactive practices and moderator support, expectations and experiences of internet users engaging with self-harm and suicide content online remain unclear. Therefore, this study aimed to explore participant accounts of moderation, moderators and moderating when engaging online with self-harm/suicide (SH/S) related content.

Participants in the DELVE study were interviewed about their experiences with SH/S content online. n=14 participants were recruited to interview at baseline, with n=8 completing the 3-month follow-up and n=7 the 6-month follow-up. Participants were also asked to complete daily diaries of their online use between interviews. Thematic analysis, with deductive coding informed by interview questions, was used to explore perspectives on moderation, moderators and moderating from interview transcripts and diary entries.

Three key themes were identified: ‘content reporting behaviour’, exploring factors influencing decisions to report SH/S content; ‘perceptions of having content blocked’, exploring participant experiences and speculative accounts of SH/S content moderation; and ‘content moderation and moderators’, examining participant views on moderation approaches, their own experiences of moderating, and insights for future moderation improvements.

This study revealed challenges in moderating SH/S content online and highlighted inadequacies in current procedures. Participants struggled to self-moderate online SH/S spaces, showing the need for proactive platform-level strategies. Additionally, whilst participants valued moderators with lived experience, the associated risks emphasised the need for supportive measures. Policymakers and industry leaders should prioritise transparent and consistent moderation practice.

Author Summary

In today’s digital world, ensuring the safety of online mental health spaces is vital. Yet, there’s still a lot we don’t understand about how people experience moderation, moderators, and moderating in online self-harm and suicide spaces. Our study set out to change that by talking to 14 individuals who engage with this content online. Through interviews and diaries, we learned more about their experiences with platform and online community moderation.

Our findings showed some important things. Firstly, individuals with declining mental health struggled to use tools that might keep them safe, such as reporting content. This emphasised the need for effective moderation in online mental health spaces to prevent harm. Secondly, unclear communication and inconsistent moderation practices led to confusion and frustration amongst users who reported content or had their own content moderated. Improving transparency and consistency would enhance user experiences of moderation online. Lastly, users encouraged the involvement of mental health professionals in online moderating teams, suggesting platforms and online communities should provide their moderation staff with training and supervision from professionals. These findings support our recommendations for ongoing changes to moderation procedures across online platforms.

Publisher

Cold Spring Harbor Laboratory
