BACKGROUND
Digital technologies have become part of the fabric of daily life in the United States, where a majority of adults use at least one social media site (SMS). Many leading SMS have published regulations, including their “Terms of Use” and “Community Guidelines,” that provide guidance on what content users may post and the actions the SMS will take in response to violations.
OBJECTIVE
The goal of this analysis is to determine how public health-related topics are addressed by SMS in their policy documents.
METHODS
A list of leading SMS in the US was compiled: Facebook, Twitter, Instagram, YouTube, Pinterest, LinkedIn, and Snapchat. Both the Terms of Use and the Community Guidelines documents were analyzed for each site (N=7). A coding scheme was developed through an iterative process to identify guidelines set by SMS regarding content areas relevant to public health. The final coding scheme consisted of 8 distinct guidelines (3 related to regulations and 5 related to actions taken in response to violations) covering 11 public health content areas. Documents were coded by two independent coders for the presence of each guideline.
RESULTS
With respect to regulations, in the Terms of Use, all 7 sites had general regulations prohibiting the posting of objectionable content, whereas 3 or fewer sites had regulations addressing specific content areas. In the Community Guidelines, 6 of 7 sites had general regulations, and all sites had regulations against threats and violence, bullying/harassment, hate speech, sexual content (including sexual content involving children), and content from dangerous organizations. No sites had prohibitions against misinformation. With respect to actions taken in response to violations, all sites stated in the Terms of Use that they would remove content that violated regulations, and 6 sites stated the same in the Community Guidelines. Additional actions commonly mentioned in the Terms of Use included suspending accounts or banning users; less common were referral to law enforcement and provision of resources. Additional actions commonly mentioned in the Community Guidelines included suspending accounts or banning users, providing resources, and referral to law enforcement. Less common, though present, was adding a warning.
CONCLUSIONS
SMS policy documents prohibit the posting of public health-related content by their users and specify the actions service providers will take in response to violations. A better understanding of SMS policies could help identify omitted topics of interest and support advocacy for their inclusion.