Affiliation:
1. Beijing Academy of Social Sciences, China
2. Renmin University of China, China
Abstract
How platforms moderate online content remains "black-boxed" due to the diverging and sometimes conflicting logics of the different forces and stakeholders in the governance network. The key to unpacking the "black box" lies in meaningful human agency in balancing these logics in contextualized practice. This article examines human moderators' decision-making based on qualitative data drawn from leading Chinese platforms. Grounded theory analysis theorizes the various contributing (f)actors and introduces a model of "bounded discretion and tacit interaction in an elastic dynamic" to explain moderation decision-making. It argues that it is through bounded human agency that platform governance is enacted, negotiated, and further "black-boxed," as inevitable human inconsistencies and contingencies are molded into governance. This study provides useful categories for analyzing platform moderation, encodes meaningful human agency as bounded discretion in the governance network, and unpacks the "black box" of platform governance as an unfolding process of interactions among institutions, forces, and, most importantly, human participants.
Funder
National Social Science Fund of China