Defaulting to boilerplate answers, they didn’t engage in a genuine conversation: Dimensions of Transparency Design in Creator Moderation


Abstract

Transparency matters to users who experience moderation on online platforms, and much CSCW research has treated offering explanations as a primary way to enhance moderation transparency. However, relatively little attention has been paid to unpacking what transparency entails in moderation design, especially for content creators. We interviewed 28 YouTubers to understand their moderation experiences and analyze the dimensions of moderation transparency. We identified four primary dimensions: participants desired the moderation system to present moderation decisions saliently, explain those decisions thoroughly, afford effective communication with users, and offer opportunities for repair and learning. We discuss how these four dimensions are mutually constitutive and conditioned in the context of creator moderation, where governance mechanisms target not only content but also creators' careers. We then elaborate on how a dynamic transparency perspective could value content creators' digital labor, how transparency design could support creators' learning, and implications for transparency design on other creator platforms.

Publication
PACM on Human-Computer Interaction (CSCW)
Renkai Ma
HCI researcher focusing on social computing and trust & safety

I use human-centered design approaches and mixed methods to study platform moderation with users, moderators, and policy experts, informing better design and policy-making for online communities.