Transparency in Content Moderation Is More than the Provision of Information

Sat, September 7, 12:00 to 1:30pm, Marriott Philadelphia Downtown, 405

Abstract

The EU’s Digital Services Act (DSA) aims to increase transparency in the largely opaque content moderation practices of social media platforms. But what exactly does transparency mean in this context? Existing research and policies conceptualize transparency as the provision of information about the procedures, measures, tools, and scope of content moderation. We argue that this conception is too narrow and that it is essential to also consider how information about prohibited content is communicated to users. Community guidelines are written to familiarize users with a platform’s rules in a comprehensible manner, which means that a transparent provision of information requires users to be able to clearly understand which behavior is prohibited and which is admissible. Building on a novel dataset comprising the content moderation policies of the most popular social media platforms, we measure the complexity of community guidelines for comparative analysis. We rely on computational methods such as large language models (LLMs) to categorize the prohibited content across platforms and to determine the complexity of the guidelines’ structure and language. We derive recommendations for where current regulation of transparency in content moderation could be improved.
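The abstract does not specify which complexity measures are used; as a minimal sketch of the kind of surface-level metrics such an analysis might start from (the function name, example guideline text, and choice of metrics are illustrative assumptions, not the authors' method):

```python
import re

def complexity_metrics(text: str) -> dict:
    """Naive surface-complexity metrics for a guideline passage.

    Splits sentences on terminal punctuation and words on letter runs;
    real analyses would use a proper tokenizer and richer measures.
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    n_words = len(words)
    avg_sentence_len = n_words / len(sentences) if sentences else 0.0
    avg_word_len = sum(len(w) for w in words) / n_words if n_words else 0.0
    # Type-token ratio: share of distinct word forms (lexical variety)
    type_token_ratio = len({w.lower() for w in words}) / n_words if n_words else 0.0
    return {
        "avg_sentence_length": avg_sentence_len,
        "avg_word_length": avg_word_len,
        "type_token_ratio": type_token_ratio,
    }

# Hypothetical guideline snippet for illustration only
guideline = ("Do not post content that harasses others. "
             "Harassment includes targeted insults and threats.")
metrics = complexity_metrics(guideline)
```

Comparable metrics computed per platform would allow the kind of cross-platform comparison the abstract describes, before moving on to LLM-based categorization of the prohibited content itself.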

Authors