What is a content moderation strategy?

04/11/2021

Susanne Kopf, Assistant Professor at the Institute for English Business Communication


“Content moderation strategy” is a concept that frequently comes up in connection with social media. Platforms such as Facebook, YouTube, and Twitter usually have policies in place that regulate what types of content users may share and what rules of conduct they need to observe. In this context, the term “content moderation strategy” can refer to three different aspects.

What content is welcome? What are the consequences of non-compliance?

The first aspect concerns the boundaries that social media platforms set for their users: what types of content and behavior the companies want to see on their platforms, and what they will not tolerate. The second aspect concerns the potential consequences of non-compliance with the applicable policies, for example deleting unwanted content, blocking user profiles, and, in extreme cases, initiating legal proceedings against certain users.

Top-down or bottom-up enforcement?

The third aspect relevant to content moderation strategies is the question of who monitors content and enforces the rules, and how they do so. In my research, I distinguish between top-down, community-based, and community-informed enforcement. Top-down enforcement means that social media platforms rely exclusively on their own systems (e.g. automatic filters based on artificial intelligence or manual checks). Community-based strategies take the opposite route: here, members of the user community enforce the rules themselves, as is the case among Wikipedia contributors, for example. Community-informed enforcement combines the top-down and community-based approaches: users report non-compliant behavior to the platform administrators, who then check the content in question and take further steps as appropriate.

Censorship versus user protection

On a fundamental level, this topic raises a number of complex questions. To what extent can we leave content moderation on social media up to private companies like YouTube and Facebook? After all, content moderation regulates what can and may be expressed in the public sphere. At the same time, however, we need to ask ourselves what other methods are available to protect users from harassment, hate speech, and misinformation on the web.

