What types of moderation tools should a community platform provide?

Online communities need to moderate both the content their members create and the relationships between users. Tools for content moderation should be kept separate from tools for moderating relationships.

  • Content moderation should be available to a wide range of potential contributors. Any reliable user who knows the community standards and wants to keep the community clean should be able to contribute, at least by reporting violations they observe.

  • Only people who are trusted by other users and know how to handle tricky interpersonal situations should be able to moderate relationships between users. In most communities these are appointed or elected “moderators”.

To make moderation more effective, common categories of violations should be detected automatically and reported to the users who help moderate the community.
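As a rough illustration of such automatic detection, here is a minimal sketch of a rule-based checker. The rules, phrases, and thresholds below are entirely hypothetical examples, not taken from Stack Overflow or any real platform; a production system would use far more sophisticated signals.

```python
import re
from dataclasses import dataclass

@dataclass
class Flag:
    rule: str      # which category of violation was matched
    excerpt: str   # a short piece of evidence for the moderator

# Hypothetical rule set for illustration only.
BANNED_PHRASES = {"buy followers", "free giftcards"}
MAX_LINKS = 3

def detect_violations(text: str) -> list[Flag]:
    """Return a list of flags for common violation categories in a post."""
    flags = []
    lowered = text.lower()

    # Category 1: posts containing banned promotional phrases.
    for phrase in BANNED_PHRASES:
        if phrase in lowered:
            flags.append(Flag("banned-phrase", phrase))

    # Category 2: link spam (too many URLs in one post).
    links = re.findall(r"https?://\S+", text)
    if len(links) > MAX_LINKS:
        flags.append(Flag("link-spam", f"{len(links)} links"))

    # Category 3: shouting (mostly upper-case text).
    letters = [c for c in text if c.isalpha()]
    if letters and len(letters) >= 20:
        upper_ratio = sum(c.isupper() for c in letters) / len(letters)
        if upper_ratio > 0.8:
            flags.append(Flag("shouting", text[:40]))

    return flags
```

Each flag would then be routed to the reporting queue that moderating users already review, so automation augments rather than replaces human judgment.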


This is a fragment of a draft of the book “Lessons Learned While Working On Stack Overflow”. Read the full book on Kindle or in paperback.