Moderating comments on social media is no job for the faint of heart. Some who do the work have likened it to “herding cats,” and it is a well-known stressor among community managers that can quickly, and often does, lead to burnout. For the social media manager, online community manager, or anyone whose role requires communicating and engaging with the public on social platforms (owned communities or otherwise), dealing with and moderating comments can be the most difficult part of the job.
To the inexperienced, it may not seem like a very big deal. You read or review the comments, make a decision about the content, and either allow it to remain or delete it, right? Maybe in theory, but not always in practice. I, for one, wish it had been that simple when I was Managing Editor of User-Generated Content for the top news organization in North Carolina, managing a team that moderated comments on news stories in real time, and single-handedly responsible for an online community that included user-generated blog posts, images, and image galleries, all of which allowed user-generated comments – and there were a ton.
Given that we had comment policies and rules of engagement that were very clear, it probably should have been a lot simpler. But there are people behind comments, after all, many of whom believe they have a right to post as they please, no matter how inappropriate, and some with little regard for, or perhaps a different view of, the policies. This behavior can quickly create an environment where a comment removal escalates into accusations of censorship and infringement on one’s freedom of speech, and into levels of disruption that can wreak havoc on those managing the community and even on the brand itself. I often had to post the definition of freedom of speech in the news community to help those complaining understand why this freedom, and their ability to “post as they pleased,” wasn’t protected on a privately owned website, and to encourage them to adhere to the community rules if they wanted to continue participating.
Now, it’s important to note that not all communities are created equal when it comes to the issue of comments. I’m sure you’re aware of some sites, communities, or groups where the comments section is where you find the most value. Maybe it’s a niche community of moms who can respectfully disagree and debate, a group of knitting enthusiasts and fiber artists, parents of students who attend a specific school, or a community of professionals who are passionate about a shared career. I certainly know of several I find quite valuable.
But I also know this has been a hot-button issue for well over a decade now, and communities have evolved during this time, as have the tools the mainstream social networks provide to community admins. Consider how, on some platforms, comments can be disabled or hidden, or admins can set up automated rules for post moderation, saving them time sorting through posts. This was not always the case: comments sections once had a reputation as the wild, wild west of the web, and were even labeled “toxic,” especially on certain websites that shall not be named.
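To make the idea of automated rules concrete, here is a minimal sketch of how such a filter works under the hood. This is a hypothetical illustration, not any platform’s real API: the blocklist, function name, and actions are all invented for the example, and real platform tools typically offer this through an admin settings page rather than code.

```python
# Hypothetical sketch of an automated moderation rule: a keyword filter
# that flags matching comments for review instead of publishing them.

BLOCKED_TERMS = {"buy followers", "spamlink"}  # hypothetical blocklist


def auto_moderate(comment: str) -> str:
    """Return an action for a comment: 'hold' if it matches a rule, else 'allow'."""
    text = comment.lower()
    if any(term in text for term in BLOCKED_TERMS):
        # Holding for human review is gentler than silent deletion and
        # supports the transparency the guidelines below call for.
        return "hold"
    return "allow"


print(auto_moderate("Buy followers here: spamlink"))  # -> hold
print(auto_moderate("Great article, thanks!"))        # -> allow
```

Even a simple rule like this only sorts content; the judgment calls, and the tone-setting discussed next, still fall to the humans behind the community.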
While I believe these moderation tools are a good thing and can be very helpful, what they don’t do for anyone managing social media and moderating comments is set the tone for the desired environment or show users the type of community you wish to create and cultivate. The way to do that is with clear community guidelines, rules of engagement, or a moderation policy. The terms “community guidelines” and “rules of engagement” may be used interchangeably, in my opinion, but no matter how it’s characterized, what matters most is the content and the actions taken to support it.
I must say, the community I managed improved immensely over time, as communities generally do (except for a few outliers, of course), with consistent moderation and adherence to their own policies and moderation guidelines.
If a comments section is riddled with name-calling and disrespectful posts, for instance, a community guideline that prohibits such behavior is meaningless. On the flip side, if there is no mention that posts may be deleted or that comments are moderated at all, yet it happens on a regular basis, or even once in a while, participants will be caught off guard when their content is removed and have a bad experience. If your policy calls for mutual respect and explicitly asks users not to flame, bait, or troll others, flaming, baiting, and trolling should not be tolerated.
Social Media Moderation Guidelines
So how exactly do you craft a moderation policy, community guidelines, or rules of engagement that can be enforced and make moderation less of a headache?
- Start with desired behaviors and what you’d like people to do, such as “treat others with respect.” Let them know what is acceptable and preferred before going into what is prohibited.
- Craft a statement or a sentence or two that lays out the community’s purpose. Share your why in an effort to build a sense of belonging.
- Mention that you welcome differences of opinion or spirited debate, but be specific about what does not constitute either and will not be accepted.
- Be explicit. If you do not allow “dismissive responses” for instance, provide examples of such. Try to leave as little room for interpretation as possible.
- If there are moderators or admins who have the final say, communicate that.
- And finally, if you are unable to moderate much at all, or must be very careful because you’re a public entity and content is protected by the First Amendment, find out what you can moderate and make that part of your community guidelines and policy.
Let’s face it: no matter your opinion on comment moderation, engagement is the holy-grail metric by which those responsible for any social channel or online community are evaluated, and comments equal engagement. Without them, we take the social out of social media.
Interested in reducing risk? Check out these helpful comment moderation resources:
Learn more about comment moderation and your public record responsibilities!