The critical takeaway from that post is summed up in its closing:
The ideal moderator does as little as possible. But those little actions may be powerful and highly concentrated. Judiciously limiting your use of moderator powers to selectively prune and guide the community -- now that's the true art of moderation.
This is the essential theory of moderation: when it's done well, it's hard for most users to tell that anything is being done at all.
This implies a few things that, unfortunately, are not always true:
- The majority of posts will not need to be moderated
- The bulk of moderation won't be done by moderators
- The majority of a moderator's actions will not be controversial (in the sense of dividing the community); rather, they will be exemplary, implementing the community's wishes in a way everyone can understand.
It's not hard to find a site where one or more of these ideals isn't met. But those remain the ideals that we should be working toward. From that same post:
We designed the Stack Exchange network engine to be mostly self-regulating, in that we amortize the overall moderation cost of the system across thousands of teeny-tiny slices of effort contributed by regular, everyday users.
On a site that gets a question or two a day, this is mostly just talk; a nice ideal perhaps, but not an essential one. But on sites that get hundreds or thousands of questions each day, with answers to go with them, this ideal is what allows that to keep happening! I wrote about this at length a couple of years ago:
By rights, Stack Overflow should have died already, turned into an irredeemable cesspool by a combination of outsider influx and insider burnout. You can argue (and many do) that we're headed that way - but we've been headed that way since day one. The best we can hope for is a stable orbit, forever falling but never crashing. I believe there are two major reasons why Stack Overflow has managed to scale far beyond the expected limits of a group:
Conversations not required. When a question is asked on a traditional forum, answering it often demands some amount of participation from at least a portion of the community. Details are fleshed out, the problem is clarified, solutions are proposed and debated, others with similar problems chime in with their experiences, tangential points are made, and eventually - anywhere from hours to months later - the conversation dies out. It's a very social, very natural way to interact. And it suffers mightily from the problem that Shirky talked about: all that back-and-forth and associated latency kills any hope of scale. On Stack Overflow, we close or delete questions that can't be answered straight away - it's not very sociable, but it scales wonderfully by effectively enabling a vast, human-powered computational grid.
Tools that allow decoupling moderation from communication without separating moderators and users. While Stack Overflow does have a powerful "moderator class" elected by the community, a fairly large portion of the actual moderation is performed by individual members of the site, those who've participated enough to demonstrate sufficient familiarity with the community. While this has been a fundamental part of the system for a very long time, I didn't fully appreciate how it relates to scale until I started working with very small Stack Exchange sites: the proportional cost of moderation is much higher, even though the total volume of work is lower. Many hands make (relatively) light work... As long as the system puts tools in those hands.
Moderators on Stack Overflow handle thousands of flags every day, but those represent only a fraction (about 15% over the last month) of the flags raised. Thousands of posts get flagged every day on Stack Overflow, but only a fraction of the posts created ever get flagged. And while you can usually find a discussion or two about some moderator action on Meta Stack Overflow, the vast, vast majority of moderator actions go unnoticed and undisputed.
Routine exceptions
The problem that folks usually run into with that blog post is twofold:
- They don't read the whole thing. No, really, they don't. That's why I started here by quoting the ending, because I'm pretty sure an awful lot of the folks who quote from it don't make it that far.
- They don't realize that exceptions happen all the time. If 3 out of every 100 posts might need a moderator's attention, that doesn't sound like a lot... Until you're getting 600K posts every month, at which point that's roughly 18,000 posts (about 600 a day) needing attention.
"Exceptional" and "as little as possible" don't mean that moderators shouldn't act when necessary; rather they mean that we've hopefully designed the system such that most of the time it won't be necessary for moderators to act. That design is an ongoing process, and sometimes we fall well short of that ideal... But that's the goal we should all be working toward: the folks designing the system, the folks using their moderation privileges to help shoulder the load, and the moderators themselves. Because when you see one person doing too much, that's a sign that someone else isn't doing enough: the balance is upset, and the only solution is for more of us to step up and help.
Something to work toward
In conclusion, no, I don't think we need to update that post, or the help center page that mostly mirrors it. It's the theory on which this whole system is built, and more importantly it's a set of practical goals to keep in mind as we continue building - whether that involves new tooling, new rules, or new guidelines. It may not be particularly useful as a low-level instructional guide, but those are better left elsewhere; this is "A Theory of Moderation", not "A guide to moderating comments" or "Why is everything on the Progse homepage closed?", after all.
And as mission statements go, I'd have to say that it's remarkably grounded.