
It's not a bad metaphor. If your community is ignoring trolls instead of banning them, it's effectively sending the message that the trolls are very welcome. I've never observed that a no-moderation community is free of trolls, and ignoring trolls often results in them just taking over because they just start talking to each other and everyone else leaves.

These things are called poison because even a small amount can cause serious problems if left unchecked, and the effect spreads outward across an area (in terms of how many people are affected) the way poison spreads.

I think what you're missing is that a very large number of people simply do not enjoy a certain style of discourse, to the point that they'll opt out of a community that doesn't ban it. You may not like it, you may think those people are weak or something, but the rest of us want to talk to them, and we want them to feel welcome.




Over the years I've come across a few online communities that have noticeably higher conversation quality than the rest of the internet. The one thing all of those places had in common was a very strict moderation policy.

Not every forum/community/website should tolerate literally all people. It's never helpful to engage with trolls, and if people aren't willing to argue in good faith then they need to go. I've seen tons of communities slowly lose their identity because they were too accepting of counterproductive conversations and bad-faith arguments.

The idea behind this form of tolerance comes from the goal of being open-minded and listening to opposing arguments, as well as preventing a community from turning into an echo chamber (or a cult), but I think it's better to lean towards heavy moderation. Banning someone who would have been a good fit for the community is unfortunate, but ultimately not a huge loss. Failing to ban even one troll can drag down a whole community.


I think that's the recurring lesson of online communities: moderation works. Without it, the community will eventually degenerate into a haven for trolls. Unless it was always meant to be a haven for trolls; then it will become a haven for different trolls.

Maybe this is a sad lesson. I would prefer if a completely unmoderated discussion could remain civil and constructive at all times, but there's probably a reason why even panel discussions with only a handful of people need a good moderator.


Thanks for the link! I fully agree that banning is an important tool, though oftentimes a softer approach is more effective, like a polite, firm warning for disruptive people who aren't trolling for entertainment and aren't complete "assholes". And you should ask people not to feed trolls, pointing out that if they find themselves feeling a strong emotional urge to respond, it's a sign they're dealing with a troll.

(Being a civil moderator is important because it sets an example: you won't overreact over some tiny irritant. And of course, recreational trolls want you to overreact; that's their entire game.)

If you do encounter truly disruptive people, then aggressively removing them from the forum is important, like you'd remove a harasser from a meeting.

I like this case study of how a feminist forum had to deal with conflicting internal forces keeping them from maintaining a useful forum. (http://inkido.indiana.edu/research/onlinemanu/papers/herring...)


Not really. Moderation is a local matter, something you sort out with the moderators of the instance you're on. Instances that don't moderate well are blacklisted. It's not like there haven't been trolls before. In practice the system works fine.

There are moderated lists. When someone gets out of hand, give them a temporary ban and tell everyone why they were banned. People will learn that if they want to participate they will have to behave. Over time, once the community has scared away the trolls, the moderators can loosen the rules. There are plenty of communities that self-moderate after a strong leader has shown them what is socially acceptable and what is not.

Part of the problem is that many people don't have (or want) any reputation on most sites they might comment on and creating accounts is generally trivial.

I tend to worry, though, that what people label as trolling ranges from the idiots who spam bigoted nonsense to anyone who doesn't believe what the moderator does. I think it's better to moderate only incivility, bigotry and the like, as bans on "trolling" appear to have morphed into bans on dissent at some point, possibly due to some corollary of Poe's Law, even though what used to be labelled as trolling is still just as present and obnoxious.

I say that as someone who has moderated quite a lot of different forums and who has banned all manner of spammers, flamers, etc., but I'd never ban someone for expressing, in a civil manner, why they did not agree with me, and I've been left feeling in the minority in that regard.


Not necessarily, but it can be for sure. I think in the context of reddit threads debating the need for moderation, a large proportion, if not most of the free speech folks were working backward from wanting to defend trolls toward an interpretation of free speech principles that effectively worked to shelter the trolling behavior.

The bit about tone policing is underappreciated, IMHO. If you are very careful to never offend anyone, then you will never be influential. Influence is about changing people's minds, and people may be offended when you challenge their preconceived notions.

Now obviously you need moderation on large discussion platforms. The disproportionate effect of trolls can't be ignored, but it is possible and even easy at times to go overboard and create a space where nothing of value can be discussed. It's the sort of thing you see when people are complaining that a space feels stuffy and corporate.


The biggest reason, IMO, for moderation in the first place is that if you don't block/censor some people, they will block/censor others: by spamming, making others feel intimidated or unwelcome, making others upset, creating "bad vibes" or a boring atmosphere, etc.

So in theory, passing on moderation to the users seems natural. The users form groups where they decide what's ok and what's banned, and people join the groups where they're welcome and get along. Plus, what's tolerable for some people is offensive or intimidating for others and vice versa: e.g. "black culture", dark humor.

If you choose the self-moderation route, you still have to deal with legal implications. Fortunately, I believe the set of things that are blatantly illegal on the internet is narrower, and you can employ a smaller team of moderators to filter it out. Though I can't speak much to that.

In practice, self-moderation can be useful, and I think it's the best and only real way to allow maximum discourse. But self-moderation alone is not enough. Bad communities can still taint your entire ecosystem and scare people away from the good ones. Trolls and spammers make up a minority of people, but they have outsized influence and even more outsized coverage from the news etc. Not to mention they can brigade and spam small good communities and easily overwhelm moderators who are volunteering their time.

The only times I've really seen moderation succeed are when the community is largely good, reasonable, dedicated people, so the few bad people get overwhelmed and pushed out. I suspect Second Life is in this category. If your community is mostly toxic people, there's no form of moderation that will make your product viable: you need to basically force much of your userbase out and replace them, and probably overhaul your site in the process.


Everything is moderated to the extent that you can choose to remove yourself from the situation. Moderation in some form has to exist to limit the chaos. It's not black and white either; sure, anyone of a certain level of popularity will attract trolls to some extent, you're right about that. It happens in real life too. I'm just saying that it matters where the discourse is taking place and who is involved. If you and I were having this conversation on Reddit, somebody would probably have trolled us already.

Unfortunately this is the nature of moderated communities. And of course unmoderated communities tend to attract trolls and extremists. I don't think this is a solved problem yet.

It is not the only thing that can be observed or that matters for moderation. Your prioritization of civility above all else is being exploited by trolls, something I've pointed out on many occasions.

This paper goes into considerable detail about how the suppression of disputes actually fosters an increase in bad behavior by preventing community members from repelling raids by trolls. By not allowing anyone other than yourself to call out bullshit, you're soliciting more of it. You acknowledged above that this is an inflammatory topic but you are spending your efforts on restraining the very people pointing that out.

https://dl.acm.org/doi/fullHtml/10.1145/3178876.3186141


Internet moderation should be about removing trolls. However, it has turned into downvoting and silencing opposing viewpoints and political opinions.

The end result is even more trollish behavior, because many feel they can't even state an opinion without being silenced. Reddit is a good example of this.


I co-run a discourse community that basically operates on these rules. Absolutely nobody has been banned, and we've had many... odd people come by at one point or another. I'd rather not name it, but I think the kind of thing you're describing is less rare than you think. It just relies on your existing userbase having thick skin and the willingness to self-moderate and come up with creative solutions for problematic users.

This sort of thing seems to be a hacker answer to user problems. On the upside, it accomplishes the goal of the forum: to be left the hell alone by people deemed to be troublemakers, and it does so with a minimum of effort and hassle on the part of the moderators. You don't have to listen to these folks whine, harass you, etc. It's ruthless but effective. On the downside, it has other unintended community (and reputation) effects that some folks seem to fail to recognize. If you either don't see those effects or believe that tolerating them is "the lesser evil" compared to what you would have to do for some other solution, well, then silent banning is apparently convenient for the moderators/forum owners.

I'm not suggesting it is wrong or that doing things some other way would be "better". I have difficulty imagining that it's something I would choose to do. What I do has its upside but the downside is that it is labor intensive and emotionally wearing. And my approach has never been tested (at least not by me) as a moderator method for a large, high-traffic forum like this. So I don't know if it is even practical, much less if it would really accomplish the goal(s) behind current moderation policy here. So it is possible that my approach to social conflict doesn't scale well...etc... Which means I'm not terribly interested in judging this as "right" or "wrong" for this forum, much less in some absolute sense. <shrug>


For starters, when most forums ban people it's because they want them to leave. Allowing trolls and assholes free rein as long as they're rich kind of runs counter to the whole point.

I agree somewhat and I disagree somewhat. At one point, I helped to moderate a religious discussion community of several hundred listed members. The community founders had set up a bunch of odd rules which sort of made sense given the history. For example, when you mix religion and the Internet, civility becomes an issue. But after some situations where both of the insult-flingers were telling the mods "hey, cut it out, we're having fun here!" they had decided that we would only get involved when someone complained. That worked until people started trolling other members and then tried to complain when they were abused in return, as if they hadn't been trying to pick a fight in the first place. So they had also decreed that we might issue warnings for wasting our time, to get rid of these new trolls. But then people became afraid of complaining, because this process was difficult to make honest and transparent.

The "lower level" mods among us mostly tried to keep the peace without disciplinary actions, then: we would just say, "hey, this discussion is getting too heated: cool down, come back tomorrow, we're all humans here." This wasn't too bad, but some of our mods had these sorts of "itching-for-a-showdown" personalities, and I guess people always were a little worried about the powers we technically held. In the end it was gaining a reputation for being too abrasive, which meant good-hearted religious people didn't come by anymore. It reminded me of nothing so much as Sartre's statement, "we have the war we deserve."

I guess the lesson I'm trying to articulate is that, especially in Internet communities, transparency is necessary to alleviate fear, but inaction can let your community passively drift away from its goals, discouraging the members you want.

When community is at stake, as it is in Facebook and Google, there is a higher-order priority. You have to decrease worldsuck and foster awesome[1]. That is a somewhat arbitrary priority, and creating rules and transparency around it can seem like an impossible task.

[1] Cheers, nerdfighters.


How has this never been the case?

This is just being clear about a reality that exists in all communities with moderation standards. Moderation has to be enforced by someone and there isn't always an objective test to determine harassment or trolling. Yet most communities that are successful in the long run have some policy like this, whether explicit or implicit.


This seems to be borrowing from the 'your right to swing your fist ends at my nose' cliche. So, let's put it in those terms. 'Your right to post a comment on an open internet forum ends when it meets my right to only have things I value on my screen.' Surely most folks can see why that doesn't work at all. Moderation is already ubiquitous, and that is not the same thing at all. That's the forum's owner exercising their right to boot your account and delete your comment. I'd be interested to hear any sensible proposals for dealing with trolls that we don't already have everywhere, i.e. moderation and user-centric filtering tools.
