
For starters, when most forums ban people it's because they want them to leave. Allowing trolls and assholes free rein as long as they're rich kind of runs counter to the whole point.



Why do people who are banned always think this is about them, as if they were the centre of the universe? Banning people is not for their benefit, to somehow make them better people, keep them from forming radical views or nonsense like that. It's solely for the benefit of the remaining users and the forums themselves. That means you have to strike a balance between banning/removing too much and banning/removing too little. It's not about making the world a better place, it's about making the forums a better place.

Besides, why should I, if I am a forum administrator or company running forums, force myself to run forums for users with world views that are completely opposed to what I can stand behind with my conscience? By providing them with a forum I support those people and groups. Why should I support people whose views I find disgusting and appalling, and, in addition to that, not helpful for the discussions of the majority of regular users? I cannot find any reason why.


Banning people is an upside. Setting boundaries of acceptable behaviour is necessary for having a space where people won't be troll-brigaded and hate-swarmed, and where the worst people on the internet don't drive out everyone else.

Online forums for random topics used to be the last refuge of the mentally unstable. I've moderated several forums and seen this behavior first hand; a banned user would spend so much time and effort trying to stir the pot and get back at the forum's owners for their Injustice. I often wondered why they would choose to waste so much time in a niche forum rather than go do something productive.

Naturally, nowadays they have online forums that _cater_ to this kind of individual (4chan, some subreddits, etc). Nature, um.. finds a way.


If you have forums that haven't been raided by trolls and idiots, it's because they have a good moderation system in place.

That includes being selective about who you allow to have an account.


unmoderated forums tend to drive off anyone but those holding the most extreme and toxic views.

Banning users from a forum is literally the second right I list.

> Both users and communities have the right to filter and organize the content they consume and host

https://anewdigitalmanifesto.com/#right-to-filter


The fact that official moderators (read: people with super-user powers) are needed on a forum indicates that the regulating powers are broken by design.

Back in late 2009 I built a forum that allowed every user to temporarily ban any other user. There were restrictions: ban lengths were temporary and voted on democratically, expiring after the voted time limit; you could only ban a person you had replied to; and everyone could see that you were the person who banned the parent post. Other than that, the forum was completely anonymous, and it was able to regulate large numbers of trolls (mostly from 4chan). They seemed to appreciate the equality and the natural regulation.
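For what it's worth, here's a rough sketch of how that kind of peer-ban mechanism could be modeled. All the names and details below are made up for illustration, not the actual implementation:

    from dataclasses import dataclass, field
    from datetime import datetime, timedelta

    # Hypothetical model of the peer-ban system described above:
    # any user can propose a temporary ban on someone they replied to,
    # the ban length is voted on, and the proposer is publicly visible.

    @dataclass
    class BanProposal:
        proposer: str            # everyone can see who banned the parent post
        target: str
        replied_to_target: bool  # you can only ban a person you replied to
        voted_hours: list = field(default_factory=list)  # proposed ban lengths

        def add_vote(self, hours):
            self.voted_hours.append(hours)

        def ban_expires_at(self):
            """Median of the voted lengths, counted from now; never permanent."""
            if not self.replied_to_target or not self.voted_hours:
                return None
            hours = sorted(self.voted_hours)[len(self.voted_hours) // 2]
            return datetime.utcnow() + timedelta(hours=hours)

The point of a design like this is that a ban is always the outcome of replies and votes rather than a single moderator's decision, and it always expires.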


Why can't a forum cater to everybody? E.g., you could hide troll comments using a filter.
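A filter like that could live entirely on the reader's side, hiding comments from muted authors or containing blocked phrases rather than deleting them for everyone. A made-up sketch, not any particular forum's feature:

    # Hypothetical client-side filter: the reader hides what they don't
    # want to see, while the posts stay visible to everyone else.

    def visible_comments(comments, muted_users, blocked_phrases):
        """Return only the comments this reader has chosen to see."""
        def allowed(comment):
            if comment["author"] in muted_users:
                return False
            text = comment["text"].lower()
            return not any(phrase.lower() in text for phrase in blocked_phrases)
        return [c for c in comments if allowed(c)]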

> Forums shouldn't be policing content

Why not?


Part of the problem is that many people don't have (or want) any reputation on most sites they might comment on and creating accounts is generally trivial.

I tend to worry, though, that what people label as trolling ranges from the idiots who spam bigoted nonsense to anyone who doesn't believe what the moderator does. I think it's better to moderate only incivility, bigotry and the like, as bans on "trolling" appear to have morphed into bans on dissent at some point, possibly due to some corollary of Poe's Law, even though what used to be labelled as trolling is still just as present and obnoxious.

I say that as someone who has moderated quite a lot of different forums and banned all manner of spammers, flamers, etc., but I'd never ban someone for expressing, in a civil manner, why they did not agree with me, and I've been left feeling in the minority in that regard.


On top of that, historically internet forums have required fairly strict moderation to remain civil for any significant amount of time. Without that, one quickly ends up with an incredibly unpleasant space that repels more people than it attracts.

Over the years I've come across a few online communities that have noticeably higher conversation quality than the rest of the internet. The one thing all of those places had in common is a very strict moderation policy.

Not every forum/community/website should tolerate literally all people. It's never helpful to engage with trolls and if people aren't willing to argue in good faith then they need to go. I've seen tons of communities slowly lose their identity because they were too accepting of counterproductive conversations and bad faith arguments.

The idea of this form of tolerance comes from the goal of being open-minded and listening to opposing arguments, as well as preventing a community from turning into an echo-chamber (or a cult), but I think it's better to lean towards heavy moderation. Banning someone who would be a good fit for the community is unfortunate, but ultimately not a huge loss. Not banning even one troll can drag down a whole community.


I think that's the recurring lesson of online communities: moderation works. Without it, the community will eventually degenerate into a haven for trolls. Unless it was always meant to be a haven for trolls; then it will become a haven for different trolls.

Maybe this is a sad lesson. I would prefer if a completely unmoderated discussion could remain civil and constructive at all times, but there's probably a reason why even panel discussions with only a handful of people need a good moderator.


It's not a bad metaphor. If your community is ignoring trolls instead of banning them, it's effectively sending the message that the trolls are very welcome. I've never observed that a no-moderation community is free of trolls, and ignoring trolls often results in them just taking over because they just start talking to each other and everyone else leaves.

These things are called poison because even a small amount can cause serious problems if left unchecked, and the effect creeps out across an area (how many people are hit) like poison spreading.

I think what you're missing is that a very large number of people simply do not enjoy a certain style of discourse, to the point that they'll opt out of a community that doesn't ban that discourse. You may not like it, you may think those people are weak or something, but the rest of us want to talk to them, and we want them to feel welcome.


Why exactly? You _are_ free from consequences... from the government. But how would you expect to be free from consequences when you offend millions of people using the Internet? Why should I not be able to ban people from my forum if I and most of my users don't like them and they are dragging down the quality of the forum?

Those are consequences, and I don't see how you can have some utopia where that isn't viable. Then you just live in a world where you are forced to listen to the broadcasted thoughts of idiots.


It works for 4chan's users or they wouldn't hang out there. But the level of moderation is a competitive feature. Users decide where to hang out based on the quality of discussion, and moderation has a direct effect on that (in both directions - too heavy-handed or too hands-off).

Also, in the end, the people who host a forum get to choose whether they really want to host a cesspool or not. If it's not working for them, they can shut it down, or maybe outsource moderation to Facebook or Disqus.


Why should the bar be so high? These are private forums, after all. If I don't want someone who exhibits antisocial behavior in my community, I shouldn't have to allow it.

This seems to be borrowing from the 'your right to swing your fist ends at my nose' cliche. So, let's put it in those terms. 'Your right to post a comment on an open internet forum ends when it meets my right to only have things I value on my screen.' Surely most folks can see why that doesn't work at all. Moderation is already ubiquitous, and that is not the same thing at all. That's the forum's owner exercising their right to boot your account and delete your comment. I'd be interested to hear any sensible proposals for dealing with trolls that we don't already have everywhere, i.e. moderation and user-centric filtering tools.

I "run" an forum that is an offshoot of a very popular professional software development blog. The original forum was closed by the blog author and my site replaced it. This forum has rules but they're pretty minor (no doxxing, etc) and users cannot be filtered away, only posts can be removed.

Having run this forum for over a decade, I have a lot of insight into running an anonymous free form nearly censorship free place on the Internet.

1. There are many mentally disturbed people on the Internet. So point #1 of the article, "We are all adults. Capable of having adult discussions." is already off the mark. And even perfectly normal people tend to be worse human beings online (The Greater Internet Fuckwad Theory). And these are people, like you and me, who earn their living making software.

2. "We accept everyones contributions." Yes, but bytes being free creates a situation where some contributions will overshadow other everyone else's contributions. 1 person can literally post 10,000x more than anyone else. And they will, see point #1. This creates an unbalanced situation that is very hard to resolve.

In my opinion, if you want to be inclusive it's pretty damn hard without any rules, moderation, censorship, limited access, etc.

