
I don't believe the situation is untenable.

Really, given the tenor of some of the online communities I've been involved in, what passes for unacceptable behavior here is still practically civil. I understand the desire to 'engineer' the problem away, but I don't think that's really possible, apart from the community itself being more tolerant and the moderators being more strict but fair. It is a public forum with a practically nonexistent bar for entry. And I've seen older, established posters act with vicious condescension and newer posters be polite and considerate. You have to accept a certain ground level of chaos, bigotry, backbiting, trolling and noise as part of the system.




This is kind of what I mean. This seems like shifting goal posts to me. They had the whole harassment patrol thing on Twitter which I found sort of silly, personally, but I don't know how that would be described other than an effort of some sort. I certainly wouldn't call it an abject refusal to even try.

Like I said, I think it's a larger problem of shitty behavior on the internet in general, and I don't think anyone has a real solution at this point. I find it unfortunate that people use that behavior as an excuse to talk past each other.


I think this sort of thinking is far too fatalistic and basically throws people who cannot deal with abuse and hate under the bus to boot.

The platforms that hate and abuse are delivered on are currently under human control. The idea that we have no control over whether or not we receive hate and abuse is simply not true. The solution, moderation, isn't even new, and is a core feature of all polite online spaces. HN is a good example.


I think it is one thing to receive the odd insult in internet discussions, and another thing for somebody to wage some sort of campaign against you. I don't know the details of the original problem, but I can imagine that there are some sorts of online harassment that call for more severe countermeasures than simply keeping a level head.

Er, what? The harasser is aggressive and bitter? They harass people in real life? It sounds like they have some problems. Maybe they can go work on themselves and come back to the community when they're ready to be civil.

We wouldn't tolerate sexual harassment, etc., in the workplace or in civil society. I don't see why we wouldn't expect a basic level of decorum in respectable online communities.


Depends on how you define "acceptable". It's hard to publicly punish someone for bad behaviour on a blog, but the wave of indignation shows that it's not accepted if someone crosses a line.

Very well put. Though there is still some refuge in smaller online communities, if you know where to look. If you limit your exposure to specific places and avoid the rampant toxicity elsewhere, it's not that bad. :)

don't we have decades of evidence from online communities showing that?

From usenet to reddit, many communities that tolerate abusive conduct degenerate, as the "bad behaviour" is normalized and becomes accepted.

The fact that you can't create a child-proof YT doesn't mean you shouldn't strive to keep negative behaviour at bay.


The real problem here is, we still aren't, and probably never will be, wired for 'online'. By that I mean: take a standard community. A community that has existed, quite literally through our ancestors, for countless generations.

In such a community, sure... people could say anything they wanted. People can do so now! That is, you go for a walk, and people can approach you, say whatever, there is a wide range of what you can say in public.

But... if you act too aggressively, you get a punch in the head. If you yell and scream obscenities, people ignore you, if you do it too much, you again get a punch in the head. If you say 'crazy things' but are polite, people make excuses "Sorry, have to go look at this bush over here... talk to you later, have a nice day!". If you stand in the middle of a park and start screaming, again ... if you don't stop ... punch to the head.

(In more modern times, 'punch to the head' may be replaced by 'officials that come and make you stop via force if necessary', but the result is the same)

Add to this: if you have people sitting in a park at a picnic table, enjoying a conversation, there is a limit to what people will tolerate if you just amble on over, start talking, and at the same time don't "fit". If you join a church, same thing. If you join a club, same thing.

Imagine if a group of 5 friends got together at a picnic table in a park weekly to talk about hockey, and some guy came over every day and started talking about how hockey is really fascist, and you're all assholes, and blah blah.

Would that go over well? HELL NO!

In short, there is no moderation in traditional society, but there is also no tolerance for people shoving their face in your business. People that do not fit are not tolerated.

People confuse "how people interact in real life" with "people can say anything they want in real life".

Taking a step back, people have never had limitless ability to force others to hear their crap. You could take out ads in a periodical / newspaper... IF the newspaper thought it was not going to get many people mad at them! You could print your own "stuff", and pay to have it delivered, or hand deliver it to people on the street. You could print stuff and try to get people to buy it. There are loads of delivery methods, but even this is all new, 200 years ago no average person could do this.

So some online forum where you're talking about turtle eggs, and endless people liken it to "MY GUY!" or "POLITICAL THING" or "BUY MY APPLES" is absolutely non-real in terms of how humans have ever acted before!

Something like /., with its moderation system and its meta-moderation system, was a good start. "No! Go away from my picnic table!", with the comment still there but hidden, is much like real life. But even that is not completely real, for eventually people "punch you in the head" if you persist.
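The threshold idea described above can be sketched in a few lines. This is a hypothetical illustration of Slashdot-style score thresholds, not Slashdot's actual implementation: comments are never deleted, each keeps a score clamped to the classic -1..5 range, and each reader picks a viewing threshold that hides low-scored comments by default.

```python
def moderate(score, delta):
    """Apply a moderation vote, clamping to the classic -1..5 range."""
    return max(-1, min(5, score + delta))

def visible(comments, threshold=1):
    """Return only comments at or above the reader's chosen threshold."""
    return [c for c in comments if c["score"] >= threshold]

comments = [
    {"text": "insightful analysis", "score": 4},
    {"text": "me too", "score": 0},
    {"text": "abusive rant", "score": -1},
]

# A reader browsing at threshold 1 sees only the first comment; the
# other two remain in the record for anyone who browses at threshold -1.
shown = visible(comments, threshold=1)
```

The key design choice, as the comment above notes, is that nothing disappears: the "go away from my picnic table" signal is visible to everyone, while the content stays recoverable for readers who opt in.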

As this thread discusses, most moderation systems seem unable to handle this. The truly sad thing is, I think that:

* until sock puppeting is impossible

* until identity is linked to your actions immutably

We won't ever get past this.

And this means that anonymity needs to end, and a person's actions need to count, because in the real world everyone can see "Oh crap, Bob's coming over here again", and "eventually you go to jail" is how we deal with human interaction in real life.

NOTE: none of this detracts from people wanting to, in real life, have a club of "completely open ideas". Yet that is quite rare!


> What makes this happen?

I'll answer the, perhaps rhetorical, question by defining a new law: "Any online community that doesn't actively encourage its users to be nice, and civil, to each other is bound to become a cesspool."


Interesting. Do you have experience moderating a comparable community, or what leads you to believe that it's not as much of a problem as they make it out to be?

I'm sure it will always be possible to find a way to harass someone. But do you believe putting barriers in place has no effect on the amount of harassment that targets have to endure? What do you think of the computer security concept of "defense in depth"?


Sure - I get that. I understand with enough people in a semi-anonymous forum that it could get toxic quickly from just a few bad actors.

However, a lot of the commentary here has been around in-person interactions and not on an online forum.


It shouldn't surprise you that online communities are a reflection of human nature. The behaviors we see are the effects of groupthink taken to its extreme. Even in "civilized" forums such as this one, you'll notice people conforming with the group mentality, and deleting their posts if they get downvoted. When there is little to no moderation, and viciousness is acceptable within the group, this behavior will spiral out of control, as individuals try to one-up each other, leading to vigilantism, manufactured outrage, cancel culture, racism, and all sort of abhorrent behavior an individual wouldn't do in isolation. We all seek validation in our ideas, after all.

All of this still exists offline, BTW. This is how riots happen. What the internet does is make it more possible by connecting more people than we traditionally could communicate with, and providing a safe space for any kind of discussion to happen. It's a tool our tribal monkey brains aren't ready for.


While driving the marginal cost of communication to zero would seem to bring people together, what it really does is bring similar people together but also exposes significant differences between people.

The essential reason for the toxicity is the contestation of online spaces, which are virtual territory over which proxy wars can be fought as a low-cost substitute for physical violence (although there is a direct nexus to physical violence, and the threat thereof, by both individual and state actors, is a factor in the aforementioned contestation).

There are 3 basic approaches to encroaching toxicity:

  a. Ignore it, aka "don't feed the trolls."
  b. Implement technical solutions to manage it.
  c. Fight it, aka flame wars.
Ignoring it doesn't work. It just tells the most vulnerable members of a community that they don't matter and that if they are repeatedly harassed other community members will sympathize but not really do anything to help.

Technical solutions originate with the California preference for systems thinking, and are reflective of the legislative and administrative technology in which they've been incubated. They are somewhat effective, but any system can be gamed. Most sites opt for a mix of technical means and hands-on moderation by a benevolent* dictatorship which works moderately well but is not responsive or effective against determined attack.

* benevolent in terms of close alignment with the ethos of the forum, whatever that happens to be

dictatorship in terms of being arbitrary rather than mechanistic, semi-transparent, and unilateral

Flame wars are upsetting to everyone, and people in the first 2 camps view them as the worst-case outcome because they take over the thread/forum/platform where they occur and are destructive of comity, much like their real-world analogs. However, they can be effective in repelling invasive toxicity - if a sufficient majority of the forum regulars participate cooperatively. If too few participate, or forum norms inhibit or punish participation, then toxicity will prevail or advance.

Here's some empirical evidence supporting this based on data collected from raiding behavior on Reddit, which is similar enough to HN to serve as a useful comparison (includes links to papers, slides): https://snap.stanford.edu/conflict/

Cross-platform raiding behavior has existed as long as bulletin boards, and has been systematized and refined in line with the systematization and refinement of game and software development strategies. Here's a (somewhat offensive) overview from some years ago of trolling strategies, summarized near the end in a convenient flowchart: https://digitalvomit.wordpress.com/the-ultimate-guide-to-the...

The sophistication of raiding tactics, documentation, and so on has increased significantly since that was published. Ultimately, toxicity online is neither a product of technology nor the exposure of an inherent flaw of human nature, but the visible manifestation of multilateral information warfare, which is itself preliminary maneuvering and battlespace preparation for more overt forms of conflict like cyberwarfare, open economic warfare, and kinetic warfare.


It's certainly not just status and influence.

Sometimes you watch a group descend into being a cesspool of awfulness. The light stuff is death by boring memes, insults galore, and "just an opinion" posts that are high-emotion crazy rants.

But the heavy stuff is worse. Watching individuals be egregiously insulted and threatened. Watching people delete their informative and well-meaning posts and never come back, because the first comment was some evil asshole having another content-free belittling attack because that's just what they do, no reason. Watching people be told they are incompetent, useless, should die, are the wrong gender, that sort of thing.

It is perhaps worse on locality-specific forums. Because then you are in view of people you know in your area, or are likely to interact with in future. Assholes don't care, and seem to enjoy the "status" that comes from putting down other people in the local area.

It does not help that such interactions seem to act like magnets in local forums, where putting people down through petty and unjustified insults attracts more of the same, and substantive interactions are lost. I guess that's because easy putdowns are low effort entertainment for some people.

But people finding themselves on the "unwinnable debate" receiving end of that sort of thing, end up leaving forums to get away from it, or are highly stressed whenever they have something to say, and may spend hours trying to carefully word everything just to avoid petty flamage from assholes and get substantive replies instead.

So some people are motivated to try to keep community areas a little gentler and cleaner. To protect and welcome others who appear to be joining in good faith, from the occasional blast of shitty replies. Like another commenter said, it's like clearing up litter when you go for a walk. Would be nice if it wasn't necessary, but because some people litter, others want to clear it up.

That's enough of a motivation for some. I've seen people spend hours moderating out of nothing more than a sense of civic duty and a desire to protect other people and encourage them to stay and feel welcomed and defended, to grow the community in a more friendly direction, and to try to make the forum a less stressful place for new participants who have done nothing wrong.

Like any volunteer working for something they feel is a good cause.


IMHO, it's just the way of things everywhere on the internet. It won't ever change, and the entire population will eventually develop a thick skin. I personally have zero tolerance for abusive verbal behavior, but I've become very comfortable with it in text.

On the other hand, user moderated communities like Hacker News and Reddit seem to cut back on this a lot. Only funny or informative abusive comments are shown.


No doubt it will be a constant complicated debate, as social norms tend to be. But we don't throw up our hands and say "we can't have social norms! Who picks them? How do we vote?!"

Unfortunately I don't think we'll come up with a simple set of universally applicable rules and move on.

I do know that my abuse-free internet experience is very much a bubble, and every time I have been given the opportunity to see the experience of a "woman with an opinion" online I get to realize this. So let's as a society say "this isn't acceptable" and then deal with the much more difficult question of "what to do about it".

In any other commercial venue if a patron started shouting racist and sexist epithets, we would be ok with the owner throwing them out. We don't get caught up in slippery slope fallacies.

I can't scientifically prove that Fat-people Hate was objectively abusive but that's the beauty of social norms, I can say "I think it's abusive" and put pressure on businesses to remove its platform. Which is what I'm doing :)


Sure, and doomscrolling, misinformation, and the constant need for external validation are horrific problems. I don't think they're quite on the subject when we're talking about people's refusal to disengage with online hostility. It's like trying to make it illegal for people to flip you the bird in traffic. Is it a nice thing to do? Absolutely not, but letting it ruin your life is completely optional. The same thing applies to people on Facebook telling you to go fuck yourself.

A well-mixed cesspool? No one with any sense wants to hang out in a cesspool. That's the problem. You are expecting human behavior to change. And worse, on the internet, no less.

I think it's important to recognize how we got to the point where we are now. Infantilism is a big component in far-anything communities, but the internet has made it way worse in some cases.

If you go to (excuse my outdated concepts of extremist communities) Parler to talk about LGBT in a positive manner you will get about the same amount of vitriol as if you went to r/FemaleDatingStrategy to call out misandry or if you went to a specific /pol/ thread to fight antisemitism.

I think the bottom line here is, vitriol is pretty much omnipresent among us. The difference between communities is how they run the ductwork to siphon it out of our daily conversations. That's where the discourse is formulated and fine-tuned to the specific needs of its members.

When you build bubble-like communities, you will get echo chambers that breed infantile subjects. If you allow people to call each other certain slurs, but not other ones, you will naturally optimize for resistance to the former. If you build a forum with usernames and perhaps even an upvote system, people will recognize and build up reputation bound to their names. If you make an anonymous board, people won't care about reputation. These are just two options on a huge spectrum of possible alignments. Engagement-oriented platforms (by that I mostly mean social media and Reddit) are, however, a special case.

Maybe an anecdote makes more sense: A few years ago, in high school, I used to find joy in trolling. I felt especially at home on the imageboard that starts with four, but when the thread would scroll over the limit and plunge into oblivion I realized how little those three people that I made seethe actually mattered. To contrast that, on platforms that value engagement, it was and probably still is a lot easier to reply with something inane and watch the replies roll in. A single statement that would go into the archives mostly unnoticed instead made an impact on dozens if not hundreds of people. After getting out of my turbo edgy phase I realised that I hurt a pretty good amount of people in both cases, and it feels somewhat dishonest to believe that every single downvote, reply and slur hurled my way was born in infantilism.

What I'm trying to say is that when we increase the number of interactions, we as a byproduct also increase the number of "bad" interactions. When I talk to "bad" people with "bad" opinions, I try to recognize that even if they are 90% infantile garbage, the rest can come from honest pain and discontentment. But sometimes that's just being too charitable. People are hard.

