I don’t really agree. Like I said, I don’t think you will ever get both sides of an argument there; one side will force the other out eventually. I’d rather have a proliferation of communities which are ‘censored’. Sure, have a forum for racists where non-racists get banned, whatever (although I won’t shed any tears if it gets closed down). If I want to see what racists think, I can go check it out. But I don’t want to be part of a space where I’ll regularly encounter racist stuff, and they don’t want to be part of a space where they’ll regularly encounter anti-racist stuff. Mashing me and the racists together in an anonymous arena where anything goes obviously won’t result in us all nicely debating and getting to know each other. It is not a step in the right direction.
I tried to make a community without censorship at one time. It seemed like a simple enough goal. After all, who was I to judge what's right and wrong? I'd let people post anything they wanted, and if it wasn't illegal, it would be allowed. It also provided a fairly clear line on what was acceptable and I liked that concept.
Well, it went great for a short while. After a couple of years it started to get worse, though. A small group of people started posting racist comments and topics. Long-term members would then ask why this racism was allowed, and I defended the posters' free speech. Then I started to lose some of the friendly regulars from the community. They got tired of being insulted, and argued that if the community didn't censor or remove racist comments, I was supporting the racists and giving them a platform to spread hate.
As I watched the community degrade, I realized censorship would be required if I wanted the community to develop into what I envisioned and so I changed course.
In short, a no-censorship policy is going to give you a certain type of community, so before starting, consider whether this is the type of social network you want to bring into the world.
I agree that this is the way we should go if we want moderation of content that also respects freedom of speech. However, I don't think what most people want is just to not see racist comments online; they want to prevent or forbid racist comments online, even if they don't see them.
Otherwise Parler would never have been shut down like it was; people would simply have ignored it. Instead, people went out of their way to silence Parler, even though it wasn't their community.
I generally agree with you here actually, I wouldn't want any of this to be used to completely suppress speech, but I don't think it's unreasonable to hide by default communities which are higher friction to some degree.
I can definitely see a completely open alternative viewpoint, but reddit has already gone in a very different direction.
I also have a strong desire for a community tolerant of nuanced conversation, but any attempts I've seen to build one (BitChute, Gab, Parler, Voat) seem to almost immediately be taken over by precisely the worst parts of every other community. So the sane people who would otherwise not mind sharing the community with those fringe factions tend to avoid participating in the communities at all.
It's almost like, ironically, the best way forward would be to create a moderated community that slowly gets a user base of a diverse set of people, and then slowly pull back the moderation over time. Allowing something like subreddits with communities to self-select their own moderation levels is also great, imo. Reddit was perfect until the platform itself stopped being neutral.
I've never seen a community that was completely open to discussion and where bad ideas disappeared because they were effectively challenged.
On the other hand, I've been in several communities where the mods had a clear agenda and the participants were helpless to change it.
I'd personally rather the people who are hostile to new ideas or critical thinking have their own communities and that their ability to harm other communities can be mitigated by a moderation system accountable to the users.
I do think there needs to be serious discussion about how to make communities that are good for people. The echo chamber and polarization effects tend to contract people into a smaller worldview. How to have a cohesive community that doesn't do that is essentially an unsolved problem in the social sciences.
Neither do I. My point was nobody is able to have a "good online space" because some humans suck, no matter the ideology :)
You said "my community", do you think all people using alternative platforms share the same views as each other, much less have any control over each other?
No, it's just like Reddit or Twitter, where you can find people saying the worst possible shit about anything.
The thing is there's also just as many good people on those alt communities who just want a place to discuss their beliefs.
Just because the Taliban is on Twitter doesn't mean @timmy is a terrorist because they share the site, same goes for Gab and others.
Personally, I believe if you aren't making threats or anything else illegal then you should be free to say whatever you want, otherwise things get murky on who decides what's "hate" or "misinformation". We've already seen those excuses abused greatly.
A user configured client side filter would suffice without any censorship needed.
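A minimal sketch of what such a client-side filter could look like, assuming posts arrive as plain strings and each user supplies their own blocklist (all names here are hypothetical, not part of any real client):

```python
def make_filter(blocked_terms):
    """Return a predicate that hides posts containing any user-chosen term.

    The filtering happens entirely on the user's side; the server never
    removes or censors anything.
    """
    lowered = [term.lower() for term in blocked_terms]

    def is_visible(post: str) -> bool:
        text = post.lower()
        return not any(term in text for term in lowered)

    return is_visible


# Each user configures their own list independently.
visible = make_filter(["spam", "slur"])
posts = ["hello world", "buy SPAM now", "a thoughtful comment"]
print([p for p in posts if visible(p)])
```

In practice a real filter would need word-boundary matching, regex support, and per-community lists, but the core idea is just a local predicate applied at render time.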
It's absolutely true that unpopular minorities (of whatever kind) should be able to talk. It's also absolutely true that virality can create dangerous situations which are difficult to control.
Instead of banning, which I agree has problems, I favor putting questionable communities in a 'slow mode' where people need to wait to join them (and maybe communications within the community are delayed). Then, in 5 or 10 or 15 days, you can see if the community is still 'a problem' (whatever that means in this situation) and if greater moderation is required. The idea is to preserve communities that build over time, give admins time to make decisions, and highlight the hard questions that need to be answered by platforms.
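The slow-mode idea above could be sketched roughly like this; the 10-day delay, class names, and method names are all hypothetical illustrations, not a real platform's API:

```python
import datetime as dt

# Hypothetical waiting period before a join request to a flagged
# community is granted (the comment suggests 5, 10, or 15 days).
SLOW_MODE_DELAY = dt.timedelta(days=10)


class Community:
    def __init__(self, slow_mode: bool = False):
        self.slow_mode = slow_mode
        self.members = set()
        self.pending = {}  # user -> time the join request was made

    def request_join(self, user: str, now: dt.datetime) -> None:
        if not self.slow_mode:
            self.members.add(user)
        else:
            # Queue the request instead of admitting immediately.
            self.pending.setdefault(user, now)

    def process_pending(self, now: dt.datetime) -> None:
        """Admit users whose waiting period has elapsed.

        The delay gives admins time to decide whether the community is
        still 'a problem' before its membership can grow further.
        """
        ready = [u for u, t in self.pending.items()
                 if now - t >= SLOW_MODE_DELAY]
        for user in ready:
            self.members.add(user)
            del self.pending[user]
```

A real system would also need the delayed-communication half of the proposal and an admin override, but the mechanism is just a time-gated join queue rather than a ban.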
Just saw that my other comment was downvoted to hell, which shows the position of many users of this forum. Why can we not have an area where more people are represented and know full well they won't be targeted with abuse? If this means banning racist or fascist outlets/users, then so be it.
There is no such space because there can be no such space.
If you want a community without bad actors, bigots, cranks, etc, then it has to have rules and moderation, which means censorship. If you want free speech, you're going to get the bigots, cranks and bad actors. "Speak(ing) and think(ing) in a relatively free and neutral sense without ending up in some ulterior trap" requires curation and tone policing, which is censorship. Otherwise, the bigots and the cranks are also free to speak and think in the same way as everyone else, ruin the community and drive everyone else away.
You can't have your cake and eat it too, you have to pick one.
Genuinely curious, can you expand on how you will achieve the middle-ground between safe space and hate speech?
I would assume you are aware of Voat.co, and I think everyone would agree it is suffering from the "paradox of tolerance". How can you draw the line without stifling legitimate conversation? For example, I can't even post an honest question about the Israel/Palestine conflict on numerous subreddits due to the inflammatory responses that usually accompany them.
One option is the thing we used to have: private displays of virtue in distributed systems.
When Reddit is hosting racists, it's easy to blame Reddit (and, more specifically, easy to put pressure on them to deny those users a forum on Reddit's servers). When such content is on a distributed, federated system, it's up to individual service providers how much of a tolerance they'll have for speech others consider completely unacceptable. Some will choose not to host it, some will perhaps choose to host it (and others can choose to peer with them or not)... But in general, there will be more variety of opinions represented in what is "acceptable behavior" when there are many distributed systems instead of a few centralized ones. This does, of course, increase the complexity of accessing them...
... honestly, this probably mirrors societal development as a whole. In a town of frontiersmen, you barely have enough time to worry about what goes on in your neighbor's house. In a city, where you can hear your neighbor through the walls (and have deeply-intertwined interdependency networks through which pressure can be applied), you both have more reason to care and more tools to pressure someone to conform.
... all of that having been said, to take a step back from the mechanical and consider the philosophical, I'm not sure it's an inherently virtuous thing that my frontier neighbor could be out there doing Texas Chainsaw Massacre stuff and there's little structural pressure that can be applied to get him to cut it out...
If you build a platform specifically to house/attract people who were banned from typical platforms because they had a tendency towards promoting violence, then I would argue that you are very much enabling (possibly even encouraging) their behavior. I believe that is a pretty logical sequence, and a clear line to draw.
There are very few people who earnestly want an unmoderated place of discourse, because those serve very little functional purpose. Eventually most people will find something either irrelevant to their interests or personally repugnant presented to them and will go back to a place where there is some degree of moderation in place, so that they can consistently find things that interest and engage them. Why are you on HN and not one of these wholly unmoderated forums? Even curation of topics is a form of moderation, not to mention HN's strict approach to actually thoughtful commentary. The people who earnestly want a wholly unmoderated space are increasingly likely, depending on their desire for it, to be the ones engaging in something so boorish that it got them removed from moderated spaces.
Furthermore, there is no small amount of irony in you saying you'd rather talk about free speech right after telling someone what they can or cannot claim.
This is why I wish there was a megaforum where anyone can isolate themselves into smaller forums as they wish. Then no one gets deplatformed by the feds or corporations or whatever (unless they call for violence or doxx someone, something like that), and yet no one has to see what they don't want to. Still but a stopgap to foster better communities, alas.
I run an online community. Building a safe space is extremely difficult and full of grey area, online or offline.
There's a fine line between the freedom to express yourself and offending someone else. As a community moderator, choosing what is and isn't offensive is by definition subjective. It's an impossible task for a global, multi-cultural community. You end up with a vanilla message board where most interesting thought/debate is stifled. Reminds me of the campus "safe space" debate.
I'm pretty much of a similar mind. I don't want to be closed off from dissenting opinion. I'm pretty free speech and although I'm not in favor of platform blocking, I'm all for personal blocking, and/or creating niche communities, maybe ones not even federated.
I think there's a lot of value in seeing differing opinions even if you don't agree with them, as long as the discussion remains civil. That goes for myself as much as anyone else, I'm human, but I'll tend to cut off once personal insults are flowing.
I got the domain jump.red a while ago, to make a more conservative/libertarian community... may throw a lemmy instance up this weekend there. Won't necessarily fit the bulk, but I think there's room. Was also thinking on putting up one on the bbs/tech related domains I have.
No, and that's not what I'm asking of HN. I like communities that:
- Have well defined rules
- Enforce them equally
- Moderate transparently
That's it. I'm not asking for an unfettered platform for free speech. If one of your rules is "no hateful posts targeting specific people or people groups", then removing such a post is a well-justified decision.
Whenever you get a bunch of people together, you have to have rules, or you get the inevitable mentally ill person who starts to troll or tries to spread racism. Think 8chan or 4chan on steroids. It just has to be done. Hopefully the rules are permissive enough to allow reasoned debate between diverse opinions. However, I don't need to know your political leanings in all caps while discussing the latest news, so pitch that guy off. If I'm being rude or ranting, pitch me off.
Every single time someone tries something like this, it becomes a bastion for hate speech and everyone else is driven away. It turns out most people actually like a moderated forum. If you joined a book club and every meeting had people just screaming the N-word at the top of their lungs the entire time, I suspect you wouldn't stay at that book club long. Yet time and time again we see people learning this same lesson. It makes me wonder whether everyone who is extremely anti-censorship really does just want to scream the N-word in everyone's face all day, deep down.
That's great but I'm not talking about subreddits banning topics, I'm talking about the enforcement of acceptable views on topics that are already being discussed.
I agree with having safeguards against things like hate-speech, I disagree with how conveniently and broadly that's defined by certain people in order to push their social and political agendas.