>>If we are going to say that censorship is justified because you want to prevent violence or insurrections, we open the floodgates to some seriously bad things<<
If it were really just a matter of "the censors want to prevent the bad things", I might feel that it would be justifiable. But this type of thinking robs the broader public of its agency to reason. The broader public is not composed of petulant children, regardless of what the media outlets would tell you and what the behavior of Twitter might suggest (very few of those people display that sort of behavior in actual life, where consequences happen).
This "moral highgrounding", to protect us from the corruption of verbiage that may occur on one platform or another is a ridiculous idea to think that functional adults might need. We are not so weak as to be protected from ideas that "must not be named". Words are not magic, and people cannot be enchanted by their mere utterances. They can be enticed by them being outlawed and the verboten mystique that makes the forbidden seem so savory.
People may be uneducated, but it is impossible to educate them through silence and the silencing of ideas. These verboten ideas will get spread around, major platform or no. They'll spread without any resistance or discussion, and no disinfecting light of truth will be shone on them by debate, because they will be pushed down into the nooks and crannies, off of the popular platforms where the light of day could show them to be the BS that they are. They'll find their place in small groups and factions, I fear, and grow and split people apart. They'll stop people from talking to each other, and they'll make us a weaker country and a less educated one: a country that runs away from hard questions and conversations because it's easier to tell someone or some group to shut up and not talk about that than to address a topic with an open and honest conversation, with facts and dialogue.
> The social platforms aren't censoring you (or some idea you like) because they disagree with you. They are censoring because they are large social platforms, and ideas are POWERFUL and DANGEROUS.
> Let me be clear: if you run a large social network, you will be forced by inexorable circumstance to censor certain things, you will be forced to "arbitrate" on topics you have an (inevitably) limited understanding of, and it will all be really really shitty.
> (The alternative is just collapse of the platform, so I guess you do always have a choice - but then you're not a social platform anymore)
I'd love it if he provided an example of these powerful and dangerous ideas that must be censored. This is exactly the kind of language that any powerful organization scared of losing its power uses - it's all in the interest of saving lives... okay, what are the examples? Why is this not a matter best handled by law enforcement?
This just further convinces me that social media is a cancer to society.
> The surprise isn't that censorship would exist, but that we would support it.
Please don't speak for me.
> It turns out I won't defend, to the death, your right to copy and paste the n-word a million times and call it discourse, and death threats to anybody, especially journalists, are simply unacceptable.
And those are justifications for censorship how?
> The surprise is that online, people yell "fire!" into a crowded theater for the fun of it
Huh? What would be an example of that?! I can't really imagine how that analogy could possibly apply to online conversations (given that those usually don't take place in a crowded theater) ...
> I now find myself supporting limits on speech - not by the government, but the corporations running the modern day printing press.
So, you are convinced that censorship by corporations will overall have a better long-term outcome for society than free speech combined with efforts to change people's minds (instead of preventing them from saying what they think) and other approaches to combatting crimes?
> It turns out ideas are dangerous, but not in the way we thought - we thought they would set us free and railed against government censorship, but instead people were seduced by the idea that they were born kings, only to have that taken from them by the immigrants/Jews/homosexuals/media elite/Hillary/Trump.
Yes, I agree that ideas can be dangerous, and as far as I am concerned, in exactly the way that I thought. I think one of the most dangerous ideas ever is the idea that authoritarianism is a solution to anything. Some use that idea to come to the conclusion that you have to elect a strong leader that will deport all Mexicans. Some use that idea to come to the conclusion that you have to empower a private communication monopoly to decide what speech is acceptable in society. All do it with the expectation that the power they put in place will defend their interests. History tells us that that is a naive expectation.
> It turns out using the ashes of public education to shine the light of liberty ends up with racist public opinion.
Are you sure that you aren't confusing visibility with existence? That is to say: Has unrestricted online discourse led to people adopting a racist standpoint, or has it simply made visible what was there all along?
> I don't want this to sound wrong, but I perceive most of the people complaining about censorship to be hypocrites. They'd be the first to censor if they were in charge of these platforms.
> The reason they're upset is because they are not in control.
Do you notice how that immediately puts me, someone who is firmly anti-censorship, on the defensive? You seem to view me as an insincere or stupid bad guy.
> I think in some ways they are like the people scared of change, and then once change has taken place they are scared of not being in control.
As a general principle, I'm for the idea that we ought to allow free expression of ideas, especially online, since that is how ideas get shared. Life happens very differently for different people, even just domestically in the US, and when viewpoints come into conflict I'd rather understand what the other person has to say and talk about it. How is that possible when there's a party in the middle that gets to determine what you're not allowed to read? I'm scared of not being in control of that, and thus of censorship in general; it's not about having my version of the truth be published.
> I don't care how much "good reason" you think you have, it doesn't make censorship not censorship
Agreed. The problem isn't censorship, it's that people have this idea that censorship is inherently evil. I guess this stems from gross ignorance about how fucking stupid people in general are, and how they're taken advantage of by charlatans, to the detriment of society as a whole.
Watch these two (short, 2min) clips showing an interview of the darling of the GOP:
This is a video of a man who lied to get into office (about being a small businessman, about his accident derailing a Naval career, about being a constitutional expert, etc) and then used that platform to spread lies that led to insurrection and deaths.
It's simply irresponsible to let such filth continue to mislead people, and censoring him would be better for the USA, and better for the planet.
> "Finally, censorship is always bad, for a variety of well understood reasons that we don't need to repeat here. But in the case of some types of content, it has special dangers. When you censor a web site based on the extreme or dangerous views of its creator(s), you haven't stopped those people from thinking that way. You haven't made them go away. You certainly haven't stopped the people who hold those views from doing whatever else they do when they're not posting on the Internet. What you've actually done is given yourself a false sense of accomplishment by closing your eyes, clapping your hands over your ears, and yelling "Lalala! I can't hear you!" at the top of your voice. Pretending a problem doesn't exist is not only not a solution, it makes real solutions harder to reach."
I no longer believe this, not when cesspits of alt-right, racist assholes use such grandiose ideals to spread their hatred, which then bubbles out into the real world.
The ideas that good ideas will win, and that common sense and rationality will carry the day, are not really supported by what we see around the net. Instead the greater internet fuckwad theorem holds more true, and the spread of vile, violent ideologies is enabled.
Freedom of speech is a protection from government, but I think those providing speech platforms, such as hosting companies, should probably take more responsibility for what they propagate.
> We do not have the time, through our own actions or not, for quiet contemplation and retrospection into deep topics.
This is a problem, but how does censorship solve it? You would in fact need to solve that problem for all ideas in order to correctly know what to censor, which nobody has time to do, hence the problem.
> We have polarized ourselves into camps of right and wrong.
This is exactly what censorship makes worse. You say "not here, go somewhere else" to anyone who disagrees, everyone who disagrees gets separated out, and each camp loses exposure to the other side's arguments, so whoever is wrong will remain so.
> We are exposed to a staggering volume of information that all seems mostly the same with some slight differences but widely varying outcomes.
The world is complicated. Not everybody is going to understand everything. That isn't possible, but it isn't required. What you need are two things.
One is free and open debate, so that the people who choose to allocate their time to a given specific topic enough to come to an accurate conclusion on that topic are free to do so.
And the other is trusted thought leaders who consult a variety of domain experts on a given topic to determine the consensus of the experts and relay it to all of the people who don't have time to look at the specific details on that specific topic. This is what has completely imploded. The news media has lost their credibility by explicitly abandoning even the pretense of impartiality and chasing clickbait nonsense for ratings, so nobody trusts them anymore. As it should be when they're publishing things like that.
But you either do the research yourself (which nobody has time to do for everything) or you trust somebody else to do it for you. So now people need to figure out who they can trust to distill the informed consensus. But you don't do that by censoring the debate, you do it by being so consistently right that people begin to trust you over the lies of the blaring partisans. That doesn't come from force, it comes from hard work, which nobody seems willing to do anymore, which is why everything is messed up.
> We're concerned about speech that is being amplified in a way where there's a civic responsibility to ensure that the harmful messages are not being amplified.
Who gets to pick what is harmful, though? Why can we not choose our own gatekeepers, rather than having them foisted upon us by corporate oligarchs?
The big problem here is that there is no reason, and no precedent, to assume that this will be done in a transparent, consistent, open way. There will be no clear appeals process, and there's no reason to believe that the censorship won't be ideologically biased in favour of whatever markets best at the moment.
> it's controls on what can be said to large amounts of people
It's controls on what the working class can say to large amounts of people. The elite can still just go buy a newspaper or a broadcast media network, or start their own website, etc. and issue forth opinions on a grand scale. Regular folks will have to go through censors, which will end up being some combination of AI and outsourced farms of moderators in other countries or some other such Kafkaesque nightmare of business process.
> there's a civic responsibility to ensure that the harmful messages are not being amplified.
"harmful" is a social construct, and what is "harmful" to one group might not be to another. Which groups will be prioritized for this? How will you pick sides fairly without imposing a new dimension on whatever underlying group conflicts already exist?
The problem is that what you want to do cannot be done fairly. It can only be done in a draconian way, and the result will be an even wider fracturing of dialogue in a way that doesn't result in what you want, at all.
>I want free speech. And I want social norms. They're not incompatible.
Yes, they are. And the way you're trying to square the circle here is by creating an arbitrary distinction between censorship (which is the evil government doing scary bad things) and 'moderation' (which is private groups doing the same thing, but in, like, a good way).
There is no material difference between the two other than the size of the institution doing the censoring. Social norms, by their very definition, constrain and civilize people by telling them what not to do or what not to say, either explicitly or implicitly, so we don't all behave like a bunch of monkeys in the banana factory.
If there really were such a thing as an 'innate right to voice your ideas', censorship here on HN would be as vile as the government doing it, possibly even more so, because you at least elect the latter. Clearly it isn't vile, though, because without strict censorship discourse here would not be possible.
> the news media has for some time been working in concert with civil society organizations, government, and tech platforms, as part of the censorship apparatus.
Those who disagree that this is the case don't bother me. Everyone has a right to be wrong.
What worries me is how many people seem to think that it's a fine idea and that talking about it is at best unseemly.
> Since you seem to be more or less pro-censorship, I’ll assume that you’re a bit left leaning
What gave you that impression? I was raising questions to show that any censorship (or definition of civil discussion) is problematic.
It's a tough problem because bad ideas can lead to bad things, but stopping good ideas can lead to bad things too. If we all agreed on what's good and what's bad this would be easy, but we don't.
I'd much rather live with the consequences of free speech than live with the consequences of censorship. But neither side should claim its choice leads to a utopia.
>I believe it's becoming increasingly clear that merely allowing far-right dialogue online is essentially responsible for ease of radicalization, and that proactive censorship would be the best approach to forestall this.
Because of course, censorship only ever blocks the material you personally don't want others to view and never the material you personally would want others to view. I can see a lot of politicians, especially those getting elected nowadays, wanting to block any discussion of minority rights, LGBTQ+ rights, trans rights, drug reform, abortion, women's health, women's rights, climate change, non-Christian religions, etc, etc, etc.
edit: Note that this isn't a straw man argument. One only has to look at the censorship that happened in the US in the 50s and 60s (the Comics Code, the movie/television codes, McCarthyism, etc.) to see what is possible.
> censorship is the intent to prevent someone from speaking.
You can't mean that seriously. If I successfully form the intention to stop someone speaking, but fail to carry through (perhaps because I don't own a police force), are you really saying that I've succeeded in censoring them?
It seems to be the fashion these days to construct arguments based on redefining words. If "censorship" is a kind of intention, then there doesn't appear to be much wrong with censorship. You've redefined a word that is generally considered to refer to something bad, to mean something that isn't bad. That's sort-of OK, but it means that we have to put a glossary of definitions at the top of anything we say.
> There are lots of great ideas about how to tame problematic online behaviour.
Isn't that what all the censoring Europeans, Chinese, Russians, Saudis, Israelis, etc. say? All censors are trying to target "problematic" behavior.
Your comment is no different from what every tyrannical or oppressive regime espouses. The biggest problem with social media and the internet is that they are transnational. Countries and people that value censorship really shouldn't have a say in American social media. I believe they should develop their own social media.
I no longer subscribe to the idea that everyone should have free speech. I believe we should have free speech in America, and you should have whatever you want in your own country and on your own social media. The problem with American social media is that it is trying to appease everyone, and as a result it becomes a race to the lowest common denominator.
> > it sounds more like you’re fine with it because you agree with who they’re censoring...
> Yes…
Well, at least you’re honest about supporting censorship of the “right” people.
> I will worry about this when authoritative information about important topics gets removed from all platforms at the whim of one or two companies
By then it’d already be too late. Though again I doubt you’d care even as long as the “right” people disappeared from your view. You don’t seem particularly worried about authoritarianism as long as the content you’ve been told is “harmful” has been “reviewed” and summarily removed.
> In many scenarios that we experience every day, we would be better served by accepting censure over misinformation.
No. Not at all. I refuse your premise. Not only are you begging the question here (what scenarios? Your example was terrible, and I really don't think you can come up with a good one), I honestly worry more about those who believe this rhetoric than about the "victims" of misinformation.
Also, it's curious how those that so easily accept censorship never think that they will eventually be on the wrong side of the taser gun.
> I agree with you in theory. But in practice this is impossible. The human brain is physically unable to work everything through from first principles.
Good thing then that this is NOT WHAT I AM SAYING.
There is no need to "work through things from first principles". The idea is NOT to determine a priori what is "right" or "safe" and then make a binary decision. The basic idea is to decide what action to take (or refuse to take) by asking yourself: "What is the worst possible thing that can happen if the information I have is wrong? What are the odds that I am wrong?"
I'd suggest you get acquainted with Nassim Taleb and Joe Norman to understand better how to deal with complexity and uncertainty.
> In practice they have turned people against each other with very real and serious consequences.
Bullshit. There was no Facebook during the time of the Crusades. There was no Twitter during the Cold War, and no smartphones during WW1 and WW2. None of those would have been avoided if only we could have censored wrongthink.
On the other hand, there ARE video records of Tiananmen Square that have been successfully hidden from an entire country for an entire generation.
(Sorry for the harsh language, but when I start reading any kind of censorship apologetics, fighting instincts kick in. If you don't see how morally bankrupt it is to casually defend hellish things like state-sponsored censorship, I see no point in continuing the "debate".)
> Because the process itself involves heaping lots of social costs on people
I'm open to that argument. But I think this writer is really trying to have it both ways. They say both that censorship doesn't work and that we shouldn't do it because we are the ones who will be censored. It just seems like a kitchen-sink argument. Either it is dangerous to society because it works, or it is dangerous to a few because it is unjustly punitive.
I am not fully in agreement with what is happening right now because I believe it is fundamentally immoral to force someone into a society where they must provide for themselves and then to take away their means of provision.
But I also see our society as having been deeply censorious before this, and a lot of the people who are very upset now were fully willing to ignore that as long as it didn't touch opinions they value. Two wrongs don't make a right, but it does leave me a little suspicious of how genuine they are and how magnanimous they will be when the pendulum of discourse swings back in their favor.
> I'm not saying that censorship is just or desirable. Just that it works.
But it doesn't, unless it is extensive/complete. It just seems like it does because censored media is constantly reassuring us that the censorship is working, and that all reasonable people enjoy it.
You can't destroy ideas by censoring them from the largest outlets, you have to perpetually search out the smallest outlets (e.g. open everyone's mail) to make sure that these ideas aren't still infecting people, multiplying exponentially. You can't relax anywhere, for a moment. The only surefire way to kill or silence ideas is to kill or silence the people who hold them. That means you have to have systems in place to detect stray ideas, and processes in place to eliminate them.
> "By banning speech is to say, "I agree people are too stupid to make their own decisions. Let's make the world 'safe' for them and ban ideas I don't agree with."
1. People are stupid. Really, really stupid. Especially when given the means to surround and reinforce themselves with other idiots. See, for example, antivax and alt-med in general, chemtrails, and any number of ridiculous conspiracy theories that propagate through the web.
2. These people are not thinkers, they are not open to having their viewpoints challenged and reason will not move them from their course.
I agree, this whole area is massively subjective, and I'm not saying I have a solution. But the black and white idea that censorship is always bad, and the notion that we have a functional marketplace of ideas in which people re-evaluate their views based on reason... well, it's a fantasy.
> Personally, I think the word "censorship" should be reserved for governmental suppression but that ship sailed a long time ago.
This is the exact same argument used all the time to justify censorship. Most people agree about government censorship being bad and corporate censorship being fine.
However, what people miss is that this is government censorship. To think what's going on now isn't government suppression is completely naive. There are several examples of the US government influencing social media to have them censor. They admit it all the time.