
> Neither of these situations are ideal, nor are they catastrophic

No, no, this is actually quite catastrophic. Someone else is deciding what is and isn't acceptable for you to consume by default. It establishes the precedent that someone else knows better than you what content is acceptable for you to view. That's absurd.

Christopher Hitchens had a very poignant[1] segment about why the path to hell is paved with censorship, and I think it's worth a watch.

[1] https://www.youtube.com/watch?v=QIyBZNGH0TY&feature=youtu.be...




>We need to realize our problem is not with censoring people, it's with who does the censoring.

No, the problem is with censoring people.

You make it seem like censorship is a given. I disagree, and feel no need to have a third party prune unwanted ideas for me. This will seem like a strange idea to some, but I don't need a priest between myself and God. I don't need a doctor between myself and good health. Why would I need a censor between myself and information?

No need for governments, no need for corporations, just me and my silly brain will decide what to digest. What a concept!


We need to realize our problem is not with censoring people, nor with who does the censoring; it's with the people who try to normalize censorship.


> Not censoring an idea and arguing that an idea should not be implemented are not equivalent.

I'm not sure this is correct, but I'm willing to be convinced if you can expand a little more. I guess I see it as censoring the act of censoring.

> If this is true, i.e. reporting of the censorship increases the visibility of the bad idea, then it completely destroys the pro-censorship argument that it reduces the visibility of bad ideas. It would also immediately cease to be true if censorship ever became uncontroversial, in which case you couldn't claim that it's giving the allegedly bad idea a chance at a fair debate.

In this case, it is true. But YouTube censorship is not equivalent to societal censorship; it just signals that YouTube does not endorse Icke's ideas and does not wish to publish them, but people can still evaluate them somewhere else.

> Is somebody suggesting that YouTube should be prohibited from creating and distributing their own videos?

No, the suggestion is they should not be allowed to write their own terms and enforce them.


> things are going to need to be sacrificed

I say this for emphasis, not to personally attack you: whenever I read or hear this it makes me extremely suspicious, because it often doesn't come from those who will make the sacrifice themselves.

Censorship and other forms of regulating free speech are simply never a good solution. Perpetrators of lies will even hide behind it, creating a false image of rebellious heroism instead of facing public discourse and the consequences thereof.

YouTube is a private entity, but they correctly assess that they have a significant responsibility here. And curating social media content is not unheard of; Wikipedia manages it surprisingly well. Instead, they could for example certify and emphasize content that is grounded in science and verify the creators of such content. This is off the top of my head, but there are certainly other people with smarter ideas that don't involve straight-up censorship.

Talk is talk. Actions should have consequences, which is why we have rules and agreements in the real world. I also very much doubt that censorship achieves what it is supposed to. People don't suddenly become enlightened (depending on context it might be the other way around) when you ban that stuff. It can easily become worse.


> the massive risk remains censorship by state actors

The state actors are, to an extent, elected by the people. The big corporations have no accountability.

> I suggest we in the short term we encourage people like YouTube to censor more evil idiocy

You say that because, on this narrow issue, you agree with YouTube.

What is the short term?

Have you thought through all the consequences? And even then, things will happen that you have not accounted for. Everyone is against evil and idiocy; the question is who decides what counts.

The media's propaganda can be and is being used to brainwash people, and the results are plain for all to see.


> Thinking you can filter it out is a fools errand.

In that case someone should build a streaming platform where there is no censorship, and let the market decide.


> No, the problem is with censoring people.

No, it’s who/when/where the censorship occurs.

If you want to come into my living room and shout Nazi propaganda, your ass is getting censored and banned from my house.

If you think that level of censorship is a problem, then we have a fundamental disagreement, and the bad news for you is 95% of people will disagree with you.

If you agree on that level of censorship, then we’re just arguing where the who/when/where line should be.


> Now whether or not this is censorship is subjective.

No, it's absolutely censorship. Whether or not this is the kind of censorship that is undesirable is subjective, as is any other evaluation of desirability.


> Are you suggesting that we should force people to watch things they don't agree with?

That is explicitly what I'm avoiding saying, because obviously that's an illiberal and unworkable idea. In fact, I'm not trying to mandate anything at all. Not every observation is a criticism, not every criticism is a threat.


>No, there is certain content that is forbidden by law in some countries, for example Holocaust denial, Nazi propaganda and other racist or otherwise discriminatory content. I do wonder why Youtube and Twitter don't disallow that kind of stuff by default, there's no reason for giving this kind of content any platform even if it's legal by law in the US.

I think you are confusing things. Simply because you desire to forbid others the choice of whether or not to view the content does not in any way mean that the act of suspending a channel doesn't accomplish this. The reason it is a horrendously dangerous, terrible, horrible idea to accede to a desire to hide 'dangerous' information would, I would hope, be easily seen by now, but it's obvious that this isn't the case. People seem to have forgotten that the Nazi newspaper was banned by the government of Germany after the Beer Hall Putsch. They seem to have forgotten that prior to that, the Nazi party was some guys hanging out in a pub being angry, and once the disliked establishment banned their paper it became the thing every German 'had to read'. They seem to have forgotten that it was the exact kind of censorship they seek to enact today that catapulted Hitler and his party into the forebrain of every downtrodden German.

The only legitimate and adequate response to a bad idea is a better idea. Full stop. You cannot ban an idea. You can, however, drive it into hiding where it will see no opposition, no correction, no competing evidence. At that point, you guarantee for yourself the creation of an insular subculture that will draw the disaffected and those opposed to the establishment. It is a Bad Idea. It has always been a Bad Idea. There has never been a happy state with heavy censorship.

Had we had such censorship in the 1960s, interracial marriage would still be illegal. Had we had it in the 1800s, slavery would still be legal. Every progress of mankind begins with spitting in the face of the established order. And that Nazi morons are clearly not a part of such progress is of no great import. You (generally, in the 'your society' sense of you) will be incapable of judging the next step of progress and this is guaranteed. In the United States, there is a legal concept called Prior Restraint on free speech and it is very aggressively avoided by the courts, as well it should be.

You are incorrect that various fringe groups would be smaller if YouTube became a bastion of censorship, and also that the platform owner is responsible for problems that you believe resulted from ideas being available. The existence of the fringe groups, their sizes, and the problems are all a result of widespread anti-intellectualism and a rejection of critical thinking and reason as an acceptable means of determining one's beliefs. Things like the anti-vaccine movement in particular grow far more through word of mouth than through online resources. Zero of the anti-vaxxers went online to seek information about vaccines, found anti-vaccine ranting, and then decided not to vaccinate their children. Instead, they heard about it from mothers who "just knew" that vaccines gave their kids autism. Or they listened to their president talking about how there are "just too many" with no basis. The idea that vaccines are harmful, that letting strangers, intellectual strangers who don't care, stick steel needles into their children and inject them full of chemicals and dead viruses was dangerous, felt right, and that is literally the only thing in the world that could convince them of any position.


>We see this today in those who attempt to weaponize the idea of censorship when society attempts to minimize the destructive effects of intolerant views.

Today, I think you are making a mistake and reversing the order.

It is the intolerant that are weaponizing censorship to minimize the destructive effects of the tolerant.


> how our civil liberties are being rapidly threatened

It's not that they're being threatened, it's that they're actively being taken away. Society may react to this loss of liberty by restoring it through alternate approaches. Or, it may not.

> Removing firearms videos from YouTube seems perfectly reasonable to me

I mean yes, obviously there are plenty of people who want censorship for one reason or another, otherwise it wouldn't be happening. So one's own comfort with censorship is not evidence that censorship is not happening.

> Is a company taking steps to distance themselves ...

It's what they're doing, not why. Despite the trope, evil never shows up twirling a mustache.


>It seems to me that the fact people are self-censoring is a problem, regardless of what their views are

You really think there are no views that should be self-censored? There exist many views that I believe are not worthy of being aired in public.


> I am not sure what other options are there, but censorship is the worst option.

Really. That's the "worst" option? You can't think of worse things than that?


> The interesting question is: do you think that the majority of people don't see this side of the internet because of how effective the centralized control and policing is, or just because the majority of people are not interested in seeing this content in the first place?

Both! Most people don't want to see beheading videos and would be very upset if one came across their Youtube recommendations. Fortunately, YouTube's "CENSORSHIP" is pretty good at not showing such videos (even though they're completely legal content!)

However, there is a significant long-tail pool of people who are totally into watching such abhorrent content (e.g., 8channers), and they could easily cause deeply offensive content to find itself in the unmoderated "Democratically trending" video feed, as is demonstrated in the Flare example walkthrough.

Maybe that's your point? Platforms like these will necessarily be used pretty much exclusively by people who like or will tolerate seeing extremely offensive content because everyone else will be put off by the occasional display of horribleness.

Libertarian "DON'T CENSOR ME, BRO" havens like 8chan, kiwifarms, daily stormer, etc. already exist. They're not particularly popular, when compared to the likes of Twitter, Facebook, Instagram, etc. But they're certainly popular enough to draw millions of users. And every one of them would be delighted if they could post their inflammatory nonsense on Facebook or YouTube to reach a wider audience. And Facebook and YouTube have wider audiences because they moderate content.

In a previous life I worked building popular social media apps that included user-generated content. And I saw first-hand how horrific content moderation is. The shit people post to social media sites is as vile as it is vast. I'm certain most "anti-censorship" people's opinions would be changed if they'd watched an actual content moderator do their job for 30 minutes.


> What rules do we put in place that they have to host some of this content? Blatant misinformation, vague encouragement of violence... hate?

Why do they have to be rules? Why can't we have a culture of tolerance? Robots aren't running these companies. They're people, just like you and me. They can choose to be tolerant.

Tolerance is about tolerating the most horrid things. Not because it's nice or those things have some value. They're horrid, they have no value. It's because the alternative is worse. Because censorship is worse. Because when a person is censored, when they can't speak, they resort to violence and through censorship, you encourage violence. And violence can't be censored. It can only be suppressed with further violence.


> his content proliferates

This is the root of the problem. The fact that his content is on YouTube does not imply that it will proliferate.

> How are we going to solve it

Critical thinking skills are apparently undervalued. This is a problem that is hopefully solvable by education and public debate. Even if not, a situation in which a small part of the population believes in David's ideas seems preferable to me to one in which a small group of (non-democratically chosen) people decide what is acceptable content and what is not.

> It's implied in the concept of a marketplace of ideas that some ideas will have value over others.

I agree, but different people will have different definitions of value.

> And so it seems implicit to the concept of free speech that one should have the freedom to accept or reject an idea, or try to disabuse someone of an idea you deem fallacious.

I agree, but censoring someone goes much further than disabusing them. Too far, in my opinion.


> Personally, I think a better perspective is to reduce and/or avoid censorship altogether, especially as far as internet infrastructure is concerned.

Yes! Systems ought to be designed such that censorship is impossible. Not just that we won't censor because we're so liberal and all. No, just plain uncensorable by design.

Edit: language


> yes, but we do prefer to live in a culture that doesn’t remove access to an audience based on a disagreement with the content.

That’s not the US I grew up in. The government has blocked access to content via public airwaves, different stores “block” access to various products, movie theaters “block” access to movies with certain ratings, etc.

I don’t expect a religious publisher to start publishing literature it thinks is harmful, but I wouldn’t say it’s “blocking” anyone.


> Seems like a slippery slope towards government censorship.

Luckily, when it's done by the government, that's actually unconstitutional and you have recourse. When it's done by a private company, you have none.

