
> This suggests the problem is my lack of ability to understand, when the problem is that there is no longer a common definition.

The only misunderstanding seems to be yours. We can clearly identify white nationalists as actual Nazis, no redefinition required.

> Censorship can be conducted by a government, private institutions, and corporations.

No one is saying that corporations can't censor things. I am saying that this is not censorship. Facebook deplatforming you is not the same as censoring you.

> maybe the principle is race after all?

The only thing we're discussing here is the deplatforming of white nationalist content from Facebook. Bringing other races or religions and their extremism in is totally tangential at best, and actively disingenuous at worst.

> Are you well read on the topic, or is this just a casual opinion that sounds about right?

There are many examples in the news recently. Here's one I read just the other day: https://www.washingtonpost.com/politics/2019/03/22/trumps-rh... White nationalist speech creates hate crimes; the correlation appears clear to me. Of course, it also seems logical to me without any sources, so sources are just a nice bonus.

> Can you think of any possible undesirable reactions (regardless of the soundness of the logic underlying the motivation) to this type of policy?

Essentially your argument seems to boil down to: we shouldn't create good policies because crazy extremists might have bad reactions to them. But crazy extremists have bad reactions to everything. We should be trying to deplatform and deconvert extremists instead of catering to their sensitive tastes. If that offends them and causes them to lash out, that is unfortunate, but they'll do that anyway. At least if they do it over this, there might be fewer of them in the future.

> If everything is random, why even bother with policies like this, or any at all?

While you can't control how people react to what you do, you can do what you believe is right and hope it has a good outcome in the future. Facebook apparently agrees with me.




> The only misunderstanding seems to be yours. We can clearly identify white nationalists as actual Nazis, no redefinition required.

a) With no standard definition of the phrases?

b) Some people may be able to, but can Facebook accurately and fairly police speech at scale, given that many of these terms have no standard accepted meaning? Remember, we're not dealing with people wearing white hats at a rally; we're dealing with speech, which is subtle. Sure, everyone can agree at the extremes, but closer to the middle it gets complicated. It's like "I know pornography when I see it."

> No one is saying that corporations can't censor things. I am saying that this is not censorship. Facebook deplatforming you is not the same as censoring you.

https://newsroom.fb.com/news/2019/03/standing-against-hate/

"Today we’re announcing a ban on praise, support and representation of white nationalism and white separatism on Facebook and Instagram, which we’ll start enforcing next week."

I'm not saying they don't have a right to do this, it is their platform after all. I'm not even saying that it is necessarily or certainly a bad idea. You and I disagreeing on this is fine and healthy. But how can you interpret "you can't say <x>" as not censorship? I asked you earlier for the definition of the word you're using, and you didn't reply. I ask again: please tell me the definition you're using for "censorship", with a link to the source.

> The only thing we're discussing here is the deplatforming of white nationalist content from Facebook. Bringing other races or religions and their extremism in is totally tangential at best, and actively disingenuous at worst.

No, that's the only thing the article is discussing. The HN discussion, and our thread in particular, are discussing broader principles of free speech and fairness, possible downsides of these types of decisions, etc.

>>> I believe that social media is helping incite these incidents, so I do legitimately believe the number of people participating in hate crimes will be reduced as a result of this.

>> A perfectly reasonable theory, what evidence of this do you have? Are you well read on the topic, or is this just a casual opinion that sounds about right?

> There are many examples in the news recently. Here's one I read just the other day: https://www.washingtonpost.com/politics/2019/03/22/trumps-rh.... White nationalist speech creates hate crimes; the correlation appears clear to me.

"To test this, we aggregated hate-crime incident data and Trump rally data (a different variable than our topic of conversation, but again, no need to bother ourselves with precision or details) to the county level and then used statistical tools to estimate a rally’s impact. We included controls for factors such as the county’s crime rates, its number of active hate groups, its minority populations, its percentage with college educations, its location in the country and the month when the rallies occurred. We found that counties that had hosted a 2016 Trump campaign rally saw a 226 percent increase in reported hate crimes over comparable counties that did not host such a rally. Of course, our analysis cannot be certain it was Trump’s campaign rally rhetoric that caused people to commit more hate crimes in the host county."

You can find a correlation in data for anything you want to support, see: http://www.tylervigen.com/spurious-correlations

Now, stating that doesn't prove your claim is wrong; I'm just pointing out that the one piece of evidence you finally provided is little more than an op-ed piece. We should be collecting more and better data on these things if they're important, so we can set evidence-based policy.

Erring on the side of caution (as Facebook is doing) is fine, but there's no need to tell lies in the process as far as I can see.

> Of course, it also seems logical to me without any sources, so sources are just a nice bonus.

This sentence explains this conversation, as well as the general state of political conversation in 2019: facts and evidence are considered completely optional.

> Essentially your argument seems to boil down to: we shouldn't create good policies because crazy extremists might have bad reactions to them.

No, that is one small point of my overall concerns (my "argument", if any, is that you won't provide any evidence for your claims, or you assert that none is necessary when curtailing general free speech), and it's an important point. People are becoming incredibly politically polarized, to the degree that it is causing strange behavior. Some people lose the ability to engage in logic- and evidence-based conversations on particular topics; others shoot up churches. Shit is pretty seriously fucked up and doesn't seem to be getting better. Being cautious and thoughtful about non-obvious risks seems like a good idea to me, not something to avoid.


> common misuse of the words "literal", "Nazi", "alt-right"

It's not misuse if it's common usage, it's simply usage you disagree with. Obviously when people say "Nazi" they don't mean "members of a 1930s political party," they're making a statement about something else. If you find the usage of these words confusing that's on you, not the words or the people using them.

> Have you no concern for possible second order effects of poorly thought out and biased censorship?

Not particularly: if those things happen, let's protest those instead of wringing our hands and worrying about the poor slippery slope.

> And when I say censorship, I'm not referring to the First Amendment.

Deplatforming is not censorship. You are not guaranteed an audience for your speech. White nationalists can still say whatever they want, and they can even say it legally. But they can't say it on Facebook, and it's not Facebook's responsibility to let them any more than it is to let pro-ISIS people post their content.

> If the incidents are the problem, why is the enforcement (and news coverage) racially biased?

This is irrelevant to the discussion at best and disingenuous at worst. Facebook can ban multiple kinds of content, including pro-ISIS content (and it does). White nationalism being banned is what we're discussing here.

> How sure are you that these incidents are caused by incitement on social media?

I believe that social media is helping incite these incidents, so I do legitimately believe the number of people participating in hate crimes will be reduced as a result of this. Do you really think that people aren't radicalized by communities on the Internet? Communities that might exist on Facebook?

Of course, I also think it doesn't matter whether that's true or not. No discourse of value is being lost by banning white nationalism, and Facebook already bans a plethora of other (totally legal) content that they don't think is appropriate on their platform. If you want to read white nationalist speech, it still exists. This merely means you can't read it on Facebook anymore. How this is a tragedy I simply cannot understand.

> Is that so, or might that only be what you presumed or were told?

Yes, it's so. Read the manifesto.

> Is this worth considering when setting policy?

No. People can (and will) do anything in response to anything; maybe white nationalist groups will make Zuck their #1 enemy over this. You can't control the actions of other people, especially crazy people, so catering your policies to them seems absurd.

> For the sake of improving the quality of discourse, can you sense any legitimate concerns in what I'm saying, or do I seem to you like yet another ignorant racist, little different than those who might frequent Charlottesville rallies?

I don't really see many legitimate concerns arising from banning white nationalism from Facebook. As I've said multiple times, Facebook already bans other kinds of speech they disagree with. I get defending free speech is a thing, but Facebook is not the government and they can ban whatever speech they like. They don't control the entire Internet or even a majority of it, and they can't prevent you from hosting your content somewhere else, so... I just don't see what's being lost here, except white nationalist content on Facebook.

And even if they do go ahead and decide to ban other content, I mean, A) let's cross that bridge if we come to it and B) who cares? It's just Facebook, they don't control the entire Internet. Make another site if you want to discuss what they deplatform.


> Do they define ISIS?

ISIS is (or was) an explicit political group, an organized proto-state. It literally called itself a State. It was very clearly defined.

> As for the rest of your post, I'm baffled that you somehow think that "preventing people from trying to kill people" is somehow a negative thing that should be criticized.

"Preventing people from trying to kill people" was already against their terms of service. The whole point of this announcement is to announce the fact that Facebook is expanding their prohibited categories beyond "preventing people from trying to kill people".

> I also think that if I were consistently being mistaken for a white supremacist, I'd think about why that were happening, and probably try to distance myself from the things that were causing those mistakes. Your view appears to be that it's better to just prevent people from voicing their confusion.

As I stated earlier, multiple people have told me that desiring stronger border security and building the border wall is white nationalist. Others have told me that supporting any expansions of immigration restrictions is white nationalist. A few have even told me that opposition to affirmative action (even among groups in which Asians are the ones primarily opposing it) is white nationalism. I did not get the sense that they were saying these things ironically or in jest. Are these views tantamount to "preventing people from trying to kill people"? I live in San Francisco, which while not exactly the same environment as Menlo Park, is still in the same metro area as Facebook's HQ. There's a significant possibility that folks with similarly liberal definitions of white nationalism exist at Facebook.

Also, the way you say you would "probably try to distance myself from the things that were causing those mistakes" really makes it sound like the chilling effect this has on discussion is a feature, not a bug. A significant number of people suspect that tech companies' expansion of prohibited speech is becoming a means of partisan manipulation. Statements such as yours likely reinforce this belief.

> Your view appears to be that it's better to just prevent people from voicing their confusion.

I am not trying to prevent anyone from voicing anything. The issue is that a significant number of people do confuse (or deliberately label) mainstream political views with "white nationalism" and "white separatism". Thus, Facebook's banning of these things is very likely to be seen as - and perhaps actually be implemented as - a means of suppressing legitimate political discussion. It probably would have been better to keep their prohibited categories the same and perhaps more aggressively police certain circles, keeping their policy - as you put it - at "preventing people from trying to kill people".

There's already enough suspicion that Facebook is acting in a partisan manner, and more stuff like this is going to inspire ever greater calls for stiffer regulation of tech companies, and perhaps even for breaking them up. This announcement seems like a shot in the foot for Facebook.


> I haven't thought about Neo Nazis in 20+ years,

Then may I suggest doing some research on what they have been doing in the last two decades? In particular the current tactics[1] they are using?

> banning their sites

A private business refusing service is very different from a government ban. I agree that the extreme centralization of the internet has put a troubling amount of power in the hands of very few businesses. However, this is orthogonal to the fact that neo-Nazi groups have taken advantage of a lot of people ignoring them for 20+ years.

> It's not the people who tolerate their speech and ignore it that gives these people's ideas weight

If everyone were actually ignoring the neo-Nazis, we wouldn't have a problem. Instead, we have people tolerating their speech and arguing against it within the neo-Nazi framing. They have been very successful at constructing that framing, with blurred language and rhetoric designed specifically to sound reasonable. So now we have people unintentionally defending fascist ideas that have been carefully camouflaged as merely "distasteful ideas".

[1] https://twitter.com/ContraPoints/status/896823834338263041


> because this is actual Nazis.

Yes they are, but look at how the term 'Nazi' is being thrown about with abandon these days [0].

Once you have established that it's ok to ban/silence Nazis, then all you need to do to silence your opponents is brand them as a Nazi.

That is not hypothetical, and is something that is actively happening right now.

> it's people who are declaring their allegiance to a group that literally killed millions in the name of racial purity.

Where do we draw the line? Do we kick people off the Internet if they declare allegiance to communists - a group that literally killed millions in the name of ideological purity?

> We need not and must not be tolerant of the intolerant.

Actually, we must. The only speech worth defending is offensive speech or speech you don't like.

No one tries to stop you from saying nice things that they already agree with.

0: https://www.youtube.com/watch?v=tWFMUIP3lHo


> How do you solve the problem of white-nationalists recruiting on this webpage? Well, you ban hate-speech.

> The problem is that White-Nationalists can simply... go to Facebook... or Youtube... as recruitment grounds. The big websites aren't cooperating yet. This needs to be a systemic top-down effort, unified across the major websites.

Sure sounds like you're saying YouTube and Facebook host white nationalist content to me. Saying that YouTube and Facebook are lax in kicking off white nationalists is still saying that they let white nationalists on their site.

It's only after I challenged you to back up this claim that you pivoted to talking about comments.


>We must not tolerate intolerance, because that kills tolerance overall.

I see people repeat that, but I don't see anything to indicate it is true. Quite the contrary: it would appear that one of the main factors in the growing white nationalist population is censorship and other attempts to silence people (calling them some thought-terminating cliche like "racist" or "bigot" when they aren't saying anything racist or bigoted). When you say that someone's words are dangerous and they must be silenced, you imply that their views are so overwhelmingly convincing that large numbers of people will be swayed if they hear them. Since you can't actually silence them, all you do is push them somewhere you don't control, where they then point to your censorship on the platforms you do control as evidence that you know they are correct and fear the truth. You end up portraying yourself as the evil authoritarian empire, and the people you are trying to silence as the valiant freedom-fighting rebels. People are sympathetic to the rebels, not the empire.

>When an ideology is built on intolerance, it must be opposed and deplatformed at every turn.

But that very ideology is built on intolerance. I presume you are going to delete your account so as to "deplatform" yourself?


> you think that people being banned on social media is a greater threat to a fall to authoritarianism.

That's a strange reading that seems to assume free speech goes hand in hand with authoritarianism, a laughable notion. Free speech is the antithesis of authoritarianism. No authoritarian regime has ever allowed anything approaching freedom of speech within its jurisdiction, and some even enforce their restrictions far beyond it. As such, what you think is wrong.

> > To label ideological opponents with the label of a mental illness simply for disagreement is a poor show.

> I did not do this.

A phobia is a mental illness; you call your ideological opponents transphobes, so you did do this. It's pathetic name-calling.

> I hope that you also donate your time and money to those who suffer at the hands of people spreading hate.

I'm here right now spending my time doing just that because you quite clearly do hate your opponents.

> Nazis are shit. You are treating me like a child.

No one here has claimed that Nazis aren't shit, but you're acting like a teenager high on self-righteousness who thinks proclaiming that is some kind of insight for the rest of us. It is childish.

As I wrote, HN isn't the place for this, and one reason I bring it up so often is that far too often of late I see people, like yourself, treating it as such. I make no apologies for wanting the standards to remain high.


>This is exactly how ethnic nationalism in America becomes normalized.

Sounds like you're preaching censorship, both of speech and of research. Is this really the position you want to take?


> The Nazis can be identified as a Hate Group. They followed the Fascist ideology (among many other ideologies).

But the Nazis, as an organization, no longer exist. They used to control a country and have a government. Nazi ideology may still exist, but the Führer is dead.

> Planned Parenthood can be identified as a group. They don't seem to be a "hate group" however.

So this is the other problem. If you want to claim they're a hate group, you point to age as a protected class, and then you end up with everyone shouting at each other, because whether that's true or not turns on the very point of contention. In particular, you now have to make a censorship decision where the "accused hate group" only avoids censorship if the decision-maker agrees with their ideology. That's... bad.

> Lets cut to the chase. Do you want to call #BLM or #MeToo a hate group?

I want not to have censorship, and I find a good way of doing this is to insist that censorship decisions be made on a principled basis. Because when people realize that the rules they're proposing would also be applied to people they like, they stop liking those rules.


> I have seen zero proposed solutions or ways of combating the rise of white supremacists

No, you have only missed them. The solution is exactly free speech, and more of it. These days people are often suppressed when they voice a concern that touches certain topics: immigration, religion (particularly Islam), anti-semitism, rapid demographic change, and many more. Media suppression and flat-out censorship do not put a stop to the opinions they target. They just drive them away, to hidden forums, to non-mainstream platforms, to the dark web.

So instead of shunning white supremacists from the media, tolerate them and challenge them with arguments, explain how they are wrong, educate them, ridicule their nonsense; but never censor them.


> "the [ethnic group] must be exterminated to ensure white survival"

Nice example of speech that is currently not allowed on Twitter.

> [1] https://www.adl.org/sites/default/files/documents/pyramid-of...

An insane slippery slope, from "Non-inclusive Language" straight to genocide! This is laughable...

> they're comfortable white men

What a weird thing to say. You're not like the other comfortable white men, that's what you mean right?

> Typically, this is the adolescent (or frozen adolescent) view, where they don't have a theory of rights much beyond "YOU'RE NOT MY DAD YOU CAN'T MAKE ME".

Belittling people does not make you superior. It makes you sound full of fear and resentment, which by the way is still not justification for pro-censorship positions.

> Regardless, anybody who takes a maximalist position on free speech, by which I mean an expressed or implied view that it trumps all other rights, is in effect pro harassment.

Yes, and anyone who is pro-cars, is in effect pro-car accidents!


>First, there's a glut of people insisting they're only defending "free speech" and not supporting people such as those at that rally.

I'm sorry, but I'm really having a hard time understanding your reasoning here. Are you really suggesting that the majority of people on HN who are defending the right to free speech of the neo-Nazis we saw in Charlottesville are doing so because they actually agree with the message those neo-Nazis and white supremacists are spewing? If this is what you believe, then please just come out and say that anyone defending this type of speech is in fact a neo-Nazi. You talk about mechanisms that undermine an honest debate, but in the very next sentence you essentially argue that anyone in support (I'm adding an edit here to clarify: I mean support for the right to this kind of speech, not support for its actual content) of this kind of objectionable speech is a Nazi in disguise. That certainly seems like undermining the debate to me. I don't like this kind of character assassination in disguise. Either be willing to call people out as Nazis and white supremacists, and back up your assertions, or give them the courtesy of assuming they might be taking a stand in defense of a principle they actually believe in regarding speech, even in cases where they disagree with the content of said speech.

I personally don't believe a reasonable person can be a neo-Nazi, and thus I have very little interest in arguing or engaging with these types of people. When it comes to free speech I tend to fall on the side of having fewer restrictions on speech, although even here my views aren't really set in stone and do change on occasion. With that being said, unlike my views on Nazism, I do think reasonable people can disagree on the issue of where to draw the line on speech. Again, while I personally fall into the fewer-restrictions camp, I don't think it's unreasonable for someone to want restrictions on certain types of speech, and thus I don't automatically assume those arguing for these restrictions are authoritarians in disguise. If you think this makes me a neo-Nazi, I would prefer you just come out and say it; this is of course in the interest of "honest debate".

Edited for spelling and grammar.


> Not only can I not see how this supports your assumption that suppression works, it actually demonstrates a positive relationship between suppression and these sorts of movements.

You've got cause-and-effect backwards. More and more people are deplatforming as they realize that white-nationalism is a bigger problem than they once thought.

-------

Deplatforming works. Let's not look at white nationalism; let's look at "Elsa-gate" instead. Children were watching creepy "Elsa" videos (from Disney's "Frozen"). How do you stop this? You ban them from the site.

Bam. Children don't watch them anymore, because those videos are banned.

-----

How do you solve the problem of white-nationalists recruiting on this webpage? Well, you ban hate-speech.

The problem is that White-Nationalists can simply... go to Facebook... or Youtube... as recruitment grounds. The big websites aren't cooperating yet. This needs to be a systemic top-down effort, unified across the major websites.


> There is no guarantee that Nazis will not again take advantage of a population that is used to follow the state's lead, this time to silence you, or that another group won't do the same.

And that's fine. This argument keeps being brought up over and over again, but if this happened, nothing would please me more, because it would remove the ambiguity from the situation.

> It's frankly incomprehensible to me how anyone who doesn't support totalitarianism can look at history (or present day) and dare to normalize any amount of state control over speech.

I mean, the fact that you can look at the Holocaust or Jim Crow and think that any amount of tolerance should be shown to Nazism or white supremacy is beyond incomprehensible to me. But that's the issue.

For a certain type of person, any legislation curtailing 'free speech', even if that is done to stamp out white supremacy is an existential threat to freedom in America.

But to people of color, allowing white supremacists to spout intolerance publicly with no repercussions (other than maybe getting 'cancelled') is an existential threat to THEIR freedom in America.


> Why would they shutdown Muslim websites and accounts though?

That's a good point; one I hadn't thought of.

I hope we can still agree, though, that all people who advocate for white interests are Nazis.


> Are "literal" nazis only who this new rule will be applied to?

White nationalists are literal Nazis, so... yes?

> Could you possibly provide any examples of such statements on relatively mainstream sites, and also note whether you encountered it organically or found it via a google search?

Were you asleep for Charlottesville or all the coverage of it? White nationalists literally marched with torches chanting "blood and soil" and "we will not be replaced." What do you think those messages mean to non-whites? What do you think they're advocating for when they say those things?


> It’s very alarming they’re picking a side. It’s one thing to censor violent/hateful speech above some arbitrary threshold...

Here's a quote from Jack Renshaw, one of the people they've banned:

> Hitler was right in many senses but you know where he was wrong? He showed mercy to people who did not deserve mercy ... As nationalists we need to learn from the mistakes of the national socialists and we need to realise that, no, you do not show the Jew mercy. [0]

They're not banning your kooky ol' conservative grandpa; these are actual hateful people whose movements have been intentionally using Facebook to spread racism and misinformation.

Facebook has become widely known as a site full of misinformation, and I can't blame them for wanting to change that reputation. If it were my web forum, I'd certainly do whatever I could to change that if my site were known as a racist-infested misinformation brand.

[0]https://en.m.wikipedia.org/wiki/Jack_Renshaw_(far-right_acti...


> Turns out not many people like talking to or platforming fascists.

Ironic to call the group you forcibly censor “fascists”. Is the alt-right even into promoting censorship of opposing ideas?

