
I think deplatforming is exactly what we need here: neonazis and white supremacists are reliant on easy-to-use, discourse-amplifying platforms. The more difficult you make it for them to gather virtually, the more you break the spell that keeps them together. Before social media, a physical neonazi or white supremacist chapter had A LOT of work to do to radicalize people, and was very easy to police and keep under control. With online organizational tools, radicalization became easier and harder to patrol. Do you think a random Arkansas soccer mom would ever have joined anything as crazy as the QAnon conspiracy if she had to attend a QAnon chapter in person instead of participating in an online forum while doing the laundry?



White supremacist propaganda and neonazi recruiting are what cause radicalization. Deplatforming those people is a good idea, yes.

Contrary to what left-wing YouTube and a few researchers from liberal universities would like you to believe, though, Ben Shapiro, Jordan Peterson, Stefan Molyneux and conservative news sites are not radicalizing people. There's no evidence for that.


If someone is threatening to commit violence, there are already perfectly effective legal ways of dealing with it. If anything, when you deplatform them, monitoring that speech becomes significantly harder, while they become more radicalized (and if Zuckerberg did not exist, anti-semites would have had to invent him; but he does exist).

To go back to the original article, though, it is interesting that they are banning white nationalist content. Does that mean any other nationalist content is just fine? And can I start an Even[0] separatist movement on FB, calling for eradication of the whitey? We'll start with Russians, of course, but after them we're coming for you! /s

If I didn't have better things to do, a conspiracy theory of FB (they did elect Trump, as everybody clearly knows) being in cahoots with Stormfront and specifically calling out whites to stir resentment almost writes itself here.

[0] https://en.wikipedia.org/wiki/Evens


It is crazy to think that ideas can be extinguished. These purges will serve to further concentrate and radicalize the adherents to these ideologies.

edit: I may have been wrong; it seems deplatforming might work. [0] TIL! I had often read that echo chambers exacerbated radicalization, but it seems more research has yielded better data that overturns that conclusion.

edit1: it seems radicalization is more nuanced, and communities may need different handling depending on their composition (e.g., introducing someone to an ideology is different from pushing a person already exposed toward a more extreme view). Found this video informative [1].

[0]: https://www.vice.com/en_us/article/bjbp9d/do-social-media-ba...

[1]: https://m.youtube.com/watch?v=HY71088saG4


But the country was full of sects long before the Internet was even a thing. The Oklahoma City bombers are a good example of terrorists who were radicalized by offline cults. The only difference is that now anyone can see it online, while before it was hidden unless you looked for it.

Take your racist uncle: before, people just heard him rant at the Thanksgiving table, but now the whole world can read his rants on Facebook. The racism was always there; it is just harder to ignore today.

So I'd like to see proof that the Internet actually helps radicalize people. It is true that it makes it easier to find radical groups, but it also makes it easier to find other views which would hinder radicalization. Before, if you made a few radical friends they could easily become your whole world! So my naive assumption is that both forces roughly cancel each other out, making an open Internet mostly neutral in terms of radicalization.


> “Most people engaging with extreme right ideologies, memes, views are doing so online and often not part of a formal network, organization or political movement.”

This seems like a problem that could be solved by giving right-leaning people a platform to organize peacefully. Deplatforming and segmentation leave right-wing activists without good examples of peaceful demonstration. There are peaceful right-wing political activist organizations, but they get deplatformed by intolerant anarchists like this article's author, and then others have trouble finding peaceful outlets for their viewpoints.


They are definitely becoming more and more radicalized, but I don't know where you get this white power stuff. Their leader is a minority, and many of their members are black or other minorities.

However, I think deplatforming them has made them more radical. I think we are going to see this more and more: by isolating people and forcing them into places with no public postings, nobody is going to see how radical they get. You are going to lose some people, but the really dangerous ones will wind up reinforced in their belief that they are being persecuted, and if they think they have nowhere to go and nothing to hope for, I think we are going to start seeing some real ugly violence.


That is exactly right. It also gives them an intellectual martyrdom card and a veneer of coolness. If you want the next hip counterculture that all the kids are into to be Nazism, by all means suppress it.

Nazis don’t deserve to be handed the “2 Live Crew effect.” For those who don’t know, that was a mediocre booty-rap group whose record went triple platinum after attempts were made to ban it.

A parent poster raised another great point: the echo chambers that are full of this toxic stuff are themselves not free speech zones. They are heavily moderated safe spaces for specific ideologies and cultures. I am not convinced that this stuff would fare so well on a level intellectual playing field. It’s asinine.

On a final note, it’s important to point out that the original explosion of e.g. QAnon happened not via some decentralized free speech zone or even the chans (where it was always ironic or a prank for most) but on social platforms with algorithmic timelines.

The algorithmic timeline is what has really ruined online discourse and promoted an explosion of insanity. Discourse on these sites is not among equals but is weighted toward whatever maximizes engagement. Trolling, demagoguery, shock mongering, and various forms of “porn” (outrage porn, fear porn) are what maximize engagement.

The algorithmic timeline would be extremely hard to implement in a decentralized system. That’s perhaps a good thing. It might look more like the genuinely flat forum world of the old Internet where discourse was comparatively far more reasonable.


For those interested in the history and process of radicalization, I strongly recommend David Neiwert's book "Alt-America". [1] He's a journalist who spent decades covering the "Patriot" fringe in the US, which often had elements of white supremacy, conspiracy thinking, anti-government paranoia, and other nuttery. It gives him unique depth on how the Internet, for all its benefits, also made it much easier for political extremists to connect and organize.

[1] https://www.amazon.com/dp/1786634236


The only thing worse than Nazis organizing on public platforms is forcing them to congregate only with other Nazis on radical websites where no one will challenge their worldview.

Sorry, don't agree. Forcing them to congregate on unpopular websites is great, because those websites have a terrible reputation and that massively hinders recruitment. Sure, they're echo chambers, but extremists already use private fora for more underground organizing activity, and likewise there are people who specialize in monitoring and infiltrating those websites.

If your argument is that they just need to be talked around with reason and logic, you can always go to one of their websites and give it a try. Sure, deradicalization can work and it's great when it does, but generally the odds are poor and it's extremely labor intensive.


These efforts of course completely miss the point. You cannot radicalize somebody who isn't already predisposed to whatever radical dogma they get sucked into.

Almost all of these domestic terrorists (which is what they are) are white supremacists. Why is that? Why aren't there, say, communist domestic terrorists? Because the US was founded on white supremacist principles and those views have been completely normalized, even today.

Look no further than the most popular presenter on the number one US "news" network, pushing the Great Replacement theory [1] (more [2][3]). This is no longer a fringe theory. It's been completely mainstreamed [4].

Elon Musk's recent statements [5] illustrate just how normalized these views have become:

> “The Democratic party is overly controlled by the unions and the trial lawyers, particularly the class-action lawyers.”

The US has among the lowest rates of union coverage [6], and of course Musk's comments align with a hyper-capitalist view of crushing labor organization and avoiding getting sued (fun fact: the "trial lawyers" criticism is '90s-era Republican dogma relating to tort reform). Yet there's a lot of rhetoric about how "radical" the left has become, which is completely laughable given the history of the DNC and prominent Democrats weeding out and defeating any remotely progressive elements in the party.

The Buffalo shooter was clearly radicalized through 4chan (/pol/ in particular) and Discord. I honestly don't understand the appeal of Discord. The UI/UX is beyond awful and it has no discoverability, but I guess that last issue is actually a feature when it comes to being a vessel for hate groups, unlike all the "glowies" (aka Federal agents) that are so prevalent on 4chan.

But blaming those platforms misses the point completely. The US has never had a reckoning with its racist origins, and this will continue until it does.

[1]: https://www.independent.co.uk/news/world/americas/us-politic...

[2]: https://www.vox.com/23076952/replacement-theory-white-suprem...

[3]: https://www.npr.org/2022/05/17/1099223012/how-the-replacemen...

[4]: https://www.washingtonpost.com/politics/2022/05/09/nearly-ha...

[5]: https://www.cnbc.com/2022/05/18/elon-musk-says-hell-vote-rep...

[6]: https://stats.oecd.org/Index.aspx?DataSetCode=TUD


YES. Recently, many platforms were key in deplatforming ISIS and slowing the rate of radicalization. ISIS was an Internet phenomenon, and when their media was pushed to less prominent outlets, their outreach was weakened.

Ever since the beginning of the Internet, webhosting companies have colluded to deplatform the KKK and a few other notorious white supremacist groups. This has been largely successful in curbing the outreach of the KKK in the United States. It's also why YouTube is getting a lot of criticism for embracing white supremacist groups and facilitating their growing outreach. This is a break from long-running common business practices.


You know, if these people hadn't been pushed off the mainstream platforms, they might have been de-radicalized by regular people before things got worse. Just look at how QAnon festered underground until it was too late to get rid of.

There are mealy-mouthed refrains on all sides of this issue. For instance, I would be happy to see Tom Metzger and his White Aryan Resistance group banned from Facebook; however, the core of his white ethno-nationalist agenda isn't too different from the black ethno-nationalism espoused by Louis Farrakhan and the Nation of Islam. In fact, Tom Metzger has been invited to speak at NOI events and collaborated with their leadership in the past. Banning NOI on Facebook would cause an uproar, but a consistent policy would require it. A lot of organizations who want to ban certain groups, but don't want an entire department adjudicating culture, would like to rely on outside groups for certification. But even once-august groups like the Southern Poverty Law Center can lose the plot, as they did when they recently and perplexingly called liberal Muslim reformer Maajid Nawaz an anti-Muslim extremist (they had to settle a lawsuit with him and issue an apology). [1]

I’m not saying that a company like Facebook shouldn’t take sides at all, but you can certainly see the attraction of a simple open platform doctrine once you get into the weeds.

[1] https://www.theatlantic.com/international/archive/2016/10/ma...


And it's a popular idea in places like HN, too. If you are radical in a way not sanctioned by the MSM, you're guilty of associating with Nazis and "conspiracy theorists" by the mere fact of having been deplatformed like they have been.

So you think people who engage in fringe ideologies will be slowed down by having like-minded people banned and deplatformed from popular social media and chan forums? And not instead just pushed further and further into their ideological bubbles (complete with a new self-fulfilling victim complex) on platforms where they are the only ones there and they get to police their own wrongthink?

Even ISIS seemed to maintain an extensive social media identity despite countless attempts to deny them any platform, including plenty of DDoSing.

I’m sure some level of banning and administration makes sense on content sites (not so sure about DNS/WAF hosts), but I’m curious at what point it becomes “feel-good” slacktivism while these guys just hop onto the next forum.


I'd posit two things:

Who on earth says they need or deserve a "public channel to express themselves"? The world seemed to get by just fine before social media existed, and I have no doubt we'd all be perfectly fine if it were gone tomorrow. If the white supremacists are stuck going back to physically mailing letters around and struggling to organize, that's a net win for society...

Where exactly did you come to the conclusion that they're expressing themselves peacefully? So far we've had the Capitol stormed, and a thwarted attempt at kidnapping and murdering a state governor. Neither of those events is anything resembling "peaceful".


I don't believe the average person is interested in racist ideology. The people it tends to attract are disaffected white males, who tend to have poor education and poor job prospects, i.e. no hope for the future. My guess is that they were already radicalized before being heavily exposed to racist ideology online. They will still be able to cluster online; it just won't be under our watchful eyes anymore. It'll be similar to how pedophiles cluster online.

I hope that you're right and that I'm completely wrong about this.


Completely disagree. It’s echo chambers online that are radicalizing people.

But they're organizing online. That's the thing. When it was just the Jonestown cult or the Waco extremists, it was at least localized. But now they're able to use the Internet to whip up 10k people to assault the Capitol when they don't get their way. That's a real problem.
