
Jesus, how can anyone spout the term “dangerous misinformation” with a straight face?

These are adults you're talking about - either you believe they have the capacity to validate what they hear on their own or you don't.

Humans are messy, someone will hear idea A (good) and turn it into idea B (bad). Some will be told the right thing and choose not to do it.

We don’t suddenly treat all adults like children (which is what censoring Rogan would be) just because a few decide to make stupid decisions.




People have always been that way. Before Rogan showed up, before the internet was even invented, there were "nutters" with deeply misinformed perspectives which they followed, sometimes, to death. I'm sure someone will argue scale is the differentiator, but I'd argue that the proportion of people who ignore facts versus those who consider them is probably unchanged. (Although admittedly, I have no statistics.)

Also I think a big part of the problem is the aggressive labeling of content as "misinformation" or "fake news." To me, misinformation implies propaganda issued and promoted by an enemy entity. But today, it is a term that is used to mean anything that has a fact (whether verifiably correct or incorrect) that implies a conclusion that is generally unacceptable.

For example, if the generally acceptable premise is: "everyone who is able should get a vaccination," then publicly talking to someone harmed by a vaccination (even if it's true) would be considered misinformation, because it potentially concludes something opposing the acceptable premise.

If we can't openly share ideas, good, bad, informed, misinformed, then the 99% (fake number) of us who aren't "nutters" that follow bad advice to extreme conclusions, will be denied the volume of data, perspectives, and opinions we need to make a truly informed decision.


Black-or-white thinking like this is false and damaging. The world has nuance, and the idea that “[people are] independent enough to make their own decisions. Period” rejects that nuance and substitutes a simplified and idealistic model of human behaviour which is alluring but does not reflect how human brains actually work.

It is not patronising to say that people are not all created with the same set of skills, beliefs, and values, and that some will engage with obvious bullshit. Many are not capable of, or interested in, engaging with complex topics in a way that does not simply reinforce their pre-existing opinions. I have discussed this elsewhere[0].

When a group of people are motivated to exploit the weaknesses of others in order to get them to do things that are damaging to our democratic institutions, and they use misinformation to do it, it is unpleasant but not unreasonable to me to suggest that spreading lies through misinformation is as serious as suppressing truth through censorship and that they should both be treated equally seriously.

[0] https://news.ycombinator.com/item?id=25359003


> You should be able to read "misinformation" and easily discern it as such.

I'm afraid this is equivalent to saying "You should be able to be offered high-sugar food, and avoid it to an appropriate degree." This may be true in the moral sense, but practically speaking most people will fail this test.

To be clear, I'm not suggesting the answer is censorship, but just pointing out that large numbers of people cannot meaningfully deal with misinformation.


This is dangerous misinformation.

Blatant and dangerous misinformation? You're gonna have to back those bold claims up, otherwise what's stopping us from claiming that you are the culprit and should be gulag'd?

You have no idea what you're talking about, which is what's so fucked up about this whole situation. This is all a dystopian cliche. Opinions are not misinformation, and looking at opposing evidence and arguments is not misinformation. It never was, and never will be. The only people who ever claimed such things are historically disgraced totalitarians.

Misinformation comes from states and organizations with ulterior motives who twist and hide the truth for their own gain. It's about power.

Crying out about how "dangerous" it is to have a real discussion that you don't like is ridiculously absurd. Are you saying adults are too stupid to think for themselves, or should not be allowed to take calculated risks? Do you have some kind of proof that having discussions is bad for individuals and/or communities in some way?


The problem is not misinformation. It’s whoever gets to decide what is “misinformation” and what is not.

Are you arguing that the people who decide what's misinformation and what isn't are somehow immune to that kind of hack on their reasoning faculties?

No one gets to be the decision maker on what is misinformation and what isn't. Nobody is the exception to that rule. No matter how good faith and good their intentions are. It's not how you're doing it, it's that you're trying to do it at all that is the problem.

The author is correct that in any debate, the prominent skeptics are often extremely well informed (Joe Rogan on vaccines, in his example).

I think what the author is missing is that Joe Rogan and the likes are not spreading misinformation, but spreading tools for misinformation.

My mother in law is very deep in misinformation. I’m not talking simple anti vax, I’m talking “trump will raid the Vatican for stolen gold so we can be a free country for the first time since the 19th century” misinformation, like QAnon tier stuff.

Once, when I was still trying to convince her to get a vaccine, I told her that after Israel's early vaccine efforts, both cases and deaths were down heavily. Her response was to pull up a screenshot of a Fox headline from a YouTube video that simply stated something like "cases up 60% after vaccination in Israel".

I don’t know the context of the headline. I’m sure in whatever context, it was accurate and not misinformation. But the actual spreaders of misinformation use skeptical talking points as weapons to spread misinformation, like the vaccine being a weapon to kill off the population for the great reset.

Anyways I don’t think it’s wrong to be a skeptic. But skeptic talking points being mass weaponized on social media is absolutely new and uniquely enabled by modern technology.


> With all that, why don’t we simply remove borderline content? Misinformation tends to shift and evolve rapidly, and unlike areas like terrorism or child safety, often lacks a clear consensus. Also, misinformation can vary depending on personal perspective and background.

This is a weird way to say "ideas that we suppress for being misinformation are sometimes actually true."


I am genuinely surprised at the comments in this thread. This article starts "That said, I do not think there is much evidence that misinformation has become more widespread, that this increase in misinformation is due to technological change, or that it is at the root of the political trends liberals are most angry about", which is immediately making this a partisan issue.

Secondly: "The internet makes me better informed". This is his perspective. Many people are not actively looking for information outside of their own biases. Thirdly: "Cranks often know a lot"... should be titled "Cranks often think they know a lot". Just because someone can state a fact does not mean they are sufficiently educated to analyze that fact. Saying that there are "vaccine side-effects" (stated in the article) does not mean that Joe Rogan is able to weigh the information in totality with other information from the studies. After this, the article gets into a bunch of whataboutism and doesn't really compel me to believe that misinformation is not a problem. I agree that polarization is getting worse, but could that be a symptom/signal of misinformation? Is there a cause/effect relationship?

My thoughts:

There are multiple categories of false information spreading on the internet right now. I honestly believe that there is manipulation of the public happening on all sides of this problem, the thing that is still unclear is whether there is a coordinated source of this manipulation or if it is organic.

I disagree pretty strongly with the "misinformation problem" being misinformation. Anecdotally (yes, I know it's not scientific, but it is the same kind of information the author is giving) the amount of content shared on social media that is demonstrably wrong has increased over the past couple of years. There are people who truly believe that Trump was going to rally the military on January 20th or whatever date to "take back the White House". This got shared to numerous social media sites, and it was reposted that the "true president" was coming back. This is one explicit example, but similar things have been popping up for the past couple of years and spreading like wildfire. 5G vaccines, QAnon, flat earth, and many other fringe movements exist and ARE misinformation; the question is how broad these movements are.

If you claim that misinformation is just "information that is partisan" or "wrongthink", I would urge you to dig further into some real, damaging examples. I think there are times when the term is weaponized, but there has been a serious increase in how easy it is to spread words in this day and age. The internet, social media upvotes and shares, and other technological tooling have made the problem worse by letting people feel validated by opinions and the "likeableness" of something rather than by facts.

Real people are dying because of perpetuated information on social media and "news" outlets. Yes I understand that "individualism" should let them decide whether to make that choice, but there is a public health issue here when people are basing that choice off of Facebook memes and shared content that has no basis and gets 100k shares because it sounds cool.


Both obvious nonsense and nuanced misinformation can be harmful, they are just targeting different audiences. Not sure why one has to pick and choose, you can report any content you deem harmful.

And the big thing is dangerous misinformation and misconceptions get passed down without professional instruction - you see it all the time.

Even when people are giving their opinions to their best knowledge they can be spreading misinformation.

"What if somebody honestly posts compelling, non-obvious misinformation? How confident are you that would be corrected?"

My hopeful attitude is that none of us are so fragile and gratuitously impressionable that this would be dangerous.

My further hope is that anyone that might be so impacted would incorporate the digestion and (eventual) repudiation of this information as part of their intellectual maturation process.

My final hope is that we all learn how dangerous and stifling it is to hand over defining and legitimising "truth" to others. You need to learn and grow as an intellectual being, and that doesn't happen in a hothouse protected from all perturbations.


This is upside down.

First - there are tons of examples of 'misinformation' in the PDF. It goes into excessive detail. It's an amazing and thorough bit of research actually. If you want a link, here it is [1]

1) The situation is fluid: facts change, which requires public health officials to change their posture. That can give the impression of 'bad information', but it really isn't, for anyone paying close attention.

2) People are not censored, they're just told to not talk about things we don't yet understand on massive public forums until there's data. We definitely want YouTube and FB to get rid of bogus information related to life and death issues.

It's understandable that people in the commons don't understand this, but it's not acceptable that intelligent rational people can't grasp the nuance in the situation: 1) facts and situations change and 2) there are a lot of liars.

Those are different forms of noise.

A good example of this is the recent Brett Weinstein / Ivermectin situation.

Brett Weinstein went on an 'emergency Joe Rogan' to make a bunch of BS claims about Ivermectin, and tens of millions of people ate it up.

But it turns out Brett Weinstein was promulgating a Giant Lie. The foundational study turned out to be fraudulent. Ergo his claims are total BS; he has no basis for them. [2]

So Brett Weinstein - a generally well meaning and smart guy - is not 'helping'; he's 'hurting', by knowingly amplifying early, suspect, and not yet validated results.

'The Government Is Censoring Me' is the Hustler's Lie.

If the government is forcibly stopping scientists from releasing data, or forcibly stopping people from speaking publicly - that's a problem.

If YouTube and FB are dropping your videos because you're making unverified claims about important medical information - well, that's literally the kind of censorship we want.

We don't want Quack Medical advice telling people to drink lighter fluid, because people will do it.

Brett Weinstein's new public identity (i.e. his source of income) is playing a kind of 'educated conspiracy theorist' - having enough knowledge to parse the situation and yet still misrepresent the facts.

It turns out Ivermectin is most likely bogus, and while the 'real' trials going on at Oxford may possibly show some value, as of today we don't have any supporting data for Brett's claims - making him, Rogan, and even Lex guilty of garbage misinformation tabloid populism.

It doesn't matter what the 'end result' of the trials are, the fact is the information is very grey (and not looking good) and it's too early to be presented as a cure.

If they were responsible agents of truth, they'd have spent most of the time in the introduction talking about:

'How They Are Not Medical Experts And Have Zero Medical Qualifications',

how 'The Trials Were Limited',

that 'There Are Suspicions',

and 'We Still Know Very Little',

and because of that - 'We Can't Conclude Much At This Point'.

That is the responsible scientific and newsworthy take.

As such, frankly, it doesn't constitute the need for a 'podcast' because there's nothing to say.

Will Joe Rogan, Lex or Weinstein issue an apology/correction for the misleading hyperbole? Probably not. But they're not 'journalists' you say, they're under no obligation? Well then they don't have any credibility to speak on these important issues.

(FYI I actually like all three of those guys, but they've crossed a line here)

The mask mandate is yet another good example:

Dr. Fauci and his peers around the world had the unenviable task of getting an arrogant and cantankerous global public to change some behaviours.

The 'mask' issue was managed responsibly: masks were needed for the medical community. If 300M Americans had made a 'mad dash' for masks, the demand would have swamped Healthcare's ability to get PPE. While masks are at best marginally useful for the public, they are essential in Healthcare.

Once production ramped up, the marginal effectiveness of masks makes them useful as long as public consumption doesn't interfere with Healthcare supply. So then the public communication becomes: Wear Masks.

From early CNN posts last Spring, you can see that the 'You Don't Need Masks' signals are always accompanied with:

"Dr. Maria Van Kerkhove, an infectious disease epidemiologist with the WHO, also said at Monday's briefing that it is important "we prioritize the use of masks for those who need it most," which would be frontline health care workers." [3]

They are telling us there is short supply and that we need masks for Healthcare, which is very rational.

We just went through the biggest economic, public health and global emergency of the century, there will be chaos. Public Health officials use basic Public Communications tactics to get people to do very basic things which create better outcomes for everyone. Shutting down liars and telling people to not broadcast unsubstantiated information to their 300 Million followers (or at very least do it in a very responsible way) is what we have to do.

[1] https://www.counterhate.com/disinformationdozen

[2] https://www.theguardian.com/science/2021/jul/16/huge-study-s...

[3] https://www.cnn.com/2020/03/30/world/coronavirus-who-masks-r...


Deciding what constitutes "misinformation" is highly subjective and can lead to censorship of diverse opinions. A government empowered to regulate speech based on content risks suppressing valid dissenting voices. While misinformation can be dangerous, attempting to police speech can lead to unintended consequences and threaten democratic principles. Instead, a better approach is to provide accurate information, promote critical thinking, and foster open debate. Maintaining individual freedom for everyone across the country is more important than the lives of a few people who choose to follow misinformation.

I think I disagree with your underlying assumption that as long as you, personally, are critically evaluating sources and are 'safe', it doesn't matter whether other people end up in the misinformation spiral.

I can be as critical as I want, if the people around me who vote, have influence on my environment, etc end up in misinformation, it has a huge impact on my life.


Where do you draw the line between someone on the internet "being wrong", and someone on the internet spreading dangerous misinformation? Sure, walking away from the first is often a good course of action, but what about the second?
