
The author is correct that in any debate, the prominent skeptics are often extremely well informed (Joe Rogan on vaccines, in his example).

I think what the author is missing is that Joe Rogan and the like are not spreading misinformation, but spreading tools for misinformation.

My mother-in-law is very deep in misinformation. I’m not talking simple anti-vax, I’m talking “Trump will raid the Vatican for stolen gold so we can be a free country for the first time since the 19th century” misinformation, QAnon-tier stuff.

Once, when I was still trying to convince her to get a vaccine, I told her that after Israel’s early vaccine efforts, both cases and deaths were down heavily. Her response was to pull up a screenshot of a Fox headline from a YouTube video that simply stated something like “cases up 60% after vaccination in Israel”.

I don’t know the context of the headline. I’m sure that in whatever context, it was accurate and not misinformation. But the actual spreaders of misinformation use skeptical talking points as weapons to spread misinformation, like the claim that the vaccine is a weapon to kill off the population for the Great Reset.

Anyway, I don’t think it’s wrong to be a skeptic. But skeptic talking points being mass-weaponized on social media is absolutely new and uniquely enabled by modern technology.




I am genuinely surprised at the comments in this thread. This article starts with "That said, I do not think there is much evidence that misinformation has become more widespread, that this increase in misinformation is due to technological change, or that it is at the root of the political trends liberals are most angry about", which immediately makes this a partisan issue.

Secondly: "The internet makes me better informed". This is his perspective. Many people are not actively looking for information outside of their own biases. Thirdly: "Cranks often know a lot"... Should be titled "Cranks often think they know a lot". Just because someone can state a fact does not mean they are sufficiently educated to analyze that fact. Saying that there are "vaccine side-effects" (stated in the article) does not mean that Joe Rogan is able to weight the information in totality with other information from the studies. After this, the article got into a bunch of "whataboutism" and doesn't really compel me to believe that misinformation is not a problem. I agree with polarization getting worse, but could that be a symptom/signal of misinformation, is there a cause/effect relationship?

My thoughts:

There are multiple categories of false information spreading on the internet right now. I honestly believe that there is manipulation of the public happening on all sides of this problem, the thing that is still unclear is whether there is a coordinated source of this manipulation or if it is organic.

I disagree pretty strongly with the "misinformation problem" being misinformation. Anecdotally (yes, I know it's not scientific, but it's the same kind of evidence the author is giving), the amount of demonstrably wrong material shared on social media has increased over the past couple of years. There are people who truly believe that Trump was going to rally the military on January 20th or whatever date to "take back the White House". This got shared to numerous social media sites and reposted with claims that the "true president" was coming back. That is one explicit example, but similar things have been popping up for the past couple of years and spreading like wildfire. 5G and vaccine conspiracies, QAnon, flat earth, and many other fringe movements exist and ARE misinformation; the question is how broad these movements are.

If you claim that misinformation is just "information that is partisan" or "wrongthink", I would urge you to dig further into some real, damaging examples. I think there are times when the term is weaponized, but there has been a serious increase in how easy it is to spread words in this day and age. The internet, social media upvotes and shares, and other technological tooling have made the problem worse by letting people feel validated by the popularity and "likeability" of a claim rather than by whether it is grounded in fact.

Real people are dying because of misinformation perpetuated on social media and "news" outlets. Yes, I understand that "individualism" should let them decide whether to make that choice, but there is a public health issue here when people are basing that choice on Facebook memes and shared content that has no basis and gets 100k shares because it sounds cool.


> person-to-person spread of misinformation

You know, for as much as I've heard about the supposed danger of "misinformation", so many factual observations were labeled "misinformation" early on, only to be confirmed later by the WHO, a major national authority, or a medical journal.

The type of misinformation matters a lot, because if people take the concerns about permanent lung damage, reproductive damage, and the extreme contagiousness of the virus seriously, it doesn't matter if they were "misinformed" to some extent. If the misinformation encourages them not to take things seriously, or to breach social distancing (see Rudy Gobert, who should hope he hasn't committed homicide for a laugh), that's when it is most dangerous.


"Spreading misinformation."

> What part of this is untrue?

The characterization of pandemic conspiracies and political disinformation as harmful.

It's like kids and allergies. You never hear about kids who grew up on farms being allergic to animals - it's always those whose parents didn't have any when they were growing up.

If people aren't subject to misinformation, they'll never develop the sense of who's lying and who isn't.

It used to be that we gave common-sense advice - "don't believe everything you read on the Internet". Now, it's the other way around - "we must cleanse the Internet of harmful content".

Being exposed to misinformation is good for you, and it's good for democracy.


I don't quite get where the author is coming from.

> And on the basics of civic life, that doesn’t seem to be the case. A survey from the Annenberg Public Policy Center found that in 2006, only 33 percent of people could correctly identify the three branches of government. By 2021, that was up to 56 percent. That’s way higher than 33 percent!

Is that really what anyone means about misinformation? I don't think so...

As for, say, the Joe Rogan examples: are knowing factoids that someone thinks support their other opinions really knowledge, or are they things they just like to gather to support public statements?

Often I get those factoids tossed at me on the internet, along with links to pages that someone thinks support their opinions but that, three paragraphs in, clearly don't.

That's not being knowledgeable...

In fact, celebrating those little factoids seems right up the alley of misinformation, where someone announces "5 men were discovered to be impotent after getting the vaccine!" That might be true, but the intent of the misinformation is obviously to push something else.


> My point is that "misinformation" should always come attached with an "according to X". Some Xs are more reliable than others.

The value of specifying epistemic sources is as important when putting forward information as it is when labeling misinformation. Yet most people are generally pretty lax about including and verifying epistemic justifications (especially for claims they are inclined to agree with).

It is true that the act of labeling things as misinformation can itself be a form of misinformation.

The issue, as I see it, is that we have accepted the practice of deliberate misinformation as long as it is "for a good cause". We don't place as much value on people trying to accurately convey information and nuance as we place on the function that information is serving.

I see this everywhere: in the statements about masks that were made early in the pandemic to make sure healthcare workers had supplies. You see this in narratives about the last election. You see this in a lot of the reporting about Russiagate. There are little to no consequences or reprobation from people who agree with the end goals.

The incentives are all wrong and I don't know how they get fixed, but the problem is much bigger than just "Big Tech", "Social Media", or a bunch of stupid people on the other side of the partisan divide.


Definitely. This article isn't about "disinformation and dissent", or even about why "the war on disinformation" is bad. Instead, the author seems to be waging his own war on misinformation -- it's just that the misinformation comes from everyone else.

This is emblematic of the "conservative" response to the "liberal" response to misinformation. As an example, Trump was clearly the most visible (and therefore most impactful) source of COVID disinformation in the US. The liberal response was to say that this was dangerous because people would not take the virus seriously. However, the author instead blames Fauci and the CDC for our poor response. Apparently, revising one's statements based on new information counts as lying.

If part of this article is to point out why we cannot trust our institutions (and hence why we get misinformation), why not point out the most impactful sources of misinformation? The author says the following:

> misinformation and disinformation naturally abound when there is very little trust in sense-making institutions

More importantly, disinformation naturally abounds when a sense-making institution is itself spewing disinformation. This, however, does not seem to be important to the author.

The clearest example of why the author's examples do not support his thesis is his take on Fauci, the CDC, and the WHO. If we are to solve our issues with public discourse and better seek the truth, we must be good at changing our minds when new evidence comes to light. The author instead seems to claim that public officials revising their guidelines counts as lying and misinformation. Instead, we could consider this the paragon of truth-seeking: we revise our beliefs when the evidence indicates we should.

Though I think this article almost entirely misses the mark, and that the author is severely misguided in his thesis, there are still nuggets I agree with:

> Yet rather than take responsibility for poorly-informed policy decisions, the common refrain from public institutions has been to blame everyday Americans.

Absolutely agree with this. Everyday Americans can hardly be blamed for this situation, IMO. The blame needs to go to the source of the disinformation and the source of inconsistent policies. I'll certainly agree that our public health officials should have done a better job with communication, even with the heavy amount of disinformation from the president.

> When the only acceptable information is that approved by the ruling administration, there can be no meaningful check on state power.

I absolutely agree on this too. Clearly, government censorship is not a good idea. A good example is the personality cult around Trump: many of his supporters do not care what he does. Anything approved by Trump is accepted as fact, and if this group of people had more power, it would become very difficult to check. We absolutely should be very wary of anything which gives the government control over information.

And finally, I'll agree with @azinman2 from the parent comment:

> We need to find a way to fundamentally change the incentives in media

So true. The incentive of media (both social media and news media) is focused wholly on engagement and driving clicks for advertisers. This is fundamentally opposed to the search for truth. It only serves to increase the amount of incendiary media, which increases polarization, etc. It is a terrible situation, and I think we often underestimate the consequences. I'll happily agree that media should not be trusted, simply because the incentive to report truth does not exist (or at least is trumped by the incentive to make money).


> So in your argument, even smart, well-intentioned people who disagree with these public policies are spreading misinformation? You are asserting that everybody who disagrees is wrong and harmful to society, correct?

Whereas you're asserting that it's impossible for smart people to spread disinformation?


I think OP is overestimating the critical thinking skills of the population. That is why misinformation is so effective.

It’s the pervasive, high-handed tone adopted by propagandists posing as journalists to manipulate readers. We see this propaganda language now, usually signaled with phrases like “claims without evidence,” “misinformation,” “lies”... Then the “journalist” provides no basis for their assertions, instead relying upon their status as journalists, when in fact they are propagandists usurping the role of journalists. Generally, logical fallacies like ad hominem or others follow to craft some desired narrative, never based on critical thinking. This article does just that. “Misinformation”? Says who?

These people are fighting misinformation with misinformation.

"For example, the paper says that tech companies have “fumbled their way through the ongoing coronavirus pandemic, unable to stem the ‘infodemic’ of misinformation” that has hindered widespread acceptance of masks and vaccines"

The numbers on masks and vaccines show that acceptance is widespread. What are we looking for here, approval ratings to rival Kim Jong Un's? The worst thing that could happen to what little is left of social cohesion is an even stricter attempt at controlling information.

Sure, some information shared on social media is misinformation. Some information coming from mainstream media is misinformation, too. In some cases, the authorities will spread misinformation. After all, some of the "evidence" used to argue that masks are ineffective came from the CDC itself.

With the politicization of everything, even facts cannot be considered neutral anymore. For every verifiable fact, there's a set of other verifiable facts that may be omitted to achieve the desired effect. Fact-checks are used for propaganda. When consuming information, always keep your salt dispenser at hand.


> "misinformation" is indistinguishable from "thinks I think are false" which in turn is hard to distinguish from "things I disagree with".

Only if you believe the post-modern claim that there's no objectively verifiable truth and that all narratives are equally valid.

There is a huge difference between "vaccines contain microchips" and "water retains the essence of homeopathic ingredients" on the one hand and "COVID spreads through droplets" on the other.

EDIT: I do agree billionaires shouldn't be the final arbiter, though.


> Misinformation is information that is known to be false by the spreader.

That's wrong. Misinformation is often spread unwittingly; information known to be false by the spreader is disinformation.


Sure, but as someone who agrees with you, I’m not sure we can call that ‘misinformation’. It’s just poor understanding and bad arguments.

I have seen a little misinformation relating to this subject - direct lies, fake news, fake links, etc. - but a negligible amount.


What's with a news source promoting misinformation?

That seems like a dangerous generalization that someone could use to hand-wave away any actual misinformation, which, as the blog mentions, does exist and is spread (albeit by a small percentage of very prolific posters).

Misinformation exists, and it is a problem. At this point we've seen some evidence that state actors put some of it out there during the 2016/2020 elections, and it can be spread organically.

That said, it's not nearly as widespread or problematic as it's made to be by much of popular media.


'misinformation'

> misinformation is itself an assault on free speech.

Misinformation is a type of free speech. Whether purposeful (medical experts lying about whether we should wear masks) or accidental (weather forecasting), misinformation is merely a facet of freedom, to use wisely or not. Some instances are made illegal (e.g., in a stock exchange), whereas most are seen as harmless (Santa Claus) or used to prevent greater harm ("this shot will only hurt a little, Louise"). What's healthy is to reserve a bit of skepticism toward the information you get until you can confirm whether it is truthful.

