
This is the commitment and consistency effect.

Extreme cases can be found in cults, where followers cling even more tightly to their beliefs once those beliefs are exposed as false.

A great example is the followers of Harold Camping, the Christian radio broadcaster who predicted the end of the world a few years ago and kept re-predicting it when the end never came.

http://en.wikipedia.org/wiki/Harold_Camping




And your statement is something academics have shown empirically :) (I just find it slightly amusing, no real point.) I think the most famous instance is the story about the cult whose prediction about the end of the world didn't come true; instead of quitting the cult, the members ended up even more convinced and made up all kinds of excuses.

There's an interesting concept I read about: members of doomsday cults tend to believe MORE strongly after the doomsday date passes and the prophecy is shown to be false.

It has something to do with cognitive dissonance.

I wonder if there is a known way to convince someone who is in this state. Cult debunkers and such might have thoughts?


When people's beliefs are refuted, they often double down on them. It's a phenomenon that people have studied.

https://en.wikipedia.org/wiki/When_Prophecy_Fails


It's known as belief perseverance (https://en.wikipedia.org/wiki/Belief_perseverance). A well-studied recent example is how people changed their beliefs after it became known that Iraq did not have weapons of mass destruction as was first suspected: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.586... (PDF)

Exactly what is described here:

"Why do people persist in believing things that just aren't true?"

http://www.newyorker.com/online/blogs/mariakonnikova/2014/05...

HN thread about it: https://news.ycombinator.com/item?id=7769266


There are two types of people: people who believe the first thing they hear and people who believe the last person they hear.

The second kind are more unpredictable.


As someone who has been on both sides of that belief, I can agree that it generally has a stabilizing effect on people's mental state. However, once you lose that belief, it is very difficult to make yourself believe it again, and it is probably even harder for people who never believed in the first place.

What I find interesting (but predictable) is that they persevere in their beliefs even when presented with evidence that those beliefs are unfounded.

"Everything else they believe follows from this first unprovable belief."

Refusing to accept an idea because it seems to conflict with one's personal interpretation of the Christian Bible can be a crippling inflexibility. It (IMO) betrays a kind of inverted hubris when it's not accompanied by a strong admonition not to be too cocksure about that personal interpretation, because the interpretation will evolve as a person changes through growth and experience.

Not being able to do this is the source of a lot of the cognitive dissonance that occurs when a religious authority (operating from a prophet motive, no doubt) makes a confident prediction that winds up not coming to pass. A very recent example is all of the evangelicals who predicted that Trump would be re-elected, even after he provably lost the 2020 Presidential election. Some members of the "prophetic community" apologized online (apologies they then retracted under intense pressure); most of the rest took refuge in "Satan stole the election!", which helped facilitate the Jan 6 insurrection.


Research into belief perseverance shows that even when people are presented with overwhelming evidence contradicting one of their beliefs, they dig in their heels and stick to the belief anyway. [1]

There's also Frank Luntz's observation that many people form opinions based solely on the emotive content of the words they're presented with (Russell conjugation [2]), regardless of the facts. [3]

[1]: https://en.wikipedia.org/wiki/Belief_perseverance

[2]: https://en.wikipedia.org/wiki/Emotive_conjugation

[3]: https://en.wikipedia.org/wiki/Frank_Luntz#Use_of_language


Or simply that they expect their views to be poorly received while still believing them to be true.

Hasn't it been shown that facts don't change people's minds, but (generally) make them hold tighter to their views?

> Belief perseverance (also known as conceptual conservatism[1]) is maintaining a belief despite new information that firmly contradicts it.[2] Such beliefs may even be strengthened when others attempt to present evidence debunking them, a phenomenon known as the backfire effect (compare boomerang effect).[3] For example, in a 2014 article in The Atlantic, journalist Cari Romm describes a study involving vaccination hesitancy. In the study, the subjects expressed their concerns of the side effects of flu shots. After being told that the vaccination was completely safe, they became even less eager to accept them. This new knowledge pushed them to distrust the vaccine even more, reinforcing the idea that they already had before.[4][5]

* https://en.wikipedia.org/wiki/Belief_perseverance

* https://blogs.lse.ac.uk/impactofsocialsciences/2023/01/24/fa...

* https://www.discovermagazine.com/mind/why-is-it-that-even-pr...

* https://archive.is/BicuE ; https://www.newyorker.com/magazine/2017/02/27/why-facts-dont...

Perhaps facts are more useful for those who are still sitting on the fence?


I think in general people are wildly inconsistent in their beliefs and behaviors.

Most people believe crazy things, and even those who don't tend not to behave in consistent alignment with what they believe to be true or good.


Huh, very interesting. Quote from the twitter thread linked in a sibling comment:

> When preferences for beliefs are widespread, this constraint gives rise to rationalization markets in which agents compete to produce justifications of widely desired beliefs in exchange for money and social rewards such as attention and status.

As an example: when people would really like to believe in an afterlife due to fear of death but cannot find evidence for such an afterlife themselves, people and organisations will pop up to cater to this belief in various ways and the most "believable" offer will win out in the long term. This seems obviously true, though I had never considered it in these terms before. I wonder what the other obvious occurrences are (climate change beliefs and Ukrainian war propaganda seem like obvious examples) and/or if there are any direct applications of this principle other than becoming a better propagandist.


People will believe anything they fear is true.

There's a phrase for this: it's known as having strong beliefs, weakly held.

Most people hold wildly inconsistent views in their heads all the time, despite obvious contradictions - it's the default human behavior.

https://www.lesswrong.com/posts/CqyJzDZWvGhhFJ7dY/belief-in-...


In light of new evidence, some people actually alter their beliefs.

There is a certain kind of person who is stuck in their beliefs and will do everything they can to keep their belief system in one piece. They will raise the bar for evidence that contradicts their belief until no such evidence remains, and lower the bar for evidence that confirms it. They will dismiss the personal experiences of others as figments of the imagination (because 'anecdote'), and they will claim to be doing science while ignoring the fact that science is, by definition, pretty messy until it has settled.
