And your statement is something academics have shown empirically :) (I just find that slightly amusing, no real point.) I think the most famous instance was the story of the cult whose prediction about the end of the world didn't come true; the members ended up even more convinced, and made up all kinds of excuses instead of quitting the cult.
There's an interesting concept I read about: members of doomsday cults tend to believe MORE strongly after the doomsday date passes and the prophecy is shown to be false.
It's something to do with cognitive dissonance.
I wonder if there is a known way to convince someone who is in this state. Cult debunkers and such might have thoughts?
As someone who has been on both sides of that belief, I can agree that it generally has a stabilizing effect on people's mental state. However, once you lose that belief, it is very difficult to make yourself believe it again. And probably even harder for people that never believed in the first place.
"Everything else they believe follows from this first unprovable belief."
Refusing to accept an idea because it seems to conflict with one's personal interpretation of the Christian Bible can be a crippling inflexibility. It also (IMO) betrays a kind of inverted hubris when it isn't accompanied by a strong admonition not to be too cocksure about that personal interpretation, which will evolve as a person changes through growth and experience.
Not being able to do this is the source of a lot of the cognitive dissonance that happens when a religious authority (operating from a prophet motive, no doubt) makes a confident prediction that winds up not coming to pass. A recent example is all the evangelicals who predicted that Trump would be re-elected, even after he provably lost the 2020 Presidential election. Some of the "prophetic community" apologized online (apologies they later retracted under intense pressure); most of the rest took refuge in "Satan stole the election!", which helped facilitate the Jan 6 insurrection.
Research into belief perseverance shows that even when people are shown overwhelming evidence contradicting one of their beliefs, they dig in their heels and stick to the belief anyway. [1]
There's also Frank Luntz's observation that many people form opinions based solely on the emotive content of the words they're presented with (Russell conjugation [2]), regardless of the facts. [3]
Hasn't it been shown that facts don't change people's minds, but (generally) make them hold tighter to their views?
> Belief perseverance (also known as conceptual conservatism[1]) is maintaining a belief despite new information that firmly contradicts it.[2] Such beliefs may even be strengthened when others attempt to present evidence debunking them, a phenomenon known as the backfire effect (compare boomerang effect).[3] For example, in a 2014 article in The Atlantic, journalist Cari Romm describes a study involving vaccination hesitancy. In the study, the subjects expressed their concerns of the side effects of flu shots. After being told that the vaccination was completely safe, they became even less eager to accept them. This new knowledge pushed them to distrust the vaccine even more, reinforcing the idea that they already had before.[4][5]
Huh, very interesting. Quote from the Twitter thread linked in a sibling comment:
> When preferences for beliefs are widespread, this constraint gives rise to rationalization markets in which agents compete to produce justifications of widely desired beliefs in exchange for money and social rewards such as attention and status.
As an example: when people would really like to believe in an afterlife out of fear of death but cannot find evidence for such an afterlife themselves, people and organisations will pop up to cater to this belief in various ways, and the most "believable" offer will win out in the long term. This seems obviously true, though I had never considered it in these terms before. I wonder what other occurrences there are (climate change beliefs and Ukrainian war propaganda seem like obvious examples), and whether there are any direct applications of this principle other than becoming a better propagandist.
There is a certain kind of person who is stuck in their beliefs and will do everything they can to ensure that their belief system remains in one piece. They will raise the bar for evidence that contradicts their beliefs until no such evidence remains, and lower the bar for evidence that confirms them. They will dismiss other people's personal experiences as figments of the imagination (because 'anecdote'), and will claim to be doing science while ignoring the fact that science is by definition pretty messy until it has settled.
The extreme cases can be found in cults where followers cling tighter to their beliefs once exposed.
A great example is the followers of Harold Camping, the Christian radio broadcaster who predicted the end of the world a few years ago, and kept re-predicting it when it never came.
http://en.wikipedia.org/wiki/Harold_Camping