Underdetermination is the idea that evidence available to us at a given time may be insufficient to determine what beliefs we should hold in response to it.
Objective facts are so few and far between, and so often rest on assumptions that are vaguely defined, limited, or flawed, that chasing them seems to demand keeping an open mind. These two paradigms appear to be intrinsically linked.
I think what I'm trying to say is that because absolute truth is so hard to pin down, it is imperative that people be able to decide for themselves, from the data, what they believe to be true, even if it is "wrong" (sometimes what is "wrong" turns out to be true in the end).
"When one's core beliefs have no basis in logic or objective reality, it's simple to maintain consistency. All that's needed is the invention of some new concept to explain the discrepancy."
As I've gotten older, I've started to doubt the idea that there are any objective facts at all, or at least if there are, the human brain has a limited capacity to comprehend and communicate them.
(Edit) This doesn't mean I don't believe in truth, right/wrong, etc. It means I'm constantly weighing what's most likely to be trueish, subject to revision when higher-quality information comes along later.
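One way to make "weighing what's most likely to be trueish, subject to revision" precise is Bayesian updating. This is my gloss, not something the parent said; in LaTeX notation:

    P(H \mid E) = \frac{P(E \mid H) \, P(H)}{P(E)}

Each new piece of evidence E moves the posterior P(H|E), and today's posterior becomes tomorrow's prior, so no belief is ever final, just the current best estimate.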
This statement, given the previous comments and the context of the article, betrays a certain depth, a conundrum I find myself facing periodically.
The basic notion being that whatever we believe is true. This is nonsense on its face, but being nonsense doesn't prevent it from being a seemingly fundamental and perplexing aspect of being human.
Gettier cases long ago convinced me that it is useless to talk about 'truth' apart from 'justification'.
Obviously, none of us can ascertain truth directly. The only way we can claim A or B is true is by appealing to some justification for our belief. Therefore, [knowledge = justified true belief] collapses to [knowledge = justified belief].
From that point, we have only to discuss what constitutes better/worse justification(s).
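To spell out the collapse a little more formally (my own notation, offered as a sketch): write B(p) for "the agent believes p", J(p) for "the agent has justification for p", and T(p) for "p is true". The classical analysis is

    K(p) \iff B(p) \land J(p) \land T(p)

but since the only warrant an agent ever has for asserting T(p) is J(p) itself, the truth conjunct does no independent work from the first-person standpoint, and K(p) becomes operationally indistinguishable from B(p) \land J(p).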
I suspect this conflates two semantically distinct kinds of 'belief': 'I hold the value that fairness is more important than freedom' is not equivalent to 'I know certain facts to be true.'
The scientific method is one example of a principled approach to requiring evidence for everything: there is no belief, there are only theories that best explain the observed phenomena. Ideally, these should be shed like socks when a better model comes along.
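For what it's worth, "theories that best explain observed phenomena" has a standard formal gloss in Bayesian model comparison (my addition, not the parent's): given rival theories M1 and M2 and data D, compare the posterior odds

    \frac{P(M_1 \mid D)}{P(M_2 \mid D)} = \frac{P(D \mid M_1)}{P(D \mid M_2)} \cdot \frac{P(M_1)}{P(M_2)}

and when the odds tip decisively toward M2, that's when the sock gets shed.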
I haven't read the original paper, so maybe the example there is better, but the cow example seems to fail the justification condition. A belief is justified if it derives from the evidence, but once we know the evidence is faulty, it can no longer serve as justification, by definition. By extension, any justified true belief can become unjustified by the addition of new information that invalidates the justification on which the alleged knowledge rests.
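What you're describing is what epistemologists call a defeater: justification is non-monotonic, meaning it need not survive additions to the evidence. Sketching it in my own notation:

    J(p \mid E) \;\not\Rightarrow\; J(p \mid E \cup \{d\})

where E is the original evidence and d is a defeater, e.g. learning that what you took for a cow was something else, even if a real cow happens to be in the field and the belief was true by luck.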
I agree with you, but we have to be aware that by this definition two diametrically opposed "knowledges" can be equally valid if they have disjoint justification frameworks.
And that leads to relativism, which we don't want.
I feel that a lot of people have this assumption, unconscious or not, that the world is entirely objective: that everything can be measured, and that what isn't measurable doesn't exist. I think this belief (which essentially elevates the scientific method to a religion) is causing a huge amount of grief in the world.
https://en.wikipedia.org/wiki/Underdetermination
This is closely related to the is-ought gap - now that I know the facts, what am I supposed to do with them?