Certainty Bias Part 1: Overconfidence
Image: Pexels
“Once [Chief Inspector Morse] got an idea stuck firmly in his brain, something cataclysmic was needed to dislodge it…. He wondered, as he often wondered, whether he had made the right decision. And once more, he told himself, he had.”
-- Colin Dexter, Inspector Morse series, Last Seen Wearing (1976)
Certainty bias
Certainty bias refers to the extra weight we give to 100%-sure outcomes over gambles that carry some risk but offer a higher payoff: offered an 85% chance of winning $100 (with a 15% risk of winning nothing) versus the certainty of a $35 prize, most will take the $35.
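A quick expected-value check using the figures above (the arithmetic here is mine, added for illustration) makes the imbalance concrete:
0.85 × $100 + 0.15 × $0 = $85 expected from the gamble, versus the guaranteed $35.
On average the gamble is worth more than twice the sure prize, and yet the sure thing wins.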
We would rather be sure than become richer. Related is confirmation bias: the tendency to read every new piece of information as support for preexisting beliefs, filling the need for validation. The illusion of control leads us to overestimate our abilities, underestimate obstacles, time, and expense, and misjudge how well we will regulate our own motivation and goal-directed energy.
Certainty bias: the origin of overconfidence
The Dunning-Kruger effect comes out of a study of how people judge their own ability, with bias appearing at both ends of the competence scale. Individuals with little knowledge of a domain nevertheless far overestimate their competence in that field, rating their abilities higher than more skilled peers do, while those more skilled peers actually underestimate their own. The finding: limited knowledge also impairs self-knowledge, or metacognition, especially awareness of one’s limitations. The spiral heads downward as ignorance leads to further miscalculation of ability (“Unskilled and Unaware of It,” 1999).
This need for a positive self-assessment of one’s competence is therefore self-reinforcing, propping up a sense of certain knowledge that has no real basis. It is perhaps the classic case of overconfidence from laboratory studies, and it sets up subjects as the unreliable narrators of their own characters.
At the far opposite end of this scale of self-deception is the work of Carol Dweck in her book Mindset (2006). Her model proposes a way to reframe the anxiety of uncertainty by treating new experiences as opportunities for learning and skill development. Shifting the focus “from proving to improving,” this positive framework allows people to embrace the unknown without fear, reassured by a life plan of constant improvement. The quality of their performance becomes a development project, not a goal of 100% perfection.
In a world of breakneck change, tolerance of uncertainty, if not outright embrace of it, has become a new rule of mental health. The mind has to be retrained away from prizing stability and toward distinguishing what can be controlled or influenced from what cannot. Immediate closure isn’t possible or even desirable. Openness takes on increased value.
Neurologist Robert Burton, MD, has this to say in his book On
Being Certain: Believing You Are Right Even When You’re Not (2008):
I have set out to provide a
scientific basis for challenging our belief in certainty. …Despite how
certainty feels, it is neither a conscious choice, nor even a thought
process. Certainty and similar states of
“knowing what we know” arise out of involuntary brain mechanisms that, like
love or anger, function independently of reason.
Consider the problem of establishing the truth value of a
memory. What tells you that a memory is
real, that it reflects lived experience, reliably recorded and stored? Memory is notoriously unreliable as a truth
document, because each time we access it, we change it in some way that we
can’t later remember; this explains why eyewitness testimony is so slippery, and why the unchanging written record has enduring value. A vivid memory has no reliable truth value. Likewise, certainty is a feeling, arising from the emotional side of the think/feel dualism. It is based on our sense of being right, not on rational proof of rightness.
Writing may be the preeminent, and most arduous, test of certainty. So many decisions live in every passage: each word choice, and the capture of the idea itself. As Oscar Wilde described the task, “I have spent most of the day putting in a comma and the rest of the day taking it out.” So how does the writer know when what he is working on is “done”? There are infinite ways to express any single thought, so how does any piece of writing progress from draft to rearrangement of words to final copy? No green light appears when the writing task is “complete.” Nothing but certainty bias lets the writer be sure that what has been written is “right.”
There must be a sense of “this is the final draft” that emerges
once the phrase or paragraph is as good as can be expected given the time limit
and the writing’s purpose. This is why I like to say that the most important piece of writing a college-bound student will ever produce (and under a 45-minute clock) is the essay on a College Board Advanced Placement exam.
Another case of being sure:
A study led by Dr. Bruce Moseley featured sham knee surgery, in which osteoarthritis patients with real complaints went through a simulated “operation” rather than the actual procedure. Afterwards these placebo patients nevertheless reported successful outcomes, saying they were recovering well and that their complaints had resolved. Even after the real situation was revealed to them, patients insisted the cure had been successfully performed: cognitive dissonance in action.
Cognitive dissonance and memory distortion
A striking example of the unreliability of memory and the
power of cognitive dissonance comes from an experiment conducted by
psychologist Ulric Neisser following the 1986 Challenger shuttle disaster. The
day after the accident, Neisser asked over 100 students to document where they
were and what they were doing when they first learned of the tragic event. More
than two years later, he re-interviewed these students about the same
experience. Surprisingly, approximately one in four students provided accounts
that differed significantly from their original entries. Yet even when
presented with the evidence of their own handwriting, many students insisted
that their current recollection was correct. One student exemplified this
reaction, saying, “That’s my handwriting, but that’s not what happened.”
This phenomenon illustrates cognitive dissonance: the
tendency to reinforce and defend incorrect information or beliefs rather than
accept new, conflicting knowledge. Instead of using fresh evidence to correct
uncertain or mistaken ideas, individuals may interpret it in a way that
reaffirms what they already believe, even if those beliefs are inaccurate or
outdated, which is confirmation bias at work. This attachment to our perceived certainties
can influence and distort subsequent perception and learning. As a result,
people may choose to “know” things that are merely personal interpretations or
entirely unpredictable events rather than remain open to uncertainty or
withhold judgment in the face of ambiguity.
Certainty bias sits at the opposite end of the range from awareness of our many errors in cognition: our collective ignorance of the many ways we can (and do) go wrong, driven by our longing to believe in a sure thing that can never let us down. Does this give us any useful clues to understanding the origins of religion and an omniscient God? Certainly.