“Courage
is willingness to take the risk once you know the odds. Optimistic
overconfidence means you are taking the risk because you don't know the odds.
It's a big difference.”
From The Washington Post, March 10, 2020: When Coronavirus Is
Growing Exponentially, Everything Looks Fine Until It Doesn’t. There’s an old
brain teaser that goes like this: You have a pond of a certain size, and upon
that pond, a single lily pad. This particular species of lily pad reproduces
once a day, so that on day two, you have two lily pads. On day three, you have
four, and so on. Now the teaser. “If it takes the lily pads 48 days to cover
the pond completely, how long will it take for the pond to be covered halfway?”
The answer is 47 days. Moreover, at day 40, you’ll barely know the lily pads
are there. (Megan McArdle, 3/10/2020)
In answering the lily-pad question, many people (over half) choose day 24. They divide the time frame in half as if it were a linear equation, when it’s actually an exponential one. The same mental failure happens when people calculate simple interest as compound interest, or vice versa. The correct answer shows the danger of mis-framing a problem with the wrong operating assumption. What is needed is an outside expert to challenge the operating assumption, “I can’t fix this problem because of X,” when the fix actually resides with Y, a separate but related aspect.
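To make the arithmetic concrete, here is a minimal Python sketch (our illustration, not McArdle’s; the $1,000-at-7% interest figures are arbitrary examples) that simulates the doubling lily pads and contrasts simple with compound interest:

```python
# Lily pads double daily: coverage on day d is 2**(d - 1) pads.
# If the pond is full on day 48, it is half full on day 47,
# because a single further doubling finishes the job.
full_day = 48
pond_size = 2 ** (full_day - 1)  # pads needed to cover the pond

for day in (24, 40, 47, 48):
    coverage = 2 ** (day - 1) / pond_size
    print(f"Day {day}: {coverage:.6%} of the pond covered")
# Day 24 (the popular wrong answer): ~0.000006% covered
# Day 40: ~0.39% covered -- you'll barely know the lily pads are there
# Day 47: 50% covered; Day 48: 100% covered

# The same linear-vs-exponential confusion appears with interest:
principal, rate, years = 1000.0, 0.07, 30
simple = principal * (1 + rate * years)     # linear growth
compound = principal * (1 + rate) ** years  # exponential growth
print(f"Simple interest:   ${simple:,.2f}")    # $3,100.00
print(f"Compound interest: ${compound:,.2f}")  # ~$7,612.26
```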
Robert A. Burton, MD, writes in On Being Certain (2008):
Anyone who’s been frustrated with a difficult math problem
has appreciated the delicious moment of relief when an incomprehensible
equation suddenly makes sense. We
“see the light.” This aha! is a notification from a subterranean portion
of our mind, an involuntary all-clear signal that we have grasped the heart of
a problem. It isn’t just that we can
solve the problem; we also “know” that we understand it (p. 3).
Finding the answer to a problem, especially a difficult and prolonged one, has all the earmarks of the brain’s endorphin reward, the shared mechanism of drugs, sex, and food. It is the opposite of the process of Getting Lost (which we have documented before in this blog). But its achievement is blocked by the overconfidence that most clients exhibit in keeping their problems going. The problem automatically stays on track because the techniques and assumptions being applied are not helping or clarifying the situation; they are obscuring its solution.
J. Edward Russo and Paul J. H. Schoemaker, in their book on judgment errors in decision making (Decision Traps, 1994), report that a majority of business managers chose the wrong cause of death (the second choice) in each of three pairs of options: lung cancer vs. car accidents, emphysema vs. homicide, tuberculosis vs. fire. In each pair the first cause actually kills more people, yet the “availability bias” drives the majority toward the more volatile and dramatic option. This is the human tendency to make judgments based not on rational fact-finding but on what comes most easily to mind, which is the dramatic content featured most frequently in media coverage.
Such judgment errors are part of “confirmation bias,” the mind’s searching out and focusing only on evidence that supports a preexisting belief. Emshoff and Mitroff (at the Wharton School) studied strategy formation at dozens of companies and discovered that the companies’ researchers were seeking out only evidence to support already formulated strategies. Why is this important? It helps explain why problems remain unsolved, and why research conducted in the same biased channel, along the same line of assumptions, cannot lead to any new or helpful insights.
In the current health crisis, determining the real risks of the Covid virus to human life requires expert ability to identify the virus’s actual behavior, in order to predict where it is going and when the lockdown can be relaxed or ended. The lily-pad-pond story above has been cited as an example of viral growth that is hard to wrap our more intuitive, linear minds around.
In his classic 1988 book Innumeracy: Mathematical Illiteracy and Its Consequences, John Allen Paulos, professor of mathematics at Temple University, pointed to such mental errors in pseudoscience, parapsychology, newspaper psychics, and stock-market filtering scams; in later commentary he turned the same lens on the vote count in the 2000 presidential election.
"The
fast-thinking and gut feelings usually dominate," says Paul Slovic, professor
of psychology at the University of Oregon and author of several books on risk
perception and behavioral sciences. "This is because it's natural and
easy, and most of the time we trust our intuitions to give us good guidance in
our daily lives.” Until, that is, this intuition fails us, when we encounter problems that don’t yield to our usual problem-solving judgments. That failure is why we need to examine how we approach difficult or complex issues, and why consumer problem-solving requires expert input.
"The way
the virus spreads, everything is under control until it isn't — that's the
nature of exponential growth," Slovic said. "Our minds think
linearly, at a constant rate of growth, but this is a nonlinear process. It's a
natural tendency for most of us to underestimate the speed at which an
exponential process will take off, and then suddenly it overwhelms us."
(Source: https://www.msn.com/en-ca/news/other/humans-arent-wired-to-comprehend-a-pandemic-heres-why/ar-BB12oF07)
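Slovic’s warning can be shown in a few lines of Python (our sketch, with invented case counts): fit a straight line to the first ten days of a doubling process, then compare its day-20 forecast with what the process actually does.

```python
from statistics import linear_regression  # Python 3.10+

# Hypothetical cases that double every day: 1, 2, 4, ..., 512 by day 10.
days = list(range(1, 11))
cases = [2 ** (d - 1) for d in days]

# A "linear mind" fits a straight line to what it has seen so far.
slope, intercept = linear_regression(days, cases)

day = 20
linear_forecast = slope * day + intercept  # about 733
actual = 2 ** (day - 1)                    # 524,288

print(f"Linear forecast for day {day}: {linear_forecast:,.0f}")
print(f"Actual count on day {day}:    {actual:,}")
# The straight line predicts a few hundred cases; the doubling
# process delivers over half a million -- "suddenly it overwhelms us."
```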
So misapprehending the system in which the problem is operating is key to the breakdown in solving it. The needed re-set is to relocate the problem inside the correct theory or dynamic. The complexity of consumer culture is one example: part of the anxiety of consumer problems is that they are operating within a framework the consumer cannot readily access or understand how to work with.
The same dynamic operates in the current Covid anxiety, which works to exaggerate both the severity and the outcomes of the pandemic. Anxiety is fed by catastrophizing when the mind is given a steady diet of news about fresh disasters. It also tends to grow a self-identity of helplessness and powerlessness, when in fact Cognitive Behavior Therapy can reveal that individuals under anxiety have far more power than they appear to acknowledge.
Fake intelligence is what we get when we take shortsighted shortcuts as a result of overconfidence in our judgment, believing (falsely) that these shortcuts will save us time and effort, when in fact they are costing us in bad thinking and bad conclusions. Overconfidence results in failure to frame the problem correctly for the best solutions, keeping us from assembling the information most relevant to the problem. “The key issue isn’t getting the right facts but challenging the right assumptions” (Russo & Schoemaker, p. 76). And the authors point out that for Americans especially, seeming confident is enough to convince others of the rightness of an answer, which in turn reinforces one’s own “feeling sure.”
Confidence checkup:
To illustrate this principle, and to expose its weakness in the face of facts, here are five questions with numerical answers. You don’t need to know the numbers; just give a range you are confident includes the answer. Then, reviewing the precise answers given below, note the gap between what you knew and what you thought you knew. Note: most managers missed 2-3 of the 5 answers (40-60%) completely, so that on average their ranges had only about a 50% chance of including the answer. Here they are (p. 71). Spoiler alert: answers, and a quick way to score yourself, at the end.
- Diameter of the moon, in miles
- Age at which Martin Luther King died
- Weight of an empty Boeing 747, in pounds
- Air distance in miles, London to Tokyo
- Deepest known point in the oceans, in feet
How did your guesses go? Do they show an overconfidence that indicates problems in other areas: life choices, grasp on reality, social estimating, math skills, or problem-solving in general? Don’t worry; even people in charge don’t have a solid sense of their own abilities. That’s where the ability to learn systems guided by knowledge rather than judgment shows there is hope for improvement.
The first step is recognizing the state of the art of human thinking and decision making, including our overestimation of common sense, and the consequences that follow from it.
Answers:
1) The diameter of the moon is 2,160 miles.
2) Martin Luther King was 39 years old when he was killed.
3) An empty Boeing 747 weighs 390,000 pounds.
4) The air distance from London to Tokyo is 5,959 miles.
5) The deepest known point in the oceans is 36,198 feet (6.85 miles).
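For readers who want to score themselves, here is a minimal Python sketch (our addition; the sample ranges are placeholders for your own guesses) that checks how many of your confidence ranges captured the true values above:

```python
# True values from the five questions above.
true_values = {
    "Moon diameter (miles)": 2_160,
    "Martin Luther King's age at death": 39,
    "Empty weight of a Boeing 747 (pounds)": 390_000,
    "Air distance, London to Tokyo (miles)": 5_959,
    "Deepest known ocean point (feet)": 36_198,
}

# Replace these (low, high) pairs with your own ranges.
my_ranges = {
    "Moon diameter (miles)": (1_000, 3_000),
    "Martin Luther King's age at death": (35, 45),
    "Empty weight of a Boeing 747 (pounds)": (100_000, 250_000),
    "Air distance, London to Tokyo (miles)": (4_000, 7_000),
    "Deepest known ocean point (feet)": (20_000, 30_000),
}

hits = 0
for question, truth in true_values.items():
    low, high = my_ranges[question]
    captured = low <= truth <= high
    hits += captured
    print(f"{question}: {'hit' if captured else 'MISS'} (true value: {truth:,})")

print(f"Captured {hits} of {len(true_values)}; "
      "well-calibrated ranges should capture nearly all of them.")
```

With the placeholder ranges, this prints three hits and two misses, right in line with the 2-3 misses Russo and Schoemaker report for most managers.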