Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Welcome back to the Deep Dive. This is the show where we really
get into some fascinating ideas, break them down, and hopefully
give you something useful to think about.
Absolutely. Taking complex stuff and making
it, well, clearer. Exactly.
And today, we're diving deep into our own minds, actually
exploring why we often make decisions that, you know, when
(00:21):
you look back, they just don't quite add up.
Like that feeling when you buy something you didn't really
need. Totally, like maybe you're
scrolling online and suddenly this banner pops up, limited
stock remaining, and bam, you feel this, like, urgent need for it.
Oh yeah.
Or maybe, you know, holding on to some old gadget that barely
works because, well, you think I've put so much effort into
(00:42):
fixing this thing already. Right, the I've-come-this-far
trap. We all do it.
And the thing is, these aren't just random personal quirks.
They're actually systematic slips, predictable ways our
minds work, or maybe misfire sometimes.
Dobelli calls them cognitive errors.
And that's what's so interesting.
They're not just random. So our mission today is to
(01:03):
unpack some really powerful, actionable insights from a
fantastic book. It's like a shortcut to
understanding these mental traps.
We're diving into Rolf Dobelli's The Art of Thinking Clearly.
Now, Dobelli, he's not exactly a research
scientist himself, right? He's more like a brilliant
curator, a synthesizer. Yeah, a translator, almost.
(01:24):
He takes decades of amazing research from fields like
behavioral economics, cognitive psychology, stuff that could be
pretty dense. Definitely academic.
And he makes it incredibly accessible, relatable even, and
honestly pretty entertaining too.
And the core idea, the real hook of the book, isn't just that we
sometimes make bad calls. It's much deeper than
(01:45):
that. Right. It's that our mistakes, our errors in
judgment, they're systematic. They're predictable ways we
deviate from pure logic. Exactly.
These aren't just one off flukes.
They're ingrained patterns, mental shortcuts, heuristics
that maybe, you know, helped our ancestors survive, but now in
our super complex modern world, they often lead us completely
(02:06):
astray, like consistently overestimating what we know.
Maybe we're way more scared of losing 10 bucks than
excited about gaining 10 bucks. The loss aversion thing, yeah.
These aren't unique flaws, they're pretty much universal
quirks of how humans think. That's the real kicker, isn't
it? These cognitive errors, they're
baked in, evolutionary leftovers.
Our brains, amazing as they are, evolved for a world that, well,
(02:30):
it just doesn't exist anymore. A much simpler world, hunter
gatherer times. Where you needed snap
judgements. Is that rustle in the grass
dinner, or is it going to eat me for dinner?
Fast thinking was key. Right, no time for a spreadsheet
analysis. But now in our hyper connected,
information overloaded, financially complex world, those
same quick thinking patterns, those mental shortcuts, they can
(02:53):
send us down some really illogical rabbit holes.
And when you start connecting the dots, you see how these
biases creep into almost everything. Managing money.
Career choices. Even just scrolling through news
headlines or picking what to watch next.
Understanding them isn't just like an academic puzzle, it's a
genuinely powerful tool for thinking more clearly.
(03:14):
So what can you expect today? Well, we're going to unpack some
of the most intriguing biases Dobelli covers.
We'll use his examples, his analogies, some really quite funny,
and real-life stories to make these
ideas stick. The goal is really to equip you,
the listener, with the awareness, to make clearer
choices, to spot the hidden traps in your own thinking
(03:35):
maybe, and in how others argue. And maybe just have a little
chuckle at how predictably weird our brains can be sometimes.
It's fascinating stuff. OK, let's jump in.
Let's tackle this first big idea, something that really
challenges how we think about success.
You ever look at someone who's just wildly successful, a CEO, a
famous artist, whatever, and your first thought is just wow,
pure genius, must be. That intuitive leap, yeah,
(03:58):
talent just radiates off them. Well, this first batch of
insights from Dobelli's book really pours some cold water on
that idea. It shows us that what you see,
well, it's often not the full picture and what you don't see
that might be the real story. Yeah, this is where we get into
what the author calls the survivorship bias, and tied
closely to it, the swimmer's body illusion.
(04:19):
Survivorship bias, at its core, is our tendency to really
overestimate the odds of success because, well, failures are
basically invisible. Invisible.
How so? We only see the survivors,
right? The companies that made it big,
the authors who got published, the athletes who won gold.
We analyze them. But the vast, vast majority who
tried and failed, they disappear.
(04:40):
They're not in the news, they're not studied.
They're just gone. It's like walking downtown
seeing all these impressive successful buildings and
thinking wow, building skyscrapers must be pretty
straightforward. You don't see the empty lots
where projects failed or the half built structures that ran
out of money.
Exactly. Those failures are cleared away,
out of sight, out of mind. So we get this totally skewed
picture of how easy it is to succeed.
(05:02):
Dobelli uses the example of famous authors, right?
He does. For every huge bestseller, every
household name, there are literally thousands, maybe tens
of thousands of aspiring writers whose books never get picked up,
maybe never even get finished. They're in what he calls
the graveyards of the unsuccessful.
Grim picture. But true, the media focuses on
(05:25):
the triumphs. Understandably so.
Our perception of the tiny, tiny chance of becoming a literary
star gets blown way out of proportion.
And it's the same for startups, artists, musicians, you name it.
We see the peak, not the massive base of the pyramid beneath it.
OK, so that's survivorship bias. What about the swimmer's body
illusion? Is that related?
Very related. It's about confusing selection
(05:45):
factors with results. We assume something causes a
good outcome when maybe it just selects for people who already
have those traits. Like the Harvard example in the
book. Perfect example.
Harvard has this incredible reputation for churning out
successful, smart people. But, Dobelli asks, does Harvard
make them smart? Or does Harvard just do an
incredibly good job of recruiting the smartest, most
(06:06):
driven students from all over the world to begin with?
Ah, so it's the filter, not necessarily the factory.
Precisely. Think about professional
swimmers. They tend to have these amazing
physiques, right? Sleek.
Muscular. But does swimming give them that
body? Or are people with that kind of
natural build more likely to excel at swimming and stick with
it? Right, It's probably a bit of
(06:27):
both, but the selection part is huge.
You need the right raw materials.
Exactly. Harvard's tough selection
process is the filter.
It ensures they start with incredibly high potential
individuals, so the success of their graduates might say more
about their admissions office than their classrooms.
In a way, it's a classic mix up of correlation and causation.
That's such a crucial point. So the practical take away for
(06:48):
you listening is what? Be skeptical of success stories.
Maybe not skeptical, but discerning.
When you see amazing success, a person, a company, maybe a new
fitness trend, pause. Ask yourself, what am I not
seeing? Are there failures I don't know
about? Is this success the result of
the process itself, or did the process just select the best
(07:10):
candidates to begin with? Do the digging yourself.
Basically, look beyond the glossy surface.
Yeah, try to find the graveyard. Look for the selection effects
before you assume it's all down to skill or some magic formula.
It helps avoid mistaking a narrow filter for a miracle
cure. Which really begs that question,
what isn't being shown? What's hidden?
(07:30):
Always worth asking. You know, it's wild how our
brains just find patterns or miss obvious things.
We talked about ignoring invisible failures, but
sometimes, even when the data is right there staring us in the
face, our brains still manage to trip us up.
Especially with statistics, averages can be so slippery.
Oh, absolutely. This leads us right into the law
(07:51):
of small numbers. Basically, the smaller the
sample size you're looking at, the more likely you are to see
wild swings, extreme results that might have nothing to do
with reality. OK, unpack that a bit.
Small samples, extreme swings. Think about it, if you flip a
coin just four times, getting 3 heads doesn't feel that weird,
right? 75% heads.
But if you flipped it 1000 times and got 750 heads, you'd think
(08:14):
the coin was seriously rigged, right?
Bigger sample, closer to the true average, which should be
5050. Exactly.
Small samples are just inherently more volatile.
They jump around more and we as humans, we tend to see those
jumps and think, aha, a pattern,something significant is
happening here, when it might just be, well, random
(08:34):
noise because the group is small.
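To make that concrete, here's a minimal Python sketch, using just the coin numbers from the example, that computes the exact odds:

```python
from math import comb

# How often does a fair coin look "rigged"?
def p_at_least(heads, flips):
    """Exact probability of at least `heads` heads in `flips` fair tosses."""
    return sum(comb(flips, k) for k in range(heads, flips + 1)) / 2 ** flips

print(p_at_least(3, 4))       # 0.3125 -- 75% heads in 4 flips happens about 1 time in 3
print(p_at_least(750, 1000))  # astronomically small -- 75% heads in 1000 flips essentially never
```

Same coin, same 75% heads; only the sample size changed.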
Dobelli has that great example about the retail stores and
shoplifting, doesn't he? He does.
Imagine this big company, thousands of stores.
A consultant finds that the stores with the highest theft
rates are mostly in rural areas. The CEO is furious, ready to
spend millions on security for all rural stores.
(08:54):
Seems logical based on the data. But wait, if the consultant had
also looked at the stores with the lowest theft rates, where
would they likely be? Also rural areas because they're
smaller. Bingo.
Rural stores are often smaller. Fewer transactions, fewer
customers, which means their theft rates just by chance are
going to bounce around more wildly.
They'll naturally pop up at the extreme high end and the extreme
(09:16):
low end of any ranking, just because of the law of small
numbers. The consultant only looked at
one extreme. So it wasn't necessarily that
rural areas had more theft overall, just more variance in
their rates because the stores were smaller.
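You can simulate the consultant's mistake directly. In this sketch the store counts, store sizes, and theft rate are all invented, and every store shares the exact same true rate; small stores still dominate both ends of the ranking:

```python
import random

random.seed(1)
TRUE_RATE = 0.02  # every store, rural or urban, has the SAME true theft rate

def observed_rate(n_transactions):
    """Simulate one store's measured theft rate over n transactions."""
    thefts = sum(random.random() < TRUE_RATE for _ in range(n_transactions))
    return thefts / n_transactions

# 500 small rural stores vs 500 big urban stores (sizes invented for illustration)
stores = [("rural", observed_rate(200)) for _ in range(500)]
stores += [("urban", observed_rate(10_000)) for _ in range(500)]

ranked = sorted(stores, key=lambda s: s[1])
print("lowest theft rates: ", [kind for kind, _ in ranked[:10]])
print("highest theft rates:", [kind for kind, _ in ranked[-10:]])
# Both extremes come back overwhelmingly "rural" -- pure sampling noise.
```

Rank the stores by observed rate and the small ones crowd both the top and the bottom of the list.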
Precisely. It's a statistical artifact, not
a real trend. Dobelli also has that funny bit
about the fictional National Institute of Unnecessary
(09:37):
Research concluding startups hire geniuses because their
average employee IQ is super high.
Right, because startups have like 5 employees, so one really
smart person or even 1 less smart person skews the average
massively compared to a company with 5000 people.
It's just math. Smaller denominators lead to
bigger fluctuations. OK, that makes sense.
(09:59):
But then there's this other thing that sounds like pure
statistical witchcraft, the WillRogers phenomenon.
Yes, stage migration. This one is genuinely mind
bending. The idea is you can move
elements between two groups, and somehow the average of both
groups seems to improve without anything actually getting better
overall. It's named after Will Rogers,
the humorist who joked that when folks left Oklahoma and moved to
(10:22):
California during the Dust Bowl, it raised the average
intelligence in both states. How on earth does that work?
Nobody's getting smarter. It's purely a trick of averages
and recategorization. Imagine two hedge funds.
Fund A is amazing. Fund B is
kind of meh. You want to make both look
better, so you take a few investments from fund A that
(10:42):
are, let's say, underperforming for fund A.
They're still decent, just not as stellar as Fund A's average.
You move them over to Fund B. What happens?
Fund A's average goes up because you removed its lowest
performers, and Fund B's average also goes up because it
just received investments that are better than its previous
average. Wow, so no new value created and
(11:02):
nothing actually improved in the market, but both funds look
better on paper? That's sneaky.
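The arithmetic behind the trick is easy to verify yourself. Here's a toy version in Python with invented returns, not figures from the book:

```python
# Stage migration: move one middling element and both averages rise.
fund_a = [12.0, 10.0, 8.0, 6.0]   # invented returns, average 9.0
fund_b = [4.0, 3.0, 2.0]          # invented returns, average 3.0

def avg(xs):
    return sum(xs) / len(xs)

print(avg(fund_a), avg(fund_b))   # 9.0 3.0

# 6.0 sits below Fund A's average but above Fund B's -- the key condition.
fund_a.remove(6.0)
fund_b.append(6.0)
print(avg(fund_a), avg(fund_b))   # 10.0 3.75 -- both improved, nothing changed
```

Any holding below the first group's average but above the second's will pull both averages up.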
It's clever. It's about how you define the
sets. Dobelli also uses the car
dealership example: two branches, different salesmen with
different sales numbers. You can carefully move just one
salesman from one branch to
average sales per salesman can increase in both branches.
(11:23):
Just by moving one person. If you pick the right person and
the numbers line up correctly, yes it feels like magic, but
it's just manipulating averages by shifting the boundaries of
the groups. So the practical take away here
is be super suspicious of averages, especially when things
seem to improve out of nowhere. Definitely ask how the average
(11:44):
was calculated. Ask what changed.
Is this real improvement or just stage migration?
Especially with small groups or when comparisons are made over
time after some kind of restructuring or
recategorization. Always ask what isn't being
shown here. What's hidden behind the
numbers, right? OK, so data can fool us, but
(12:04):
what about our own past actions? Our own history?
Turns out our brains are incredibly good at making us
stick with things, even when objectively we probably
shouldn't. Ah yes, the deep seated
reluctance to change course. This brings us squarely into the
territory of the sunk cost fallacy.
Sunk costs? I feel like I hear this term a
lot, but what does it really mean in practice?
It's our tendency, our powerful urge really, to keep investing
(12:27):
time, money, effort, even love into something just because
we've already put so much in. Even if rationally looking
forward, it's a losing proposition.
So the resources we've already spent, the sunk cost, they're
gone. Irretrievable.
Yeah. They shouldn't logically
influence our future decisions, right?
Future decisions should only be about future costs and future
(12:47):
benefits. Rationally, yes.
But emotionally, psychologically, we find it
incredibly hard to let go of those past investments.
It feels like admitting defeat, like all that previous effort
was wasted. It's that feeling of I've come
this far. Exactly.
It's the classic throwing good money after bad.
Dobelli gives some great, very relatable examples.
(13:08):
The marketing manager fighting to keep a failing ad campaign
alive because we spent so much on it already.
Or the friend stuck in a bad
invested years of my life into this?
Yep, or investors holding onto stocks that are plummeting,
thinking I can't sell now I've lost too much, ignoring the fact
that the future prospects might be even worse.
(13:29):
The more we invest, paradoxically, the harder it
becomes to cut our losses. Wasn't the Concorde jet a famous
example of this on a huge scale? Absolutely, a textbook case.
Despite mounting evidence that it was becoming a financial
black hole, the sheer amount of national pride and, crucially,
money already sunk into the project made it almost
(13:51):
impossible for the governments involved to pull the plug.
The past investment dictated the future, illogically.
So for you listening, the big warning sign is hearing that
voice in your head saying, But I've already put so much into
this. That should be a huge red flag.
When you hear that, stop. Step back.
Ask yourself, if I were starting from scratch today, knowing what
I know now, would I still make this investment?
(14:13):
Forget the past costs and focus only on the future potential
versus future costs. It's about making a rational,
forward-looking choice, not being chained to the past.
Sometimes the smartest move is knowing when to walk away.
It takes courage, but it's often the right call.
OK, so our brains cling to bad past decisions because of sunk
costs, but they're also actively messing with how we see the
(14:34):
present, right? Filtering information to protect
what we already believe. Oh, absolutely.
And this is where we hit what Dobelli calls the mother of all
misconceptions. A truly powerful bias.
Confirmation bias. The mother of all
misconceptions. Wow, that's a heavy title.
It earns it. Confirmation bias is our deep,
often unconscious tendency to seek out, interpret, and
(14:57):
remember information in a way that confirms our preexisting
beliefs or hypotheses, and just as importantly, to ignore,
dismiss, rationalize away, or conveniently forget any evidence
that contradicts those beliefs. So we basically see what we want
to see or what we already expect to see.
Pretty much. As Aldous Huxley said, facts do not cease to
exist because they are ignored. But confirmation bias makes us
(15:19):
masters of ignoring inconvenient facts.
Warren Buffett nailed it too, saying something like what
humans do best is interpret all new information so their prior
conclusions remain intact. It's like we have these internal
confirmation goggles on. Exactly.
We're not neutral observers. We're actively constructing a
reality that fits our narrative. It's an echo chamber built right
(15:40):
into our own heads. You see this everywhere, right?
Like that example of the executive team launching a new
strategy. Classic.
They launched the strategy and suddenly they see tons of
evidence it's working. Every little positive sign is
hailed as proof. A tiny uptick in sales?
Brilliant. A single good customer comment?
See it's working. But what about the negative
(16:01):
signs? Missed targets?
Competitors doing better? Those are just exceptions,
outliers, special circumstances. They get explained away,
minimized, swept under the rug. The team becomes almost blind to
anything that doesn't fit the success narrative they want to
believe. It's scary how powerful that
filtering can be. Dobelli mentions
Charles Darwin had a specific trick to fight this, didn't he?
(16:22):
He did, and it shows incredible intellectual discipline.
Darwin knew how easily the brain forgets contradictory evidence,
so whenever he came across an observation or a piece of data
that seemed to challenge his theories, he made a point of
writing it down immediately and in detail.
Why immediately?
Because he knew if he waited, his confirmation bias would kick
in and he'd either forget it or rationalize it away.
(16:45):
The more convinced he was of a theory, the harder he looked for
evidence against it. He forced himself to confront
the contradictions. That's amazing self-awareness.
So the take away for us? Maybe listen for the word exception.
That's a great tip. When you hear yourself or
someone else saying, oh, that's just an exception, pause.
Is it really an exception or is it disconfirming evidence being
(17:08):
dismissed? And maybe try Darwin's method.
Actively look for ways you might be wrong, especially when you
feel really certain about something.
It's uncomfortable. Nobody likes being wrong, but
actively seeking out contradictory views, genuinely
trying to understand the other side?
That's a powerful antidote to confirmation bias.
What stands out to you about this constant battle with our
(17:28):
own brains? It's humbling, isn't it?
Definitely. It feels like we're constantly
needing to second guess ourselves.
OK, OK, so moving on. It's amazing how much our
perception gets warped by just context, little details, what we
compare things to, or even just touching something or owning it.
Yeah, our sense of value and reality is surprisingly
malleable. This brings us to a couple of
(17:50):
related biases: the contrast effect and framing.
The contrast effect? That's about comparisons.
Right, exactly. We judge things not in
isolation, but in comparison to something else we just
experienced or some reference point.
Something seems better or worse, bigger or smaller, cheaper or
pricier, purely because of what it's contrasted with.
Like the sale pricing: a $100 item marked down to $70 feels
(18:13):
like a steal. But if it was just always $70,
you might not even notice it. The original $100 price acts as
an anchor, making the $70 look incredibly attractive by
contrast. Dobelli tells that hilarious
story about the brothers Sid and Harry and the suit store.
Oh, that's a classic. Sid pretends to be hard of
hearing. Customer likes a suit.
Sid yells to Harry in the back. How much for this suit?
(18:35):
Harry yells back a deliberately inflated price.
Say $42.00. Which is way too high.
Sid then pretends to mishear, cups his ear, and tells the
customer he said $22.00. The customer, thinking they've
stumbled on an amazing bargain due to Sid's mistake, pays up
quickly before anyone notices. Brilliant.
Pure contrast effect exploitation.
(18:57):
Absolutely, and this effect also explains why we barely notice
inflation sometimes. It's a slow, gradual creep.
If the government imposed that same loss of value as a sudden,
obvious tax, people would be outraged.
But because it happens little by little compared to yesterday, it
just fades into the background. OK.
So that's contrast. What about framing?
(19:17):
How is that different? Framing is about how the
presentation of information influences our choices, even if
the underlying facts are identical.
It's not just what you say, but how you say it.
The language, the context, the spin.
We react to the packaging, not just the contents.
Precisely. We react to how things are glossed over or
emphasized. The classic example is the meat
packaging. 99% fat free sounds way healthier to most people
(19:41):
than contains 1% fat. Even though it's the exact same
amount of fat. Exactly.
Yeah, the positive frame fat free is just more appealing.
You see this constantly in corporate speak and politics too.
Oh yeah. A problem becomes an opportunity
or a challenge. Getting fired is reassessing
your career. A stock market crash is a
correction. It's all about putting a
(20:03):
positive or less negative spin on reality.
So the practical advice, be super aware of framing in
comparisons. Try to strip away the
presentation and look at the core facts.
Evaluate something on its own merits, not just against an
arbitrary anchor or how it's been nicely wrapped up for you.
Peel back the gloss. OK, and speaking of value, this
(20:24):
next one is weird. We value things more just
because we own them. The endowment effect.
Yes, it's a powerful and often irrational one.
We simply place a higher value on things merely because they
are ours. The act of possession itself
seems to inflate their worth in our eyes.
It's like the minute you drive a new car off the lot, suddenly
you wouldn't sell it for what you just paid, even though
rationally its market value just dropped.
(20:45):
Exactly. Your psychological ownership
kicks in. This bias messes up negotiations
all the time because sellers influenced by it genuinely feel
their item is worth more than the objective market value.
Buyers lacking that emotional connection see it differently.
It's not just calculation, it's emotional attachment. Dobelli
(21:07):
mentions that basketball ticket study, right?
A fantastic illustration. Students who didn't win lottery
tickets for a big game valued them at around $170.
Reasonable. But the students who won the tickets?
They wouldn't sell for less than an average of $2400. 14 times
more just because they had the ticket in their hand.
The endowment effect in action. You see it constantly in real
(21:29):
estate too. People get emotionally attached to their homes, the
memories, the work they put in, and they genuinely believe
buyers should pay a premium for that sentiment, which of course
buyers won't. They're selling the memories
along with the house. And Dobelli points out, it even
works with near ownership, like in auctions, if you're bidding
aggressively, you start to feel like the item is almost yours.
Right, you get invested. That feeling makes you willing
(21:51):
to bid higher, sometimes irrationally
higher because pulling out now feels like a loss.
It can lead straight to the winner's curse, where you win
the auction but you've massively overpaid.
So the advice is try to detach emotionally when buying or
selling. Try.
Ask yourself, what would I pay for this if I didn't own it?
Or if selling, what's the realistic market price,
(22:13):
forgetting my personal attachment? Easier said than done, but
awareness helps. OK, but it gets even weirder
than ownership. Sometimes it's just the history, the
essence of an object that affects us.
Now we're entering the slightly spooky realm of contagion bias.
This is fascinating. Contagion.
Like catching something. Sort of.
But psychologically, it's the irrational belief that
(22:36):
properties, feelings, or some kind of essence can transfer
from a person to an object, or from object to object, just
through contact or association, even when there's no logical
basis for it. It sounds like magical
thinking, like voodoo dolls or something.
It taps into something quite primal.
Dobelli notes that even very rational modern people struggle
with this. It's an emotional gut reaction,
(22:57):
not a logical one. He uses a provocative example.
Would you wear a perfectly clean, laundered sweater once worn by
Hitler? Oof.
No. Instantly, no. Even though rationally.
Rationally, it's just a sweater,
clean fabric, no molecules of evil attached.
But the association, the perceived contagion of his
essence, creates this powerful aversion.
(23:19):
It's visceral. He tells that story about the
war correspondent and Saddam Hussein's wine glasses, too.
Right. Yeah.
She brings them back, uses them at a dinner party.
When she reveals they were Saddam's, one guest literally
spits his wine back into the glass.
Total disgust. Even knowing it's irrational,
the feeling of contamination through the object is
overwhelming. Wow.
So it's not just negative associations there.
(23:40):
No, it works positively too, though maybe less intensely.
Think celebrity memorabilia or historical artifacts.
Part of the value comes from that perceived connection, that
essence. Dobelli mentions medieval
bishops using saints' relics, bones, cloth, believing their
holiness transferred, to influence people.
It has deep roots. So the take away is notice when
(24:02):
your feelings about an object's history might be skewing your
judgment of its actual worth or usefulness.
Exactly. Recognize when that magical
thinking part of your brain might be overriding logic.
It's fascinating how persistent these ancient-feeling biases are
even now, like a little bit of the medieval world still
rattling around in our skulls. Totally.
(24:23):
OK, so as we build more complex systems in society, economics,
work, even just trying to be
productive, we seem to find new ways for our thinking to go
sideways. Sometimes even things that seem
like good ideas, like offering rewards or giving people more
choices, can totally backfire. Right.
This leads us into a really counterintuitive area:
motivation crowding.
Motivation crowding?
(24:43):
OK, what's that about? The core idea is surprising.
If you offer money or some extrinsic reward for something
people were already doing out of intrinsic motivation, like civic
duty, kindness, passion, you can actually kill that original
motivation. Wait, offering money makes people
less motivated? That seems backwards.
It does. We assume money always
(25:04):
motivates, but Dobelli shows it can crowd out or replace other,
often more powerful motivations like goodwill or community
spirit. It shifts the frame from a
social act to a purely financial transaction.
Can you give an example? The Swiss nuclear waste one is
striking. That's a perfect one.
Researchers asked residents of a Swiss village whether they'd accept a
nuclear waste site nearby. Initially, about half said yes,
(25:27):
citing things like national duty, shared responsibility.
It was a civic issue. OK, pretty public spirited.
Then the researchers introduced a financial incentive.
What if everyone got paid about $5000 a year for accepting it?
What happened? Support plummeted to below 25%.
Wow. The money made it worse.
Yes, it turned a civic duty into what felt like a bribe.
(25:48):
It cheapened the act. People thought if they have to
pay us this much, it must be really dangerous, or simply my
civic spirit isn't for sale. The money crowded out the
goodwill. And the kindergarten late fee
example. Another classic, a daycare
introduced a small fine for parents picking up kids late,
hoping to reduce lateness. Instead, lateness increased
(26:10):
because the fine essentially put a price on being late.
It removed the social awkwardness or guilt and turned
it into a service they could just pay for.
I'll be late, but it just cost me a few bucks.
Transactional thinking replaced social obligation.
Dobelli even mentions a personal story about offering a
friend money for a favor. Yeah, and how it immediately
felt awkward, like it undermined the friendship, the act of
(26:31):
goodwill. So the practical point for you
is crucial. Think carefully about why
someone is doing something before you offer an incentive.
Is it intrinsic passion, community spirit?
Don't assume money is always the answer.
Sometimes it poisons the well. Are you dealing with a social
exchange or a market exchange? That's a really important
distinction. OK, Next up, something that
(26:53):
feels very modern, the paradox of choice.
Ah yes, the tyranny of too much choice.
It feels like we should want more options, right?
More choice equals more freedom,more control.
Look at streaming services, supermarkets, online dating
options everywhere. We're conditioned to think more
is always better. But Dobelli, citing Barry
(27:16):
Schwartz's work and others, shows that while some choice is
good, too much choice can actually lead to problems.
Like what kind of problems? Well, first decision paralysis.
So many options it becomes overwhelming.
You can't compare them all, you get stressed and sometimes you
just give up and choose nothing.Analysis paralysis.
Exactly. And second, even if you do choose,
you often end up less satisfied with your choice.
(27:37):
Why less satisfied? You had more options to find the
perfect thing. Because with so many
alternatives, you're constantly plagued by the thought that
maybe, just maybe, one of the other options you didn't choose
would have been slightly better. You second guess yourself.
The opportunity cost feels huge. The fear of missing out on the
perfect choice. Precisely. That famous jam
(27:58):
experiment illustrates this perfectly.
Right, the supermarket tasting booth.
They offered either 6 flavors of jam or 24.
When faced with 24 options, people were intrigued.
They tasted more, but they bought 10 times less jam than
when they only had six choices. 24 was just too much to process,
6 felt manageable. Exactly.
Too much choice led to paralysis and fewer sales, so the
(28:22):
practical advice here is pretty clear.
Limit your own options. In a way, yes.
Before you start looking, whether it's for a job, a
holiday, new phone, decide on your key criteria first.
What are your non negotiables? What's essential? Then
only consider options that meet those criteria.
Don't get lost scrolling through endless possibilities.
Filter before you browse. And accept that in a world of
(28:44):
infinite variety, the perfect choice probably doesn't exist,
or finding it isn't worth the effort.
Aim for good enough. As Dobelli suggests, good enough
is often the new optimum, especially for big life
decisions. Maybe even, dare I say, choosing
a life partner. Controversial, but it raises
that question, are you constantly striving for perfect
(29:06):
or can you be happy with good enough?
A crucial distinction, especially today.
OK. And as we're swimming in all
this information, all these choices, our brains create this
distorted map of the world, prioritizing dramatic stuff over
maybe more important, boring stuff.
That's the availability bias at work.
It's incredibly common. Availability, meaning what's
(29:27):
easily available in our memory. Exactly.
We overestimate the probability or importance of things that are
easily recalled. And what's easily recalled?
Things that are recent, vivid, dramatic, shocking, emotional,
or heavily reported in the media.
So we build this incorrect risk map, as Dobelli calls it.
Yes, we don't think quantitatively about risk.
We think dramatically. Consider plane crashes versus,
(29:50):
say, deaths from diabetes. Plane crashes feel way scarier,
way more prominent in our minds. They dominate the news when they
happen. They're vivid, terrifying, but
statistically your risk of dying from diabetes or complications
from it is vastly higher than dying in a plane crash.
But diabetes is, well, slow, common, less dramatic.
(30:10):
It doesn't make headlines in the same way.
So it's less available in our minds and we underestimate the
risk. Same with things like car
accidents or even murder versus something like stomach cancer.
Same principle. The dramatic, salient events
stick in our memory and inflate our perception of their
frequency, while the quieter, statistically more significant
risks fade into the background. Charlie Munger had a great point
(30:33):
about this. Buffett's partner, right?
Yeah, he warned against making decisions based only on what's
easily countable or measurable, while ignoring things that are
crucially important but harder to quantify or less readily
available in your mind. Better to be roughly right about
the important stuff than precisely right about the
trivial stuff. Don't let the vivid, easily
recalled stuff overshadow what actually matters most, even if
(30:55):
it's harder to grasp. That's the essence of it.
So for you, the practical take away is be really mindful of how
media coverage, sensational stories, or even just vivid
personal anecdotes might be warping your sense of risk or
importance. Actively seek out the boring
statistics, the actual data. Don't just rely on what easily
pops into your head. Your brain loves a good story,
(31:18):
but the numbers often tell the real story.
OK, wow. So we've covered a lot of ground
here. Journeying through Dobelli's The Art
of Thinking Clearly, we've seen how data tricks us with things
like survivorship bias. And the Will Rogers sleight of
hand. How our past decisions haunt us
through sunk costs and how confirmation bias makes us
filter reality. And then how context and
ownership warp value with contrast, framing, endowment,
(31:41):
even contagion bias. Plus the traps of motivation
crowding, the paradox of too much choice, and the
availability bias skewing our risk perception.
It's a minefield out there in our own heads.
It really is, but a fascinating one.
So after exploring all these ways our amazing brains can
systematically lead us astray, what's the bottom line?
What does this all mean for you listening right now?
(32:02):
How can we actually use this knowledge?
Well, it circles back to something fundamental that Dobelli
touches on, often illustrated by those simple
logic puzzles like the ones in the Cognitive Reflection Test or
CRT. Right, the bat and ball problem.
Exactly. It highlights that we basically
have two thinking systems. As Daniel Kahneman famously
described. System 1 is fast, intuitive,
(32:22):
automatic, effortless. System 2 is slow, deliberate,
analytical, effortful. And we default to system one
most of the time, right? It's easier.
It's the brain's energy saving mode, yeah, and it works fine
for lots of everyday things. But the problem is we often rely
on that fast, intuitive System 1 even when a situation calls for
the slower, more logical System 2.
(32:44):
That's when errors creep in. Like the bat and ball: they cost
$1.10 total, and the bat is $1.00 more than the ball.
How much is the ball? Your fast brain shouts $0.10,
almost irresistibly. But if you stop and engage System 2 to
do the math, the bat is $1.05 and the ball is $0.05,
total $1.10, with the difference exactly $1.00.
Takes effort though. Or the lily pad puzzle: a patch
(33:06):
doubles daily and covers the whole lake in 48 days.
When did it cover half the lake? Intuitive answer: 24 days, half
the time. Logical answer, engaging System 2:
day 47. If it doubled to cover the whole
lake on day 48, it must have been half full the day before.
It requires overriding that first easy impulse.
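If you want to let System 2 check both answers, a few throwaway lines of Python do the arithmetic. A minimal sketch, not anything from the book itself:

```python
# Bat and ball, worked in cents so the arithmetic stays exact:
total, diff = 110, 100       # $1.10 together, bat costs $1.00 more
ball = (total - diff) // 2   # 5 cents, not the intuitive 10
bat = ball + diff            # 105 cents
assert ball + bat == total
print(ball, bat)             # 5 105

# Lily pads: full lake on day 48, doubling daily -- walk one step back:
day, coverage = 48, 1.0
while coverage > 0.5:
    coverage /= 2
    day -= 1
print(day)                   # 47
```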
And the book notes that lots of smart people get these wrong.
(33:28):
It's not about raw intelligence necessarily, but about the
willingness to pause, question your intuition, and engage that
effortful thinking. Some studies even link lower CRT
scores, more intuitive thinking, with holding on to certain
beliefs more strongly, perhaps because they're less questioned.
So the practical step is recognize when a decision is
important and consciously shift gears.
(33:48):
Pretty much for big consequential choices,
deliberately slow down. Ask yourself, is my gut reaction
reliable here or do I need to switch on the analytical engine
even though it takes more work? It's like using manual override
when the automatic setting isn't quite right.
OK, which leads us to Dobelli's main message, maybe the biggest
take away from the whole book. I think it is, and it's actually
(34:08):
quite liberating. It's that we cannot be error
free. Trying to eliminate all
cognitive biases is impossible, probably even undesirable.
But we can absolutely get better at thinking clearly.
We can reduce the frequency and impact of these errors.
So it's not about becoming a perfect thinking robot.
Not at all. It's about awareness,
(34:29):
recognizing the patterns, knowing common pitfalls.
It's like knowing where the potholes are on a familiar road.
You can steer around them more effectively.
Being aware of that internal editor, the one that wants to
make you feel right all the time.
Awareness is the first step to intervention.
And Dobelli keeps coming back to Kahneman's two systems.
Know when to trust your fast intuition, System 1: low stakes,
(34:50):
familiar territory. And when to engage your slow rational brain,
System 2: high stakes, complex, unfamiliar.
He also stresses the power of negative knowledge.
Negative knowledge, knowing what not to do.
Exactly. He argues that knowing what
pitfalls to avoid, what errors lead you astray, is often more
practical and powerful than searching for some elusive
(35:13):
secret formula for success or perfect decision making.
Like Michelangelo chipping away everything that wasn't David.
Focus on removing the flaws, the errors.
That's a great analogy. Eliminate the mistakes, avoid
the biases we've talked about, and you naturally end up in a
better place. It's subtractive wisdom.
And then there's the circle of competence idea.
(35:34):
Borrowed from Warren Buffett. Know what you know, and perhaps
more importantly, know what you don't know.
Buffett understands balance sheets inside out.
That's his circle. Outside that, he's cautious.
For us, it means identifying our genuine areas of deep
understanding and expertise. And being really honest about
the boundaries of that circle. Brutally honest, because outside
(35:55):
that circle, our intuition is much less reliable and we need
to rely much more heavily on that slow, careful System 2
thinking, or on seeking advice from those whose circle does
cover that area. OK, so let's tie it all
together. How do we apply this day-to-day?
First, like we said, ditch the goal of being error free.
It's impossible. OK, permission to be human.
(36:15):
Good. For small stuff, low stakes
decisions, let intuition fly. Don't overthink choosing your
coffee. Save your mental energy.
Right, don't use system 2 on trivial matters.
But for the big decisions, career, money, relationships,
health, the ones with real consequences, that's when you
must slow down. Pull out your mental checklist
of biases. Ask, am I falling for sunk costs here?
(36:38):
Is confirmation bias blinding me?
Am I being swayed by framing? Am I inside my circle of
competence? Use the biases like a preflight
checklist for important decisions.
Exactly. And be rigorous.
Engage that rational brain and be super honest about your
circle of competence. Where do you really know your
stuff outside that circle? Extreme caution.
Rely on data, experts, and System 2, not your gut.
(37:01):
And embrace negative knowledge. Focus on avoiding the dumb
mistakes. Because dodging those known
pitfalls, those systematic errors we've explored today, is
probably the single most effective way, Dobelli
suggests, to achieve clearer thinking and make better choices
more consistently. It guides you almost by default
towards rationality. So we've taken quite the journey
(37:22):
today through Rolf Dobelli's The Art of Thinking Clearly.
We've really dug into these fascinating, sometimes funny,
often frustrating ways our minds systematically diverge from
logic. From survivorship bias hiding
failures to sunk costs chaining us to the past, confirmation
bias filtering our present, and things like endowment effect or
(37:43):
the paradox of choice messing with our sense of value and
satisfaction. It really underscores that key
take away: self-awareness. These biases aren't going away.
They're part of our mental toolkit.
But knowing they exist, that's like having a superpower.
It gives you the ability to catch yourself, hopefully before
you make a decision based on a faulty shortcut, the power to
pause and choose a more rational path.
So our encouragement to you is become a more curious observer
(38:07):
of your own mind, and maybe the minds of others too, gently.
Yeah, maybe keep that prediction journal Dobelli suggests.
Write down what you expect to happen, then check back later.
See hindsight bias in action. Or just listen.
Listen in conversations, read the news, watch ads, and see if
you can spot these biases playing out.
Once you start looking, you really do see them everywhere.
(38:27):
It's kind of addictive. That makes you a sharper thinker
for sure. And finally, just a thought to
leave you with, something to maybe chew on. If our brains, as
Dobelli suggests, evolved primarily not for seeking
objective truth, but for survival and reproduction, for
navigating a complex world well enough to pass on our genes.
(38:48):
What does that really imply? Yeah.
What does that mean for how we structure our societies, our
arguments, our quest for knowledge?
And maybe more personally, how much can we truly, deeply trust
our own intuitive thoughts, our gut feelings, those moments of
absolute certainty? If the machine wasn't built for
truth but for survival, it certainly makes you think,
(39:10):
doesn't it? Makes you question everything
just a little bit. Something to ponder.
Thanks for diving deep with us today.