Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Have you ever made a
really solid decision? You know,
(00:03):
felt great about it, but then the outcome just totally bombed.
Speaker 2 (00:07):
Oh, definitely. Or
the flip side. Right? You make a
questionable call, maybe cut some corners and bam. Pure luck.
Fantastic result.
Speaker 1 (00:13):
Exactly. We've all
been there, scratching our heads, wondering, was that me? Or just random chance?
Speaker 2 (00:19):
It's confusing.
Speaker 1 (00:21):
Well, today we're
diving into a really powerful
book that offers a whole new way to look at this. It's Annie Duke's Thinking in Bets: Making Smarter Decisions When You Don't Have All the Facts. This deep dive is basically your shortcut to understanding some really smart strategies for making better choices, even when things are, well, uncertain.
Speaker 2 (00:41):
Yeah. And our mission
today really is to unpack the
core ideas from this book. We want to explore that big difference between skill and luck, understand how our own brains sometimes lead us astray, and then give you some practical tools. Tools to become more objective and think more clearly, whether it's personal stuff or at work.
Speaker 1 (01:00):
Absolutely. Yeah.
Consider this your go-to guide for something that affects every single choice you make. We've got some surprising facts, really insightful bits.
Speaker 2 (01:09):
And hopefully a
little humor too.
Speaker 1 (01:10):
Hopefully. Okay.
Let's unpack this.
Speaker 2 (01:13):
Let's do it.
Speaker 1 (01:13):
So Annie Duke starts
with a central metaphor and it
really clicked for me. She says life is way more like poker than
chess.
Speaker 2 (01:20):
Mhmm. Explain that a
bit.
Speaker 1 (01:22):
Well, in chess you
see everything, right? All the
pieces, the whole board. Luck is minimal. If you lose, it's pretty much down to your decisions.
Speaker 2 (01:32):
Very clear cause and
effect.
Speaker 1 (01:34):
Totally. But poker,
that's like real life. You're
making decisions with hidden cards, hidden information. There's risk involved, and luck plays a huge part. It's messy.
Speaker 2 (01:45):
It really is. And the
key thing here, Thinking in Bets isn't really about gambling, not in the Vegas sense anyway.
Speaker 1 (01:50):
Right. It's not a
poker tutorial.
Speaker 2 (01:51):
No. It's about taking
those lessons from high-stakes poker, where people have to make good decisions under pressure with incomplete info, and applying them to our everyday choices. The book's core idea is that all our decisions are basically bets. Bets on an uncertain future. And the trick isn't to get rid of uncertainty. You can't.
Impossible. It's about getting better at evaluating these bets.
(02:13):
And crucially, to separate how good your decision process was
from how good the outcome turnedout to be.
Speaker 1 (02:20):
Okay. So, if life is
poker, what are the big themes,
the main takeaways?
Speaker 2 (02:24):
Well, the book really
digs into some important stuff,
like that crucial difference between luck and skill in how things turn out.
Speaker 1 (02:31):
Mhmm.
Speaker 2 (02:32):
And how we tend to
mush outcomes and decision
quality together. Duke calls it resulting. We'll get into that. Okay. We'll also look at how our brains are kinda built for speed, for efficiency, not always accuracy, which leads to all these biases.
Speaker 1 (02:45):
Mental shortcuts.
Speaker 2 (02:46):
Exactly. And also the
surprising strength in just
admitting, you know what? I'm not sure. Plus why it's vital to update our beliefs, learn from experience rationally, often needing help from others, from truth-seeking groups, and even this cool idea called mental time travel.
Speaker 1 (03:02):
Mental time travel.
Intriguing.
Speaker 2 (03:04):
Yeah. But the big
takeaway, the main point is
this. We can't control luck. You just can't. But we absolutely can control the quality of our decisions.
Right. And focusing on that, improving that process over time, it compounds. It leads to really significant positive changes. Makes you more effective, more resilient when life throws those curveballs.
Speaker 1 (03:24):
That resulting idea
you mentioned that really stuck
with me. And the book uses an amazing example: the Super Bowl XLIX play call. Pete Carroll, Seahawks coach.
Speaker 2 (03:35):
Oh, yeah. The
interception, painful.
Speaker 1 (03:37):
Final seconds,
they're on the one yard line.
They have Marshawn Lynch, Beast Mode, ready to punch it in.
Everyone expects a run.
Speaker 2 (03:43):
Everyone.
Speaker 1 (03:44):
But Carroll calls a
pass, it gets intercepted, they lose the Super Bowl, and instantly it's labeled the worst play call in NFL history.
Speaker 2 (03:51):
People went nuts.
Speaker 1 (03:52):
And Carroll was
absolutely a victim of resulting
there. What people missed in all the yelling was the data. Analysis showed that pass was actually statistically defensible.
Speaker 2 (04:01):
Yeah. Passes from the
one yard line had a super low interception rate, like 2% over the fifteen seasons before that. Zero that specific season for the Seahawks from that spot. The decision wasn't necessarily bad based on the odds. Uh-huh.
It was just a terrible result, and that bad outcome completely overshadowed what might have been a pretty reasonable, maybe
(04:22):
even smart probabilistic decision.
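A rough way to see that in numbers: a minimal expected-value sketch in Python. The touchdown probabilities below are hypothetical placeholders; only the roughly 2% interception rate reflects the stat just cited.

```python
# Toy expected-value comparison for the goal-line call.
# p_td_run and p_td_pass are hypothetical placeholders; only the
# ~2% interception rate comes from the stat mentioned above.

TD_POINTS = 7.0  # touchdown plus extra point

p_td_run = 0.55   # assumed: a run scores a bit over half the time
p_td_pass = 0.50  # assumed: a pass scores slightly less often
p_int = 0.02      # ~2% of passes from the one yard line intercepted

ev_run = p_td_run * TD_POINTS
ev_pass = p_td_pass * TD_POINTS  # interceptions and incompletions score
                                 # zero and are already folded in here

print(f"EV(run)  ~ {ev_run:.2f} points")   # ~3.85
print(f"EV(pass) ~ {ev_pass:.2f} points")  # ~3.50
# The resulting lesson: the 2% disaster actually happening doesn't
# retroactively turn a close probabilistic call into a terrible one.
```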
Speaker 1 (04:25):
That's such a stark
example. But it makes you think,
how often do we do that in our own lives? We look back at our best decisions, our worst ones. And chances are, we're linking them straight to the outcome, right? Good result equals good decision.
Bad result equals bad decision. Not looking at the process behind it.
Speaker 2 (04:42):
Exactly. And breaking
that habit starts with
understanding something fundamental. Our bets, our decisions are only ever as good as the beliefs underpinning them.
Speaker 1 (04:51):
Okay.
Speaker 2 (04:51):
That's why we often
get it wrong. Remember that old
WKRP in Cincinnati episode with the turkeys?
Speaker 1 (04:56):
Ah, the turkey drop?
Hilarious. "As God is my witness, I thought turkeys could fly."
Speaker 2 (05:01):
Precisely. His belief
was completely off. And it led
to, well, a spectacularly feathered disaster of a bet.
Speaker 1 (05:08):
That's funny, but
tragic. And it highlights how
easily we just absorb stuff without thinking.
Speaker 2 (05:15):
Yeah.
Speaker 1 (05:15):
I remember reading
Daniel Gilbert's work saying,
we're basically credulous creatures. We find it easy to believe things, hard to doubt them.
Speaker 2 (05:22):
It's built in. It's
an evolutionary shortcut,
really. Our brains evolved for efficiency, for survival.
Speaker 1 (05:29):
Like better safe than
sorry.
Speaker 2 (05:30):
Exactly. Better to
make a type I error, thinking that rustle in the bushes is a predator when it's just wind, than a type II error, missing the actual predator.
Speaker 1 (05:38):
Right. That could be
fatal.
Speaker 2 (05:40):
So we're wired to
accept information pretty
quickly, especially abstract stuff, without putting it through rigorous checks. Quick judgments were often more useful for survival than perfect accuracy.
Speaker 1 (05:51):
And this isn't just
about turkeys or ancient
history. It has real consequences now. Duke mentions how poker players thought suited connectors were great hands for ages, until they actually tracked the data and found they were often losers. Or think about the whole low-fat diet craze back in the day.
Speaker 2 (06:07):
Oh yeah, remember
that. Everything was low fat.
Speaker 1 (06:09):
Right. But we
replaced fat with carbs and
sugar, partly based on research funded by the sugar industry, it turns out. And that flawed belief likely contributed to
rising obesity rates.
Speaker 2 (06:21):
Wow. That's a big
one. And then there's our ego
getting involved. We get into motivated reasoning and self-serving bias.
Speaker 1 (06:29):
Meaning?
Speaker 2 (06:29):
Meaning we have this
predictable habit. Good outcomes? That was my skill. Bad outcomes? Oh, that was just bad luck. Or someone else's fault.
Speaker 1 (06:40):
Right. Like the poker
player Phil Hellmuth saying, if it weren't for luck, I'd win every one.
Speaker 2 (06:44):
Classic example. We
all do it, maybe just quieter.
Or think about politicians like Chris Christie, quickly blaming Bridgegate investigations when things looked bad, then taking full credit when job numbers went up. It's a very human pattern.
Speaker 1 (06:56):
What's wild is that
Duke points out research, I
think by Dan Kahan, showing that being smarter, like being good with numbers, can actually make this worse.
Speaker 2 (07:04):
How so?
Speaker 1 (07:05):
Because highly
numerate people are actually
better at spinning data to fit the beliefs they already hold. So just being intelligent isn't enough to beat bias.
Speaker 2 (07:13):
Wow, that's slightly
depressing. So if raw brainpower
isn't the answer, what can we do? How do we fight these built-in biases?
Speaker 1 (07:22):
Well, one really
powerful strategy the book
suggests is what Duke calls the buddy system. Basically, harnessing the power of truth-seeking groups.
Speaker 2 (07:30):
Okay, like getting
other people involved.
Speaker 1 (07:32):
Yeah, exactly. But
not just any group. The best
ones, she says, encourage accuracy, they hold each other accountable, and crucially, they value diversity of ideas. Not just an echo chamber where everyone agrees.
Speaker 2 (07:44):
Right. People who
will actually challenge you
constructively.
Speaker 1 (07:47):
And another cool tool
is this idea of asking 'wanna
bet?' Like imagine someone challenges you to put actual money down on a belief you just stated really confidently.
Speaker 2 (07:56):
Ah, okay. So just the
thought of having to bet makes
you pause.
Speaker 1 (08:00):
Yeah. It forces you
to kind of step back and think,
'Okay, wait. How sure am I? What's the evidence?' It pushes you to vet your info more honestly and maybe be more open to changing your mind.
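To make that concrete, here's a minimal Python sketch of the "wanna bet?" check, turning a stated confidence level into the break-even payout you should logically accept. The function name and the example confidence levels are just illustrative.

```python
def fair_odds(p: float) -> float:
    """Break-even payout odds for a belief held with probability p.

    Risking $1, a bet paying more than (1 - p) / p dollars has positive
    expected value for someone genuinely p-confident:
    p * payout - (1 - p) * 1 > 0  =>  payout > (1 - p) / p.
    """
    if not 0.0 < p < 1.0:
        raise ValueError("confidence must be strictly between 0 and 1")
    return (1.0 - p) / p

for confidence in (0.95, 0.70, 0.50):
    print(f"{confidence:.0%} sure -> break even at {fair_odds(confidence):.2f}-to-1")

# If you claim 95% but wouldn't take a bet paying only 0.05-to-1,
# your real confidence is lower than the number you just stated.
```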
Speaker 2 (08:10):
That connects to some
principles the sociologist
Robert Merton identified for effective scientific communities. Duke applies them more broadly. They're called the CUDOS norms. CUDOS? Yeah.
It's an acronym. C is for communism, meaning data and information belong to the group, shared openly, even the inconvenient bits. U is for universalism: judge the info on
(08:31):
its merits, not who said it. Doesn't matter if it's the CEO or the intern, is the data solid?
Speaker 1 (08:36):
Okay.
Speaker 2 (08:37):
D is for
disinterestedness. This is huge.
It means checking your ego, your biases, your conflicts of
interest at the door.
Speaker 1 (08:43):
Like the sugar
industry funding fat research.
Speaker 2 (08:45):
Exactly. A related
technique is outcome blindness.
If you're asking for advice on a past decision, don't tell them the outcome. That stops them from just rationalizing what happened and forces objective thinking.
Speaker 1 (08:56):
Oh, that's clever.
Avoids hindsight bias.
Speaker 2 (08:59):
Right. And OS is for
organized skepticism. Basically,
approach new ideas by trying to poke holes in them. Ask, why might this not be true? Instead of just looking for confirmation, embrace saying I'm not sure.
Speaker 1 (09:10):
Man, upholding all of those, especially disinterestedness, sounds incredibly hard when your own ego or reputation is on the line.
Speaker 2 (09:17):
It is hard, and it
requires everyone to be on
board. Duke uses that example of David Letterman kind of grilling Lauren Conrad about her reality show drama.
Speaker 1 (09:25):
Yeah. She was not
ready for that level of
skeptical inquiry.
Speaker 2 (09:30):
Right. It shows that
truth-seeking needs buy-in. You can't just spring it on people. Everyone has to agree to play by those rules.
Speaker 1 (09:36):
So the big shift here
is changing what feels good.
Speaker 2 (09:39):
Right.
Speaker 1 (09:39):
Right. Instead of
feeling good because you were
right, you start feeling good about being, like, a good learner, admitting mistakes, giving credit.
Speaker 2 (09:47):
Exactly. Like the
poker pro Phil Ivey who
apparently analyzes his winning hands just as hard as his losing ones, looking for ways he could have played even better. It's about continuous improvement, not just proving yourself right.
Speaker 1 (10:00):
Okay. This next idea
sounds really cool. Mental time
travel. Using it to fight our present bias. Like, you know, Night Jerry wants to stay up late watching TV, totally ignoring that Morning Jerry is gonna pay for it.
Speaker 2 (10:11):
Yes. The classic
battle. Mental time travel is
about bringing your past and future selves into your present decisions. How can they help present you make better choices?
Speaker 1 (10:21):
How does that work in
practice?
Speaker 2 (10:22):
One powerful tool is
using Ulysses contracts.
Remember the story of Odysseus tying himself to the mast?
Speaker 1 (10:28):
To resist the siren
song. Yeah.
Speaker 2 (10:30):
Right. So a Ulysses
contract is pre-committing to a rational choice now, to stop your future, more impulsive self from messing things up. Think automatic 401(k) contributions. Or setting a firm budget before you start house hunting. You bind your future self.
Speaker 1 (10:46):
Ah, creating a
decision interrupt for future
you. Clever.
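As a toy illustration, here's what a Ulysses contract might look like in Python. The house-budget scenario, class name, and numbers are all hypothetical; the point is just that the constraint is fixed in advance and can't be loosened in the moment.

```python
class HouseBudget:
    """A toy Ulysses contract: past-you fixes the cap, future-you can't move it."""

    def __init__(self, cap: float):
        self._cap = cap  # pre-committed before the house hunt begins

    def can_bid(self, offer: float) -> bool:
        # The "decision interrupt": every bid is checked against the
        # limit rational past-you set, not present-you's excitement.
        return offer <= self._cap

    def raise_cap(self, new_cap: float) -> None:
        # Staying tied to the mast: the cap can't be loosened mid-bid.
        raise PermissionError("Cap was pre-committed; revisit it only "
                              "after a cooling-off period.")


budget = HouseBudget(cap=450_000)
print(budget.can_bid(430_000))  # True  -- within the contract
print(budget.can_bid(480_000))  # False -- the interrupt fires
```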
Speaker 2 (10:50):
Exactly. Another
maybe simpler technique is the
decision swear jar.
Speaker 1 (10:55):
Okay. Like a regular
swear jar but for bad thinking.
Speaker 2 (10:58):
Sort of. You identify
language or thought patterns
that signal irrationality: using absolutes like always or never or worst day ever, blaming others constantly, making
sweeping judgments about people.
Speaker 1 (11:10):
Things we all do when
we're stressed or emotional.
Speaker 2 (11:12):
Right. And when you
catch yourself using that kind
of language, ding, like putting a dollar in the jar, it's a signal to pause, take a breath, and try to think more rationally
about the situation.
Speaker 1 (11:21):
That's a neat little
mental trick.
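For fun, the decision swear jar is easy to caricature in a few lines of Python. The trigger phrases come straight from the examples above; the pattern list is obviously just a starting point, not anything from the book.

```python
import re

# Flag the absolutist language that signals heat-of-the-moment thinking.
SWEAR_JAR = re.compile(r"\b(always|never|worst\s+\w+\s+ever)\b", re.IGNORECASE)

def jar_check(statement: str) -> bool:
    """Return True if the statement earns a dollar in the jar."""
    return SWEAR_JAR.search(statement) is not None

print(jar_check("You never listen to me"))       # True  -- pause, breathe, rethink
print(jar_check("Worst day ever, honestly"))     # True
print(jar_check("That meeting ran long today"))  # False
```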
Speaker 2 (11:23):
Then there's scenario
planning. Two key types here.
First is backcasting. You imagine a really positive future outcome, say, your project is a massive success, then you work backward. What steps did we take to get here?
Speaker 1 (11:36):
Like Frederick Law Olmsted designing Central Park,
thinking decades ahead.
Speaker 2 (11:40):
Exactly. The flip
side is the premortem. You
imagine the project failed spectacularly, total disaster. Then you work backward. What went wrong?
Why did it fail?
Speaker 1 (11:51):
Okay. So you're
anticipating obstacles before
they happen. But why does focusing on the negative, the potential failure, actually help you succeed? Seems
counterintuitive.
Speaker 2 (12:01):
Research, like
Gabriele Oettingen's, shows that visualizing obstacles actually makes achieving goals more likely, because it forces you to plan for them, to develop strategies. It makes you more prepared.
Speaker 1 (12:13):
So combining
backcasting, the ideal path, with premortems, the potential roadblocks, gives you a much more realistic picture.
Speaker 2 (12:20):
Precisely. It helps
you adjust your goals, build
contingency plans. You're less likely to be blindsided when things inevitably don't go perfectly smoothly.
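A minimal sketch of how you might pair the two exercises for a single goal, with purely hypothetical entries:

```python
goal = "Ship the project by Q3"

# Backcasting: imagine success, then ask what steps got us there.
backcast = [
    "Scoped a realistic MVP in week one",
    "Cut the two lowest-value features early",
    "Held weekly check-ins against the plan",
]

# Premortem: imagine failure, then ask what went wrong.
premortem = [
    "Scope crept after a big customer request",
    "A key engineer left mid-project",
    "Integration testing started too late",
]

print(f"Goal: {goal}")
print("Path to success:", *backcast, sep="\n  - ")
print("Failure modes to pre-empt:", *premortem, sep="\n  - ")
```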
Speaker 1 (12:27):
But we have to be
careful about hindsight bias
creeping in later. Right? Yes. Looking back after something has happened and thinking it was obvious all along.
Speaker 2 (12:34):
Oh, absolutely.
Whether it's a disaster like
that ConAgra grain bin explosion Duke mentions, or predicting elections, once we know the outcome our brains tend to see it as inevitable. We forget all the other ways things could have gone.
Speaker 1 (12:47):
It chops off the
branches of the tree as Duke
puts it. Yeah. Makes it really hard to learn the right lessons.
Speaker 2 (12:53):
Definitely.
Speaker 1 (12:53):
Okay. Let's shift
gears slightly. Like a good book
club, let's talk about some maybe constructive critiques or limitations of the Thinking in Bets approach. Mhmm. What are
some things to keep in mind?
Speaker 2 (13:06):
Good idea. Well,
first, it's not a magic wand and
the book is clear about this. Thinking in Bets won't make you perfectly rational or emotionless. That's impossible.
We're human.
Speaker 1 (13:14):
Yeah.
Speaker 2 (13:14):
It's about a shift, a
movement towards more
objectivity. And that improvement adds up over time, but you'll still make mistakes, still feel emotions.
Speaker 1 (13:23):
Okay. Fair enough.
What else?
Speaker 2 (13:24):
Second, just knowing
about biases isn't enough to fix
them. Our brains are hardwired for those shortcuts, for efficiency. It's like knowing an optical illusion tricks your eyes; you still see the illusion.
Speaker 1 (13:36):
Ah. So it takes
constant conscious effort.
Speaker 2 (13:38):
Exactly. You have to
actively practice these
techniques. Third, there's potential social friction. Trying to be a rigorous truth-seeker can rub people the wrong way if they haven't agreed to that kind of interaction.
Speaker 1 (13:51):
The Letterman effect we talked about.
Speaker 2 (13:52):
Right. You need
communication and buy-in, or it can just seem like you're being difficult or argumentative. Fourth, even smart groups can suffer from drift towards
homogeneity.
Speaker 1 (14:03):
Meaning they start
thinking alike.
Speaker 2 (14:04):
Yeah. Think about
Supreme Court justices often hiring clerks who share their ideology. You lose that viewpoint diversity that's so critical for spotting errors and getting closer to the truth. Mhmm. Echo chambers are dangerous.
Speaker 1 (14:16):
Mhmm.
Speaker 2 (14:17):
And finally, number
five, this stuff requires
consistent work. Changing deeply ingrained mental habits, like blaming luck for failures and taking credit for success. That's hard. It's a discipline, not a quick fix you do once.
Continuous effort is key.
Speaker 1 (14:33):
Those are really
important caveats. So boiling it
all down, what does this mean for you, the listener? Let's distill maybe 10 key insights for daily life.
Speaker 2 (14:41):
Okay. Top 10. Let's
try. One. Life is poker, not
chess.
Accept the uncertainty and luck. Focus on making the best possible decision with the info you have, not just on the outcome.
Speaker 1 (14:52):
Two, avoid resulting.
Judge decisions on the process,
not the result alone. Ask why things happened, good or bad.
Speaker 2 (15:00):
Three, I'm not sure.
It's okay not to know
everything. Admitting uncertainty is the start of real learning and builds credibility.
Speaker 1 (15:08):
Four. Actively vet
your beliefs. Don't just accept
what you hear. Challenge your own assumptions, seek out evidence, look for different perspectives.
Speaker 2 (15:16):
Five. Use the Wanna
Bet challenge. Even if just
mentally, it forces honesty about how confident you really
are.
Speaker 1 (15:22):
Six. Expressing
uncertainty builds trust. Saying
I'm about 70% sure is often more credible than false certainty
and invites helpful input.
Speaker 2 (15:30):
Seven. Actively fight
self-serving bias. Try attributing good results partly to luck and analyze your role in
bad results. It forces learning.
Speaker 1 (15:38):
Eight. Form truth-seeking groups. Even just two or three people committed to accuracy, accountability, and diverse views can make a huge difference. Set ground rules.
Speaker 2 (15:46):
Nine. Practice mental
time travel. Use tools like the
ten-ten-ten rule: how will this feel in ten minutes, ten months, ten years? Or Ulysses contracts to connect with your future self and curb impulsivity.
Speaker 1 (16:00):
And ten.
Speaker 2 (16:01):
10. Use backcasting
and premortems. Plan for success
and failure. Imagine both. Work backward.
It builds robust, realistic plans.
Speaker 1 (16:10):
That's a fantastic
list. Really practical stuff.
Now if someone loved digging into this book and how our minds work, what's a good thematic pairing? Like, if you like this,
you'll love that.
Speaker 2 (16:20):
Oh, absolutely. The
clear recommendation is Thinking, Fast and Slow by Daniel Kahneman.
Speaker 1 (16:24):
The Nobel laureate.
Speaker 2 (16:26):
Exactly. Annie Duke
references his work constantly.
Kahneman's book is really the foundation for understanding System 1, fast intuitive thinking, and System 2, slow deliberate thinking. It dives deep into the psychology behind
all the biases Duke talks about.
Speaker 1 (16:39):
So provides the
scientific bedrock.
Speaker 2 (16:42):
Precisely. If
Thinking in Bets is the practical application, Kahneman gives you the underlying theory.
Highly recommended.
Speaker 1 (16:47):
Great suggestion.
Okay. To wrap things up, you
mentioned a haiku.
Speaker 2 (16:50):
I did. Here's one
trying to capture the essence of
Thinking in Bets. Future's open sky. Our choices steer the long road. Truth lets wisdom fly.
Speaker 1 (17:00):
Nice. Future's open
sky. Truth lets wisdom fly. I
like that. So tying it all together, how does embracing this Thinking in Bets mindset actually help someone live a better life?
What's the real impact?
Speaker 2 (17:13):
Well, I think it
fundamentally shifts your
perspective. You move away from harsh judgment of yourself and others, and away from paralyzing regret. Instead, you move
towards continuous learning.
Speaker 1 (17:24):
Becoming a better
learner.
Speaker 2 (17:25):
Yes. It's not about
achieving perfection because
that's impossible with uncertainty. It's about steadily increasing your chances of good outcomes over the long run. You become more objective about why you succeed or fail, maybe more compassionate when others face bad outcomes despite good
efforts.
Speaker 1 (17:39):
Mhmm.
Speaker 2 (17:39):
You get better
equipped to just navigate life's
inherent messiness. You gain clarity, cut down on self-deception, and start making choices that are truly aligned with what you want long term. It leads to being more informed, more effective, and ultimately, yeah, I think it helps you live
a better, more considered life.
Speaker 1 (17:54):
Wow. What a journey
through decision making,
uncertainty, and, well, thinking like a poker player, but for life. We really hope this deep dive into Thinking in Bets has given you some powerful new tools.
Speaker 2 (18:07):
Yeah. Some things to
chew on. Remember, every
decision really is a bet. But if you understand the game better, you can definitely play it better. Keep asking questions, keep learning from your outcomes, good and bad, and keep making those bets on your future wisely.
Speaker 1 (18:21):
Until next time, keep
thinking deeply about the world
around you, and we'll be here to help you dive in.