Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
SPEAKER_00 (00:03):
Welcome to the Sage Solutions podcast, where we talk about all things personal growth, personal development, and becoming your best self.
My name is David Sage, and I am a self-worth and confidence coach with Sage Coaching Solutions.
I want to start with a question, but first, I'm so glad that you're here today.
(00:27):
Have you ever been in an argument with someone, maybe a partner, a coworker, or just someone in the comment section on Facebook or YouTube?
And you find yourself thinking, how can they possibly not see the facts?
Like the evidence is right there.
(00:48):
You look at them and they seem like they're just delusional.
Now this might be a little extreme, but we all get caught up in our own perspective.
And sometimes it truly feels that way.
But the uncomfortable truth that I want us to wrestle with today is that a lot of times, they are likely looking at you and thinking the exact same thing.
(01:12):
We like to think that our brains are like cameras.
We think that we just point our eyes at the world, record the footage, and store the data.
And that the way we think is always logical and computational, like a computer or a calculator.
(01:33):
But that's just not how human psychology works.
Your brain isn't a camera.
And it's not a computer.
In a lot of ways, it's more of a painter.
Remember, your perspective of reality is the largest driver of your experience of reality, dramatically changing how you see the world and interpret events.
(01:56):
So your brain is often this painter, and it's almost constantly painting over the world, over reality, in little cognitive shortcuts or biases that helped us survive as a species.
(02:17):
And not just helped you survive, but made you feel safer, smarter, and more consistent than you actually are.
This episode has been a long time coming.
Today we are doing a deep dive into the operating system of your mind.
We are talking about cognitive biases.
(02:40):
But before we get into it, our goal with this podcast is to share free, helpful tools with you and anyone you know who is looking to improve their life.
So take action, subscribe, and share this podcast with them.
Okay, now don't tune out.
(03:02):
I know this might sound like a boring Psychology 101 lecture, but I've come to believe that understanding and gaining awareness of our cognitive biases is one of the best possible ways to try and view things as objectively as you can.
Awareness helps you confront them.
(03:25):
Some people may say ignorance is bliss, but in this situation I disagree.
If you don't know the glitches in your own software, you can't run your own programs effectively.
Now there are too many cognitive biases for me to put all of them in one episode, so I'm going to be doing a part two to this episode.
(03:45):
But today we're going to cover ten major cognitive biases.
Biases that we all have built into us as human beings.
We're going to talk about why you probably think you're a better driver than you actually are.
Why you think the world is getting worse when it might not be.
(04:06):
And why the first price you see on a car determines everything.
So take a deep breath and let's do our best to get objective.
Part one.
Let's start with the biases that are designed to protect your ego and your worldview.
(04:29):
I'm grouping these up and calling them the ego protection squad to make it easier to remember.
The captain of this squad is the one you've likely heard of: confirmation bias.
Now, you know how this works.
You have a belief.
Let's say you believe that, I don't know, waking up at 4 a.m. is the only way to be successful.
(04:50):
Because of confirmation bias, you will subconsciously scour the internet and your daily life for any and all information that supports that belief.
It also means that you're much more likely to ignore any information that goes counter to that belief.
(05:11):
So let's say somebody then challenges you and says, no, you don't have to get up at 4 a.m., and provides a counterexample of somebody who's very successful that doesn't.
You decide to scour the internet to see what it really says.
But because of the confirmation bias, as you're doing your research, you're going to subconsciously pick out and internalize every article that supports what you already believe.
(05:35):
And at the same time, you'll mostly ignore the hundreds of articles about the importance of sleep, or examples of very effective CEOs who wake up at eight.
(05:55):
This automatic filtering of reality, just to prove yourself right, is something you're not even doing consciously.
That is confirmation bias.
There's also a specific flavor of confirmation bias called the myside bias.
While confirmation bias is a bit more general, the myside bias is specifically about how we evaluate arguments.
(06:17):
There's a scientific consensus of studies showing that we evaluate information differently because of the myside bias, especially when it comes to politics.
Now I'm not looking to make a political statement here, and I'm not taking either side.
(06:38):
I'm just using politics as an example because it's highly emotionally charged and very polarizing.
Based on this scientific consensus, if I put a study in front of you that supports your political belief, you will likely accept it immediately, not questioning inconsistencies or errors or anything that could make it not true.
(07:01):
We see this all the time, where smart, analytical people will then compliment the positive sides, highlighting or saying that this study has a solid sample size and great methodology, overlooking any flaws in the study because it confirms what they already believe.
(07:21):
But if I put a study in front of you that contradicts your political view, you're likely to reject it outright and say, well, there must be something wrong with it, and then start hyperfixating on the issues and trying to find problems with the study.
We then see smart, analytical people do the exact opposite.
(07:43):
And they'll start tearing it apart, saying, oh, the sample size is too small, and the researchers were biased.
Even though that similar study earlier that confirmed their view may have had an identical sample size and methodology.
(08:04):
Because of confirmation bias and myside bias, we don't naturally evaluate data based on the data.
We evaluate it based on whose side it helps.
And in comes another cognitive bias that's basically holding hands with these other two.
It's called the desirability bias.
This is simply the tendency to believe something is true just because you want it to be true.
(08:25):
Think about a relationship that is clearly failing.
All the red flags are there.
But you want the relationship to work.
You want it so badly that your brain actually filters out the red flags.
You don't even see them.
You predict a positive outcome because the negative outcome is too painful to look at and you just don't want to believe it.
(08:46):
If we think back to the last week of our lives, I'm sure we can find some places where any or all of these cognitive biases apply.
Where did you accept a fact just because it felt good?
That's desirability bias at work.
It's comfortable.
(09:07):
It keeps us stuck.
And many times these compound, because we often believe what we want to believe.
And once we've identified that as our side, all three of these biases are at play at the same time, making it very hard to change a belief, or even just rethink one when new information is provided.
(09:30):
This group of biases has an insidious spiral effect, causing us to be less open to new things and restricting our learning to what feels comfortable and what we already believe.
(09:51):
By being aware of these things, we can consciously shift our perspective and try and be more open to contradicting viewpoints.
This allows us to become much more well-balanced, to think in more shades of gray, to be lifelong learners, to embrace wisdom and wonder through curiosity and critical thinking.
(10:12):
Before I get into the next cluster of cognitive biases, there's one that kind of falls in between them and provides a little bit of a shade of gray, with some added complexity, by showing that multiple things can be true at the same time, and sometimes those things are contradictory.
And with that, we have the spotlight effect.
(10:36):
The spotlight effect is a cognitive bias where people overestimate how much others notice their appearance or behavior.
This is driven by a form of egocentrism.
People place too much weight on their own perspective, which makes them feel like they're under a constant spotlight, when they're often not even the focus of others' attention.
(10:59):
This really arises from the overarching egocentric bias, which is basically that we are all caught up in our own perspective about our own life.
It makes us naturally imagine things from that perspective, and imagine that people would notice the same things about ourselves that we do.
(11:20):
Now this can go one of two ways.
The most common way is actually a major driver of self-consciousness and insecurity.
It's when we hyperfixate on one little part of our hair being wrong, or some specific part of our body that we really don't like, when in reality most other people don't even notice it.
(11:42):
But we assume everyone does, just like we do, because if we see it, how could they not?
As a confidence coach, I think a lot about these biases and how to mitigate them while still teaching people how to have true confidence.
(12:04):
True confidence is not about being amazing at everything.
True confidence is a combination of confidence and humility, of self-awareness, self-improvement, and self-acceptance, of self-compassion and social awareness.
(12:25):
We're trying to build the powerful effects of being confident and happy with who you are.
We're trying to fill up your inner cup and help you realize that you are enough and that you do deserve to feel good and have confidence in who you are.
But that doesn't mean that you're amazing at everything and that everyone thinks you're the shit.
(12:48):
Excuse my language.
But you're just one person.
I'm just one person.
That's why we need to balance it with humility.
And these competing yet coexisting cognitive biases can drive us to be both overconfident in some areas and insecure in others.
(13:10):
Most of the time that you're hyperfixating on something, it's really just the spotlight effect in effect.
And at the same time, the spotlight effect goes the other way too.
We often walk into a room and assume that everyone is turning their heads to look.
(13:30):
We're not that important, and that's okay.
This next group of cognitive biases I'm gonna call the ego enhancement squad.
These cognitive biases can be the drivers of big egos, arrogance, and overconfidence.
(13:50):
I mean, after all, I'm the smartest guy in the room.
I also happen to be in a room with just me right now.
These biases are where we overestimate our own competence.
We're gonna start with something called the objectivity illusion, also known as the objectivity bias.
(14:12):
This is one of my favorite biases because it's one of the funniest.
It is the belief that you are less biased than other people.
I've also heard it referred to as the "I'm not biased" bias.
It often leads to a belief that you perceive reality objectively and without bias, and that anyone who disagrees with you must be biased, ignorant, or irrational.
(14:36):
It's the "I'm the only sane person here" feeling.
Not that that's never true, but are you really that special?
If you walk around thinking that everyone else is crazy, you might be suffering from the objectivity bias.
(14:57):
This leads us right into the heavy hitter, the Dunning-Kruger effect.
Now you may have seen the memes, but let's really define it.
Dunning-Kruger isn't just about dumb people who think that they're smart.
It's that when you have very low competence in a specific area, you lack the ability to recognize how bad you are at it.
(15:19):
The Dunning-Kruger effect shows that people with no competence also have no confidence in their ability to do that thing.
But as the graph progresses and somebody gains just a little bit of knowledge and competence, enough that they feel like they know significantly more than they did before, their confidence in their own knowledge and ability skyrockets well beyond where their actual competence is.
(15:41):
This can often lead very dumb people who know a little about something to think that they're incredibly intelligent and competent at it, hence all the memes.
But as the chart goes on and your knowledge increases, you have a much better understanding of the complexity of the thing, of the skill that it actually takes.
(16:06):
So your competence increases, but your confidence actually decreases dramatically.
You now understand how much you don't know.
And then from that point, your competence and confidence slowly grow at a similar rate.
(16:28):
But the Dunning-Kruger effect does show that at stage two you are overconfident for your competence, and at stage three you tend to be underconfident for your competence.
When you go from knowing nothing to knowing a little, you feel like you know so much more, and that confidence spikes.
(16:52):
It's when you have that very low competence in a specific area that you lack the ability to recognize how bad you still are at it.
It's the guy who reads two articles on macroeconomics and thinks he can fix the national debt.
Yet oftentimes that same guy is struggling with his own finances, and he doesn't know enough to know how much he doesn't know.
(17:13):
As you actually get smarter at something, your confidence usually drops because you realize, holy crap, this is way more complicated than I thought.
If you are 100% certain about a complex topic, you should probably check yourself.
(17:35):
You might be on the peak of what people call Mount Stupid of the Dunning-Kruger effect.
That's what they call the early spike of confidence in the Dunning-Kruger chart.
So this leads directly into the overconfidence bias and its sub-bias, the illusory superiority effect.
(17:56):
The overconfidence bias is the tendency to overestimate your own abilities, your knowledge, or your control, leading to a skewed perception of your own skills and your performance relative to reality.
Have you ever done trivia and had something pop into your head that you then decided was the answer, and you argued that you knew it with absolute certainty?
(18:20):
You heard something about it once, one time, but you were certain, and then it ended up being wrong.
That's the overconfidence bias at play.
It's a belief that we're better or smarter or more knowledgeable than we actually are, and it's usually characterized by overestimation, overprecision, and finally overplacement, which leads us to the illusory superiority effect.
(18:45):
Here's a classic statistic.
If you ask a room full of people, are you an above-average driver?
About eighty to ninety percent of people will raise their hands.
(19:05):
Now, mathematically, that's impossible.
We can't all be above average; that's literally not how averages work.
And we do this with everything.
We think we are more ethical, smarter, and more logical than the average person.
Now we may not think that we are the smartest, but we've got to be at least above average, right?
(19:26):
This illusion of superiority prevents us from asking for help.
It prevents us from learning.
Because why would we learn if we're already better than everyone else, or at least above average?
And not only that, I'm more objective than the average person, too.
(19:49):
But by understanding your cognitive biases and becoming more aware of them, including the objectivity bias, you may actually become more objective.
But you will never, I repeat never, be fully objective.
These biases are hardwired.
(20:09):
That is something that we all just have to accept.
It's reality, it's human nature.
So ask yourself this.
In what area of your life are you coasting because you think you're naturally gifted?
When maybe you actually need to put in the work.
Alright, we're about halfway through.
(20:29):
Let's move on to the cognitive biases of fear and the past.
We've already covered the ones surrounding ego and arrogance.
Now it's time to talk about fear.
While the human brain might want to protect your ego, it is not naturally designed to make you happy.
(20:50):
It was designed to keep you alive.
And because of that, we have a massive negativity bias.
Rick Hanson, a highly respected psychologist, says that the brain is Velcro for bad experiences and Teflon for good ones.
You could have ten people compliment your outfit today, and just one person say, man, you look tired.
(21:13):
And what are you going to think about when you're trying to fall asleep tonight?
Most likely it's that one negative comment.
We give far more weight to negative events and experiences than positive ones.
(21:35):
Evolutionarily, this makes sense.
Ignoring a sunset is fine.
Ignoring a tiger is fatal.
But in the modern world, this bias leads to anxiety and stress.
Studies have shown that, on average, we will tell about thirty-three different people about a negative event, and maybe three about a positive event of similar weight.
(21:57):
There's also data showing that we give twice the amount of attention and weight to negative news compared to neutral, and only half the amount of weight to positive news.
(22:18):
Compound this with the fact that 80% of news stories are negative in nature, because they know these cognitive biases, and you can see how this spirals out of control.
Now this feeds into a related but separate bias called the declinism bias.
This is the belief that the past was better than the present and that the future is going to be worse.
(22:39):
Phrases like: kids these days.
Ugh, music used to be better.
Society is collapsing.
We've believed society is collapsing for 2,000 years.
The Romans wrote about how the youth were ruining society.
Declinism happens because we forget the pain of the past and we obsess over the problems of the present.
(23:01):
This also causes us to project a linear increase in the problems of the present into the future.
In most cases, when you look at the data, human well-being and quality of life have only gotten better over time.
This is the whole premise behind the book The Rational Optimist.
(23:24):
This bias stops us from seeing the opportunities right in front of us because we're too busy looking in the rearview mirror and worrying about the future.
Another closely related bias is the conservatism bias.
Now I don't mean politics here.
In psychology, the conservatism bias is the tendency to insufficiently revise our beliefs when presented with new evidence.
(23:47):
It's a form of mental inertia.
Let's say you read one article saying that a specific diet is bad.
Then three new high-quality studies come out showing that it's actually quite healthy.
The conservatism bias is the drag that makes you say, eh, I'm still not sure.
(24:10):
We're gonna stick with the old information, because new information feels risky.
This causes us to subconsciously overweigh prior information and underweigh new evidence.
It keeps us living in the past, running old software on this new computer.
And when you put them all together, it leaves us living in fear.
(24:34):
Alright, we're coming down the home stretch here.
I have two more for you, and these are huge for decision-making.
First, we have the anchoring bias.
Somewhat similar to the conservatism bias, it's the tendency to rely too heavily on the very first piece of information you receive, or the anchor.
(24:56):
You walk into a store, you see a jacket for $500, and you think, that is insane.
Then you see a second jacket for $200, and you think, wow, that's a deal.
You only think $200 is a deal because you were anchored by that first $500 price tag.
If you had walked in and the first jacket you saw was $50, you'd think the $200 jacket was a rip-off.
(25:19):
We often do this in negotiations, in salary talks, even in how we judge people.
First impressions are essentially character anchors.
If someone is rude to you the first time you meet, that is the anchor.
They then have to work ten times as hard to move you away from that initial data point.
(25:40):
And finally, I want to talk about what seems like a contradictory bias, the pessimism aversion bias.
Now this sounds like the opposite of the negativity bias, and also kind of the declinism bias.
But it's different.
The negativity bias is noticing threats and giving extra weight to negative information.
(26:03):
Pessimism aversion, sometimes called the ostrich effect, is when we actively avoid looking at information that might make us feel pessimistic.
Or when we hear something debilitatingly negative, in order to protect ourselves from the pain of it, especially when we have no control over it, we avoid that negative information, stick our head in the sand, and pretend it's not real.
(26:26):
An example would be hearing that we all might die from something like global warming or the AI alignment problem.
Our gut reaction is to reject those claims outright and stick our head in the sand.
(26:48):
But the ostrich effect is not just for those incredibly overwhelming examples, but also for the daily overwhelming ones.
This is not checking your bank account balance because you know it's low.
It's not going to the doctor to check out that weird pain because you're afraid of the diagnosis.
(27:09):
It's the bias that makes us think that if we don't see the bad news, it doesn't exist.
But it's time to face the brutal truth: the facts.
You cannot fix what you refuse to be aware of.
Okay, so I admit there was a lot there.
(27:30):
We basically just ran a diagnostic on your brain's biases, or at least some of them.
We talked about how we filter for what we want to hear: confirmation and desirability biases.
We talked about how we think we are smarter than we are: the Dunning-Kruger effect and the overconfidence bias.
(27:52):
We talked about how we cling to the past: declinism and conservatism.
We talked about how we give extra weight to negative information: the negativity bias.
We talked about the spotlight effect and how it amplifies some of our biggest insecurities.
(28:15):
And finally, we talked about how we consciously avoid negative information when it feels overwhelming, with the ostrich effect, and how we get stuck on the first impression with anchoring.
So what do we do with this?
How do we actually use it?
You cannot just flip a switch and turn these off.
You are human.
(28:36):
But you can catch them.
For the next week, I want you to play a game.
I'm gonna play this game with you.
I want you to try and catch yourself in just one of these.
When you're doomscrolling and thinking that the world is ending, say to yourself, ah, this is declinism and the negativity bias.
(28:56):
When you're absolutely sure you were right in an argument, ask yourself, am I suffering from confirmation bias, desirability bias, or the objectivity bias?
When you see a price tag, ask, hold on, am I being anchored?
(29:19):
The moment you name the bias and understand it, you take away its power.
The awareness of these biases helps us combat them so that we can be more objective and more aware of the forces shaping our behavior.
So that's what I have to offer you today.
Awareness.
It's not about having a perfect brain.
It's about knowing how to drive the imperfect one that you have.
(29:42):
If you enjoyed this episode, feel free to share it with somebody else.
But try not to use it to call out somebody's bias.
That's not gonna work.
And remember, you are enough.
And you deserve to fill up your inner cup with happiness, true confidence, and resilience.
(30:05):
Thank you for listening to the Sage Solutions podcast.
Your time is valuable, and I'm so glad that you choose to learn and grow here with me.
If you haven't already, don't forget to subscribe so you don't miss out on more Sage advice.
(30:28):
One last thing, the legal language.
This podcast is for educational and informational purposes only.
No coaching-client relationship is formed.
It is not intended as a substitute for the personalized advice of a physician, professional coach, psychotherapist, or other qualified professional.
(30:51):