Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Why is there so much polarization in the world, and
what does this have to do with the brain, And
what does any of this have to do with how
you picture a cat, or why we respond to certain
cartoons or the British nobleman Lord Gordon, or the Iroquois
Native Americans? And why do you naturally feel that everyone
who disagrees with you is a troll or misinformed? And
(00:29):
if you could just shout loudly enough in all caps
on Twitter, they would see that you're right. Why can't
they see that you know the truth? Welcome to Inner
Cosmos with me, David Eagleman. I'm a neuroscientist and an
author at Stanford University, and I've spent my whole career
(00:52):
studying the intersection between how the brain works and how
we experience life. It hasn't escaped anybody's notice that we
are in a time in which polarization and disagreement are
(01:16):
higher than most of us have seen in our lives
so far, and so in the past decade I've become
very interested in the brain science behind that, behind polarization,
and more generally, how we come to believe our own
political opinions and why we're so certain that everyone else
(01:36):
is wrong, and how if we could just talk to them,
if they could just listen to us, they would see
the light and they would know that we are right
and they were mistaken. Now, I want to set the stage.
Polarization is not a new thing. Although we are in
a polarized era right now, this is far from unique.
(01:58):
Just think about the Civil War in America, where you
had brothers and neighbors taking up arms against one another,
or in the nineteen sixties and seventies, people here held
vastly different opinions about the war in Vietnam and how
to treat the returning soldiers. Or take stuff that's even bigger,
like Nazism in Germany, which was the most advanced country
(02:21):
in Europe. The thing to recognize is that in the
nineteen thirty four elections in Germany, every single seat in
the Reichstag, the German Parliament, was either Communist Party far
left or National Socialist Party far right. Or look more
generally at the whole twentieth century, the Communist Revolution in
China or in the USSR, or the Hutu massacre of
(02:44):
the Tutsi in Rwanda, or the Khmer Rouge in Cambodia,
and on and on. There's nothing new about polarization and
people taking up arms. And what I want to talk
about today is why. So the question on lots of
people's minds, it seems, is: is this because of
social media? I don't think that has much of anything
to do with it. Keep in mind that all the
(03:06):
examples I just named preceded Twitter and Facebook, and those
were much worse than what we have going on currently.
The fact is it doesn't take much to get people
all worked up over different opinions and taking up arms,
and you don't need social media for that. And my
goal today is to explain why it is so easy
(03:27):
for people to get so worked up and believe their truth.
So this is what we're going to explore. Why does
everyone have different opinions? And why does everyone with a
different opinion to yours seem misinformed or obstreperous or like
a troll. So I'll start with a cartoon and a story.
(03:48):
The cartoon was one that I saw recently on Twitter.
It shows a bunch of people walking on a road
and up ahead there's a fork in the road and
off to one side there's a small number of smart,
thoughtful people that are following a winding path marked complex
but right, and on the other side of the fork
(04:09):
there is a very packed group of people and they're
all walking and it's labeled simple but wrong, and this
takes them to a cliff that they eventually tumble off of.
Now I'm going to come back to this cartoon later,
and when I come back, we're going to understand this
in a slightly different way.
Speaker 2 (04:26):
But first I want to turn to this.
Speaker 1 (04:28):
Story, which is a true story. Many years ago, I
got my PhD in neuroscience. I was a second year
graduate student when the new class of first year students
came in and one student really stood out, and I'm
going to call her Tanya.
Speaker 2 (04:43):
She seemed very sweet.
Speaker 1 (04:45):
And what we found out was that Tanya had extraordinary grades,
and she'd come from a very impoverished neighborhood in Chicago,
but she had these incredible grades and these incredible letters
of recommendation and great scores on her GREs. So during
the first month or two of school, she was given
a special award and there was going to be a
(05:05):
banquet for it, and to my surprise, she asked me
if I would be her date to the banquet.
Speaker 2 (05:11):
So I said yes.
Speaker 1 (05:12):
I didn't know her well, but I thought she seemed
very sweet, and so I said yes. So the banquet
was supposed to be on Friday of that week, but
as it turns out, the banquet never happened. Why? It's
because on Tuesday of that week, Tanya was in the
women's restroom with one of the administrators at the school
(05:33):
and they started talking. The administrator said, Wow, Tanya, everything
is so amazing about you and.
Speaker 2 (05:38):
Your grades and your skills.
Speaker 1 (05:39):
I want to know how your school cultivated a thinker
like you. And so Tanya just had some humble answer,
and so this administrator decided she was going to call
the school and find out how they produced somebody like Tanya.
Speaker 2 (05:54):
So she calls to.
Speaker 1 (05:55):
Talk to one of the professors that wrote a letter
of recommendation for Tanya. And so she dials up and
she gets the secretary and she asks to speak to
the professor, and the secretary says who, and the administrator
repeats the name: I'm looking for Professor So-and-so,
and the woman on the other end says, there's nobody
here by that name. So the administrator says, yeah, this
(06:18):
is the professor that wrote this letter, and the secretary says,
I've worked here thirty years and there is nobody here
by this name. So it turned out to be a fake
letter of recommendation. So the administrator calls the second letter
of recommendation, same story. So she calls the third recommender
and gets connected, and it turns out that it was
Tanya's mother's office. And so what quickly became clear is
(06:41):
that Tanya had faked everything: her transcript, her GREs. And
this was by the way back in the nineties. So
she did this by digitally scanning her GREs and then
changing them with early versions of Photoshop, and then reprinting them. Anyway,
one of my colleagues quipped that she should get a
PhD just for the cleverness of her deception. But the
(07:03):
thing that struck me was how blind we all were
to the deception. We were completely fooled by it. So anyway,
the graduate school said to her, WHOA, there's something really
strange here, and you have to come up with an
explanation for this, And Tanya just ran away. But one
of my classmates caught her on the way out at
(07:23):
the door, and she had an excuse for everything, and
she said, I got screwed by this person, and this
person cheated me, and I thought they were writing a
real letter, and I thought the school was accredited, but
they lied to me, so she had an excuse for everything. Now,
obviously the details of this story stuck with me through
the years because I had almost gone on a date
with this girl. Anyway, that was the end of the
(07:45):
Tanya story, or so I thought. A couple of years later,
someone pointed me to an article from the Yale University newspaper,
and that's how we learned that Tanya had left our
school and gone to Yale next and faked her way
into graduate school there, and Yale had caught her. This
(08:05):
was just like the first time, but Yale was mad
because they had been paying her a stipend, and so
they put her in jail. And I pictured the girl
that I almost went on a date with sitting on
a cement bench in jail, and the next newspaper article
I found showed that in jail, Tanya had bitten two guards,
(08:26):
and at some point she was released from jail, and
then we heard nothing about her after that until two
years later. Because Tanya went home to where she lived
in Chicago, and she and her mother decided to do
a big drug deal with two men who turned out
to be undercover agents, and they got caught and she
(08:48):
was going to be sentenced for a long time, and
so she and her cousin came up with an idea.
So they went out and they found a woman who
was a drug addict, who was approximately her size
and looked a bit like her. And they said, hey,
we're going to give you free drugs if you come
to the dentist. And she said she was doing an
(09:10):
insurance scam, and she said, come to the dentist and
say that your name is my name. And she had
the woman wear gloves to the dentist's office so there
wouldn't be any fingerprints, and she accompanied her and she
signed the paperwork for her. And then after they had
the imprints, they brought this woman home and they smashed
her on the head with a brick and they injected
(09:31):
her with a bunch of insulin to make her pass out.
That's all they had access to and that's why they
used the insulin. And their plan was to kill this
woman and burn the body so that the dental imprints
would be found and they would conclude that Tanya was
dead and then she wouldn't have to go to jail.
As it turns out, this poor woman eventually came to
(09:53):
in the basement of Tanya's cousin's house and she managed
to scramble out of the window and she ran
across the street to a Kentucky Fried Chicken restaurant. She
screamed for help, and Tanya realized that the woman had
just escaped, and she ran into the Kentucky Fried Chicken
right after her and started screaming that the woman had
stolen her money. And everything was very confusing, but the
(10:14):
police showed up and they couldn't figure out what was happening,
so they arrested everyone.
Speaker 2 (10:19):
Here was this woman with blood.
Speaker 1 (10:20):
All over her, so they just took everyone into the station,
and eventually the whole thing unraveled, and everyone involved
in this case was jaw-dropped as this story unfolded, at
the ease and creativity with which Tanya thought of a
plan that went, okay, how about I just kill this
(10:42):
woman and then burn the body. And when you burn
the body, the dental imprints are the things that last.
And so because this woman had gone to the dentist
under Tanya's name, then the world would conclude I'm dead
and then I won't go to jail, and I was thinking, Wow,
I almost went on a date with this woman. So
this was one of the moments in my life when
(11:02):
I was struck by how different people can be on
the inside and how little insight we have into the
cosmos of someone else's brain and mind. And happily this
was positioned at the very beginning of my neuroscience career,
and it influenced what I've been studying since. It has
(11:23):
always struck me as fascinating the differences between people. Everyone
is very different on the inside, sometimes much more than
we expect. Now it turns out my father was a
forensic psychiatrist and he was involved in most of the
big mass murder cases in New Mexico where I grew up.
One of them was a guy named William Wayne Gilbert
(11:45):
who had killed three people, cold-blooded murder, and my
father became the psychiatrist in that case. Now, I was
a child and we went to some social event with
my father, and I remember somebody saying to my father,
Gilbert should not get the death penalty because presumably he
feels terrible for what he's done. Presumably he feels deep
(12:09):
regret for having killed three people. And as a kid,
I remember totally feeling like I agreed with that, but
I remembered my father's surprise at the statement, because my
father had just spent hours in deposition with Gilbert, and
he explained to this man genuinely and professionally that it
(12:29):
simply wasn't the case that Gilbert had regret, because when
William Wayne Gilbert would think about the idea of going
to murder somebody, he said, he had the same level
of excitement that he did as a child on the
night before Christmas. That's what it felt like for him
when he was thinking of killing someone. And so my
(12:52):
father's point to this man and to me when I
was eight years old, is you can't actually stick yourself
in other people's shoes, as much as you'd like to.
You think everybody's just like me, especially when you're a child,
but in fact people can be quite different on the inside.
And it turns out that the governor of New Mexico
at the time commuted William Wayne Gilbert's sentence since he
(13:15):
was sentenced to death. So Gilbert was not going to
die now, and to show his gratitude, he managed to
smuggle a pistol into the prison and led one of
the most stunning prison breaks from a maximum security penitentiary
where he let out several other murderers. And so for
a few months it was very tense in New Mexico
(13:36):
because there was this group of mass murderers on the loose.
They were finally found again, but only after this very
scary month or two where they were hiding out and
taking hostages. So all this stuff really got me thinking
from a young age about the differences between people. So
I'll just mention one more anecdote here: my father deposed
a guy who was on trial. This guy had gone
(13:59):
into a Western Union office and asked the clerk to use
the phone, and the clerk had said, sorry, but the
phone is only for the back office people and not
for customers. So this guy jumped the counter and beat
the clerk almost to death. And the interesting part is
that he said to my father during the deposition, but
anybody would have done the same thing in that situation, right,
(14:22):
And he meant it. He was being genuine because we
all have an internal model of the world, of what constitutes
appropriate behavior in it, and we assume that everyone
else's model is the same as ours, and this guy
genuinely could not imagine anyone having a different reaction in
(14:43):
that circumstance. Now, when we think about the differences between people,
we're used to thinking about extreme cases like Tanya or
William Wayne Gilbert or this guy in the Western Union office.
These people presumably have psychopathy. Psychopaths make up about one
percent of the population. These are people who simply don't
(15:05):
care about your feelings. You're just an obstacle that they're
trying to get around to get what they want. But
this idea that other people can be different from you
on the inside can be generalized. So take something like schizophrenia.
You see a man on the street corner and he's yelling.
He's in an angry dialogue with somebody who's not there.
(15:27):
He's delusional, he's not in contact with reality. So in
situations like this, you look at the man and you say, Okay,
I guess that man's internal model isn't the same as mine.
He's not experiencing reality the same way I do. So
you might think, okay, I get that about different models
on the inside for psychopaths and for people with schizophrenia,
(15:49):
But otherwise the intuition is that the rest of us
are all about the same. But there are some important
things to note here. First, when it comes to something
like psychopathy or schizophrenia, we tend to think about these
as being categories, like Okay, that person's psychopathic and I'm not,
or that person is schizophrenic and I'm not. But in fact,
(16:10):
all of these things are now appreciated as living on
a spectrum, so you have different degrees along the spectrum.
So take psychopathy, for example. There's a well-used way
to measure this. It's a questionnaire and you score a
certain number of points on this from one to forty,
and in the United States, if you're above a thirty,
(16:31):
then you are labeled a psychopath. You don't want to
be roommates with the guy who scores twenty nine on
this test, even though they're not technically a psychopath, they
share a lot of the fundamental characteristics, so it's a
spectral issue. And by the way, this is the same
with everything in psychiatry. This has been the direction of
(16:52):
the Bible of psychiatry called the Diagnostic and Statistical Manual.
It used to be all about categories, and now everything
is about living on a spectrum, but let's keep drilling down.
When we look at mental illnesses more generally, we find
things that influence people's thoughts or feelings or behaviors, including
not just things like schizophrenia or psychopathy, but you presumably
(17:16):
know people with clinical depression or bipolar disorder, or anxiety disorder,
or obsessive compulsive disorder or borderline or narcissistic or avoidant,
or an eating disorder, or dissociative identity disorder or panic disorder,
or many other things like that. There are at least
one hundred and fifty seven of these in the Diagnostic
(17:38):
and Statistical Manual, So it becomes increasingly difficult to assert
that everybody is exactly like you on the inside. Despite
superficial appearances, people can be very different in terms of
what is happening on the inside. And if you've read
my books or listened to my other podcasts, you know
(17:58):
that when we see people with strokes or brain
injuries to different parts of the brain, it's not hard
to say, oh, I guess their reality is a little
bit different than mine. But it proves harder to think
about this in terms of everybody that you know and love,
because we assume that the people in our lives have
(18:19):
essentially the same thing going on on the inside that
we do: the same opinions, the same way of sense-making,
the same way of gathering meaning, the same political views
on the world. But they don't. Everyone
that you know is having a slightly different reality going
on than you are. We are very different people on
(18:42):
the inside. Now, I want to be clear that I'm
not just saying this as a philosophical claim. We can
increasingly measure so many examples of this, and this is
something I've worked on in my lab for most of
my career. Neuroscience has always cared about the big disorders,
the things that are most obvious and societally costly. But
(19:02):
when we start looking at the more innocuous details, we
uncover these clear and measurable ways that reality can be
different inside different heads. So, for example, imagine that you
and I and a bunch of other people are looking
over Times Square in New York. We're standing on a
corner and enjoying watching the crowd. So you open your
(19:24):
eyes and there's the world and all its blues and
golds and greens. But if you happen to be colorblind,
you're seeing it differently than the person next to you.
Maybe you can't distinguish reds from greens those look exactly
the same to you. Or maybe you have a more
extreme form of color blindness in which there are no
colors at all, just shades of gray. So for you,
(19:44):
and the person next to you, the internal experience can
be quite different, even though you're looking at the same scene.
Speaker 2 (19:50):
And we now know that a small.
Speaker 1 (19:52):
Fraction of the female population has not just three types
of color photoreceptors in their eyes and their retinas, but
four types of color photoreceptors, which means they are seeing
colors that the rest of us can't even imagine. This
is called tetrachromacy, and I'll come back to this in
(20:12):
a future episode, But the point I want to make
now is that we might all be watching the same
corner in Times Square, but having totally different internal experiences.
(20:37):
This is the type of issue that I've been studying
in my lab for many years, so you might have
heard about my episode on synesthesia. In synesthesia, people have
a blending of the senses. About three percent of the
population will, for example, see a letter on a page
and that'll trigger a color experience for them on the inside.
(20:58):
So maybe S is purple for you, and L triggers
a green experience, and so on. It's different for every synesthete,
and there are many forms of synesthesia. You might hear
a sound and that triggers a visual experience, or you
taste something and it puts a feeling on your fingertips. Essentially,
any sense can end up having crosstalk with any other
(21:19):
sense in these different forms of synesthesia. And my colleagues
and I have documented dozens and dozens of forms throughout
the population. Now, synesthesia is not considered a disease or
a disorder. It is simply an alternative reality. The point
is that people can have very different experiences on the inside,
but to a synesthete, their experience is precisely as real
(21:42):
as anything you might experience. So in neuroscience, this is
just one more recent appreciation that from one person to
the next, reality can be a little bit different. And
let me give you one more example that's even newer,
the issue of how we imagine a visual scene inside
our heads. So I'm going to ask you to picture
(22:03):
this: picture a gray and white cat on a picnic
table eating colorful cereal and looking at you suspiciously. So
really picture that in your head now. There were two
researchers who began looking at this question of mental imagery
some years ago, Stephen Kosslyn and Zenon Pylyshyn, and
(22:23):
they ended up disagreeing very strongly. Kosslyn said, when you're
imagining something, you're essentially running your visual cortex to see
it like a movie. It's like vision. And Pylyshyn said,
that's ridiculous. You're not seeing something. It's purely conceptual. It
doesn't involve seeing or vision. And they both did experiments
(22:45):
back and forth, and Pylyshyn said, look, you're insane. It's
not stored like a picture, and Kosslyn said, no, you're insane.
It's not stored like a proposition. You're actually seeing it.
And it was very difficult for a conclusion to be
reached here. Both argued passionately for their side of the argument,
and this went on for twenty years in the literature.
(23:06):
So why couldn't they come to an agreement on this.
The answer is Kosslyn had what we now call hyperphantasia,
which means he has extremely vivid mental imagery. When he
imagines something, it's as vivid as real seeing. Now, this
is the opposite of what Pylyshyn had, which is called
(23:27):
aphantasia, where mental visual imagery is not present. He
doesn't see anything in particular. He just has a concept,
and all of this lives on a spectrum from hyperphantasia
to aphantasia. Everyone in the population is somewhere on
this spectrum.
Speaker 2 (23:45):
So think about this for a moment.
Speaker 1 (23:46):
Picture an ant crawling on a checkered tablecloth towards a
jar of purple jelly. Are you closer to the hyperphantasia
side, where you're seeing it like a movie, or
closer to the aphantasia side, where you can understand
the concept perfectly fine, but you're not seeing anything. So
my lab has studied this in detail, and we use
(24:06):
neuroimaging to figure out what's going on in the brain
along this spectrum.
Speaker 2 (24:11):
But what I want to.
Speaker 1 (24:11):
Focus on today is why there was such a spirited
debate in the literature for two decades before anyone realized
there was a spectrum. It's because both researchers were operating
under the assumption that everyone else experiences visual imagery just
like they do. So when Pylyshyn introspected, he wasn't seeing
(24:34):
a picture, and so he felt clear that other people
don't either. And when Kosslyn introspected, he was seeing a
super clear image and he assumed that's what happens inside
every head.
Speaker 2 (24:45):
And I want to use.
Speaker 1 (24:46):
The debate between them as a more general metaphor that
we all assume that everyone else is experiencing the world
the way we do. My point in talking about color
vision and synesthesia and visual imagery is this, as neuroscience
and psychology move on from studying the really big disorders,
(25:07):
we find increasingly more subtle issues, which cause us to say, wow,
I didn't realize that someone could experience things so differently than
I do. But the fact is that everyone is having
a different internal experience and this led me to search
for a good metaphor. And when I saw the movie
(25:28):
poster for The Martian, I thought, Oh, that's it, because
the poster shows a single person, Matt Damon, walking around
on his own planet. He's the only one there, and
I thought, that's the perfect model. We're each living on
our own planet, we're each having our experiences, and we think, yeah,
(25:49):
this is reality, and it makes sense that everyone has
the same experience that I do. But in fact, just
like in any galaxy, each planet is different. Everyone's got
their own atmosphere, their own landscape, their own experiences. But
we always feel certain that he or she feels exactly
the same way that we do about whatever. Take for example,
(26:13):
what comes to mind when I say the word justice.
What happens inside everyone's head is slightly different. Or fairness:
what comes to your mind might not be the
same thing that comes to someone else's. Or attractiveness or love, home, freedom, success,
(26:36):
these concepts are triggering different neural networks in different people.
They trigger different meaning, which is tied to your whole
personal history and your aspirations. But we assume when we
use words that the other person knows exactly what we mean.
We operate under the assumption that words mean the same
thing to me that they do to you. But in
(26:58):
fact that never happens because we each have different internal lives.
And one of the ways that you can always appreciate
this is just look around you. The next time you're
in the bookstore or the library. There are so many
different sections, and nobody walks in and explores all the
sections equally. Instead, people go in and they go straight
to the section they want, the thing that resonates with
(27:19):
their internal model of the world, mysteries or romances or
westerns or sci fi or whatever. They gravitate to particular
things and not others because of the differences in their brains.
So this is the first important lesson that I want
to establish. Others see the world differently than you do.
But why is this true? Why can't we have one
(27:40):
really smart person who writes a blog post and says, hey,
I think this is the way we should run the country,
and all three hundred and sixty five million people read it
and say, yeah, that's pretty good. Why are we all
so different? This question has been at the heart of
a very long standing debate where people attribute differences to
either genetic factors or environmental factors, in other words, the
(28:04):
nature versus nurture debate. Why do you argue with your
sibling about political issues? Does your sibling have very different
genes than you do, even though you have the same parents.
Does your sibling have different nurture than you even though
you were raised in the same household? So are we
determined by our genetics or our environment? And traditionally there
(28:26):
have been very strong advocates on both sides of this. Well,
both of these have something to say. So let's start
with genetics. Do genetic differences matter? Heck yeah they do.
Although we're all members of this species, Homo sapiens, there
are millions of differences in our genomes from person to person.
For the cognoscenti, these are single nucleotide polymorphisms or substitution variants,
(28:50):
or copy number variants and so on. And your genetics
matter for who you are. Take, just as an example
from my book Incognito: I compiled statistics showing that if
you are a carrier of certain genes, your probability of
committing violent crime goes up eight hundred and eighty two percent.
(29:11):
So I took statistics from the US Department of Justice,
and I broke these down into two groups, those who
carry the genes and those who do not. And it's
a massive difference. Can you guess what this collection of
genes is. We summarize it as the Y chromosome. If
you have these genes, we call you a male. So
(29:31):
genes matter, but it's not all genes. It's also your
experiences in the world. We drop into the world with
half baked brains and we absorb everything around us, everything
that you know that you believe, your language, your culture,
your memories. It's all stored in this giant neural network.
(29:52):
And how does it get stored? By reconfiguration of the network.
This is known as brain plasticity. Brains absorb the world
around them, and that's how the world shapes you. We're
influenced by our culture, our friends, our neighbors, our generation,
and so on. So you're shaped by both your genes
(30:13):
and your environment, and these are intertwined in very complex ways,
such that it's really rare that we can point to
one or the other and conclude that it's responsible for
something that we see. It's all about what we nowadays
call gene environment interactions. So you've got the genes that
you drop into the world with, and then you've got
all these experiences, and these intertwined in this complex way.
(30:36):
Your experiences actually shape your nervous system and can feed
all the way down to the level of which genes
are getting expressed and which are getting suppressed. And by
the way, you don't choose your genes, and you don't
choose your childhood experiences. None of that is about choice,
but it makes you who you are now. These differences
can be quite subtle. You can disagree with your sibling
(30:58):
politically even though you're genetically similar and were raised in
a similar environment, but small differences can take you off
in very different directions. So you're shaped by both your
genes and your environment, and hence the nature versus nurture
question is dead. And brains end up as different from
one another as faces. Just walk along the street and
(31:22):
look at how different people's faces are. The variety of
faces that you see around you, Well, there's that much
variety in brains too. And just as a side note,
I can recognize all my students just from their brain scans,
because brains actually physically come out looking different from one another. Okay,
so we've established that people are very different on the
(31:42):
inside and given a sense of how that comes about.
But you may think, okay, other people are different from me.
But what is clear is that I see the truth.
And if I could just sit down with them and
have them listen to me, or I could just shout
in all caps on Twitter, everyone would see the correctness
(32:03):
of my position.
Speaker 2 (32:04):
Right.
Speaker 1 (32:05):
So the main question that comes up in our lives politically,
whether on social media or in dinner conversations, is why
can't everyone see the truth? So now we'll turn to
Act three, where we'll ask how do you land on
your opinions, your notion of the truth, and how accurate
(32:25):
and complete is it really? So the critical concept I
want to tell you about here is this: your brain
is locked in silence and darkness, and your brain's job
is to build an internal model of the outside world
so that it understands what is happening out there. So
everything in your life, everything about the way the world works,
(32:48):
is represented in your brain, usually unconsciously. How you deal
with people, where your house is, how to operate the
appliances in your kitchen, what language you speak and your
spouse speaks, how to drive your car. Everything in your
life is represented in this internal model. And I'm not
going to talk too much about the internal model today
(33:10):
except to say that one of the fascinating things is
that usually it's totally unconscious. And I gave an example
of this in a recent episode about putting your hands
up on an imaginary steering wheel in front of you
and pretending that you're driving thirty miles an hour down
the road, and I asked you to make a lane
change from the center lane into the right lane. And
(33:32):
what essentially everyone does with their hands is they turn
the steering wheel to the right and back to center.
But that would actually steer you off the road and
you would crash. When you actually get in the car
and watch what your hands are doing, you'll see that
the way you make a lane change is by turning
to the right, back to center, just as far to
the left, and then back to center again. That's how
(33:53):
you make a lane change. Your brain has made a
model of the physics of cars and steering wheels and
roads and so on, but you don't even know how
you do this, and you didn't even know that your
brain had that model. This is the gap between what
your brain knows under the hood and what your conscious
mind has access to. And the point I want to
(34:13):
make is that you have this same sort of model
about the whole world around you and its political truths. Now,
the details of your trajectory in the world up to
this moment convince you that you know the truths, even
though someone else's internal model might tell them that they
know the truth and it might be different. Now, the
really important point here, the thing that no one talks about,
(34:36):
is that we don't usually take into account how poor
our internal models are. Here's just an example: starting in
early twenty twenty, when the pandemic hit, why did all
the bank lobbies close? After all, there were lots of
other shops that were open. The florist was open, the
hair salons were open, and all of these were much
(34:57):
smaller spaces than the spacious lobbies of the banks. And
it's not that the banks couldn't put up plexiglass in
front of the teller's windows. In fact, that was usually
already in place. And it's not that the banks wanted
to be closed, because they still staffed the drive up
windows throughout the day.
Speaker 2 (35:15):
So what was going on.
Speaker 1 (35:17):
Most people didn't know why, And this is an example
of internal models. If something is not in your model,
then it just doesn't strike you. The answer was that
in the spring of twenty twenty, most of the population
began wearing masks, and the bankers didn't want masked customers
coming through all day. It's a perfect costume for a robbery,
(35:37):
so they closed their lobbies. The point is simply that
we always think our understanding of the world is complete,
but we're always facing situations where we say, oh, I
guess I didn't know that; now it's complete. By the way,
the limits of your internal model, this is the engine
of comedy. So a comedian will say something like this:
I went to the doctor the other day and the
(35:57):
doctor said, put your clothes there in the corner next
to mine. It's funny and surprising only because your brain
makes a full world of assumptions about the doctor, and
then you realize that your model didn't have all the information.
And there are so many examples of the limitations of
your model. Imagine I were to draw a handful of
(36:18):
diagonal lines on a page and ask you what you see.
You'd say diagonal lines. But if I say to you
how many there are, you'd realize that you don't
know the answer, and you have to deploy your attention
to seek the answer. So this happens to us all
the time, where we think we have a complete understanding
of what's in front of us, but a little bit
(36:38):
of questioning unmasks that we don't actually have all the
details of the picture. And I'm using this all as
a metaphor to emphasize the importance of genuine dialogue with
other people, because sometimes you can't see what questions you
haven't asked, or the things that you're not even aware
you're not aware of. And by genuine dialogue, I
(37:00):
don't mean how do I convince the other person that
I'm right and they're wrong. I mean listening and considering
and questioning, and trying where appropriate to change your own
mind or to stand in a slightly different viewpoint than
you were before. Now, I want to come back to
this issue that it's so easy to poke holes in
(37:21):
our internal models. And the question I've always wondered is
why do we think our models are complete when they
lack so many answers? So consider this. Do you know
what a bicycle looks like? Of course you do. Now,
if you're in a place where you can get out
a piece of paper, I'd like you to get it
out and sketch a bicycle. Go ahead, And if you
(37:44):
can't get the paper, then go ahead and sketch it
in the air with your finger. Just the basic outlines
of the frame, the wheels, the seat, the handlebars.
Speaker 2 (37:53):
That's all. Okay.
Speaker 1 (37:54):
Now, I hope you'll actually do this, because assuming you do,
you will find yourself shocked about how poorly you are
able to actually reproduce the bike on paper. You think
you have your bicycle pictured perfectly in your mind, but
your model, your internal model, is actually quite lousy. For example,
(38:14):
does the chain connect to both the front and the
back wheel? And what shape is the frame exactly? And
how do the handlebars plug into the front wheel? It's
shocking because you know what a bicycle looks like, or
at least you think you do. But it's actually a
big challenge when you're really pushed on your understanding. And
this is known as the illusion of explanatory depth. Just
(38:39):
because you're convinced that your model has the full picture,
it doesn't mean that it actually does. So here's another example.
Imagine that I ask you if you know how the
electoral college works in this country, and you might say, yeah,
I know how that works. But now I say, great,
please explain it to me, and you might find that
you get stuck. You think you know, but as soon
(39:02):
as I scratch the surface of something, you find that
it's not quite as clear as you suspected. And there
are a million examples of this sort of thing, And
if you start paying attention, you'll see more and more
of the limitations of your knowledge. So it's useful to
question ourselves, with all of our political opinions, about where
our knowledge is lacking, because not knowing information doesn't mean
(39:27):
that you don't have a high sense of certainty about it,
especially if you really don't know much about a topic
at all. My graduate advisor once told me to go
to the library to learn about lattice gases, and I
had never heard of that at all. So I walked
over to the library and I discovered there was half
a shelf with books all about lattice gases. And I
(39:49):
was shocked, because how could smart people have devoted their
scientific careers to writing about something that I had never
even heard of just twenty minutes before. And this is
the kind of lesson that emerges as you mature in
the world. But interestingly, it takes a great deal of
work to get there. If you've studied less about a subject,
(40:12):
you tend to overinflate your knowledge. This is what's known
as the Dunning-Kruger effect. These are a couple
of psychologists who ran studies where they found that
if you ask people a bunch of questions about humor
or grammar or logic, and then take the people who
score in the bottom quartile, they grossly overestimate their
(40:35):
performance on the test. Although their test scores put them,
let's say, in the twelfth percentile, they estimate themselves to
be in the sixty second percentile when you ask them,
how much do you know about this compared to other people?
In other words, the less that you know about a topic,
the more confidence you have in your own abilities. What
(40:56):
happens is that as you learn more about a topic,
your confidence goes down, and it's only much later, when
you become an expert, that it starts to go back up. So, now,
given everything I've told you so far, I want to
return to the cartoon that I described at the beginning
of this podcast, about the fork in the road where
one sign points to complex but right and the other
(41:16):
points to simple but wrong, and almost everybody is going
down the simple but wrong path, except for the few
people winding their way up the steep, complex but right path.
Now the cartoon struck me as funny, but perhaps not
for the obvious reasons, because when I saw this cartoon
on Twitter, I noticed that it had racked up many
thousands of likes. So I started to research who exactly
(41:39):
were the enthusiasts, and I had a suspicion that I
knew the answer, and that turned out to be correct.
Each person, of whatever political persuasion, sees himself among the
complex but correct thinkers winding up the steep path. Whether you are
on the right or the left, whether with the Independents
(42:00):
or Green Party or libertarians, whether you're a fan of
Antifa or QAnon, whether you're a denizen of Wokistan or Magastan,
you fundamentally know that you are a person who engages
in refined and proper thinking. You appreciate data that is
(42:21):
intricate and meaningful, while people on the other side, for
reasons that you can only guess, they believe incorrect things.
What I want to emphasize is that each person who
clicked the thumbs up button knew that he was not
like the sheep on the opposite side of the fork. Instead,
he possessed an intricate and accurate view of the world. Okay,
(42:44):
So which side of the political debate did the cartoonist support? Well,
who knows, who cares. It was presumably one of his
most successful cartoons because it was the skeleton key that
fit the lock of each reader's internal model. So the
(43:15):
first step to rising above our internal models is to
start watching for these bug traps that lure us in.
I just saw an example yesterday, a bumper sticker that
read make America America Again. And everyone on the road
who saw this bumper sticker presumably thought that sounded great.
(43:36):
Why because both sides of the political spectrum are equally
happy to engage in retrospective romanticization. Liberals and conservatives can
equally well reach back in memory to a time that
seemed less complicated, a totally illusory era where the nation
agreed with the logic of your political viewpoint, before the
(44:00):
arrival of the crazies with whom you now have to deal.
So the only thing that the bumper sticker really points
back to is the impoverishment of our memories. The cartoon
and the bumper sticker. These work because they can mean
anything to anyone, and what they reveal is the degree
to which we live inside our internal models. And we
(44:22):
assume everyone shares the same model we do, and anyone
who has a different internal model we tend to demonize.
And that's because we believe our models so strongly, because
that's all we have, whether that's our religion or our
political side of the spectrum, whether we're the Communists or
the Nazis in nineteen thirty four, it makes us angry
(44:44):
that other people can't see the truth as clearly as
we can, and we are suspicious of them. And this
leads me to the final chapter of today's episode, which
is the notion of empathy. And there's an important aspect
of this that's typically overlooked. So I'll illustrate this
with a historical example. In the late seventeen hundreds, there
(45:05):
was a British nobleman named Lord Gordon, who was born
into privilege, but he found himself caring deeply about the
welfare of the sailors. He was an officer, but he
campaigned energetically to improve their conditions, and his empathy was
broader than that. When they sailed to Jamaica, he was
(45:26):
disgusted by the slavery there and he berated the British
governor about it. So here's an example of a guy
where everywhere he went he sought to improve the well
being of.
Speaker 2 (45:36):
Those less fortunate.
Speaker 1 (45:38):
So the question, from a neuroscience point of view is
why did Lord Gordon care so much for others? And
why do any of us help strangers after all, the
driving force of evolution is survival of the fittest, not
of the friendliest. Well, fortunately there's another force at work.
Speaker 2 (45:58):
Our brains are.
Speaker 1 (45:58):
Constantly in the business of simulating the experiences of other people,
and under the right circumstances, this leads to empathy, the
experiencing of another's emotions. Empathy is what counterbalances our appetite
for power and tribalism and violence. Empathy is the glue
(46:20):
that binds society together. Our species dominance is due in
part to our empathy, which helps us to cooperate flexibly
in large groups. Now, how can we study empathy neuroscientifically?
So in my lab we performed a brain imaging study
in which you are in the scanner and you see
(46:41):
six hands on the screen in front of you, and
the computer goes around and randomly picks one of the
So one of two things happens. Either the hand gets
touched with a Q tip or it gets stabbed with
a syringe needle.
Speaker 2 (46:57):
And when you.
Speaker 1 (46:58):
See it get stabbed, it's very cringe worthy. And so
what we're doing is we're looking in the brain images
to understand what is the difference between these two cases
that are visually quite similar, but in one of the
cases you have this very visceral response. And what we
find is that when the hand gets stabbed with the
(47:19):
syringe needle, this network of areas in your brain that
we summarize as the pain matrix, this comes online. And
these areas in your brain are what would come online
if your own hand got stabbed. So when you see
someone else's hand get stabbed, that activates this same pain matrix.
Speaker 2 (47:39):
You are not getting hurt, but you.
Speaker 1 (47:41):
Are simulating what it would be like to be that
person and have your hand get stabbed. This is the
neural basis of empathy. Now, if the story ended there,
all of us humans would operate like a big, cooperative
ant colony. But the reality is more complex. So let's
(48:04):
return to Lord Gordon. He empathized with sailors and slaves,
but he had nothing but hatred for his Catholic neighbors.
He worked tirelessly to repeal the civil rights of Catholics,
and in seventeen eighty Lord Gordon marched a crowd of
fifty thousand people to the Houses of Parliament in London
(48:26):
and for a week, the mob destroyed Catholic churches and
Catholic homes in what came to be known as the
Gordon Riots, which was the most destructive domestic upheaval in
the history of London. So why did Lord Gordon, a
person so capable of empathy, have such antipathy for Catholics.
(48:48):
The answer paints a fundamental fact about human nature, which
is our tendency to form in groups and outgroups, groups
that we feel attached to and those that we feel
opposed to. Our empathy is selective. So, especially after the
Second World War, psychologists began to study this issue of
(49:09):
in groups and out groups and how this can so
easily lead to violence. And my lab and a lot
of others have done a lot of research into this
issue of how easily we form in groups and out groups.
And I'll give you just one example of this. So
come back to this experiment with the people in the
brain scanner watching the hands get stabbed. Now we take
(49:32):
these six hands on the screen and we just add
one-word labels to each hand: Christian, Jewish, Muslim, Hindu, Scientologist, atheist.
And now the computer goes around boo boo, boo boop,
and it randomly picks a hand, and you see that
hand get touched with a Q tip or stabbed with
a syringe needle. And the question is does your brain
(49:54):
care as much if it's an out group versus your
in group. We tested people of all different faiths and
atheists also, and the result is that you care more
about your in group and you care less about outgroups.
When you see the hand get stabbed that is labeled
with your in group, we can measure a very big
(50:16):
response in the pain matrix. And when you see a
hand get stabbed that has one of the other labels
on it, we see a very small response in the
pain matrix. Your brain just doesn't care as much. This
is a large effect and it's depressing that it's true,
but it's just the way humans are. We care a
lot about our in groups. And these are just single
(50:39):
word labels. I mean, all the hands look the same
and they just have different colored wristbands on them so
you can distinguish them. But it turns out we are
really really sensitive to these labels. So the issue of
empathy is subtle and complex. With just a single word label,
your brain can feel more or less empathy for someone.
(51:00):
One can run the imagery about them and their pain
more or less vividly. Now, what's fascinating is how rapidly
our levels of empathy can change. So we next took
the exact same subject and we presented them with a
single sentence. The year is twenty twenty five, and these
three groups have teamed up against these three groups. And
(51:22):
so now you find your in group teamed up on
one side or the other. The computer has picked these
sides randomly, and so you've got this team and the
other team. So what do you think happens? You care
now about your allies, the two groups that have randomly gotten
lumped in there with your in group. So suddenly when
you see their hand get stabbed, you have a larger
(51:44):
empathy response than you did just a moment ago.
Speaker 2 (51:46):
And you didn't care about them.
Speaker 1 (51:48):
You still don't care about the out groups on the
other side, but you care about these allies now more,
which is not surprising. Like, for example, when the Soviets
fought side by side with the Americans in World War Two:
they had been bitter enemies before, then World War Two happened,
and suddenly they're allies, they're fighting together. They're clapping each
(52:08):
other on the back and sharing cigarettes and so on.
And then the war ends and now they're enemies again.
Now take a moment to think about your own level
of empathy towards others. Imagine that you see a seventy
five year old man get hit in the face and
his nose gets cut and he's bleeding. Do you feel
(52:30):
an empathic sting with that? Okay, Well, now imagine that
he's at a rally for Joe Biden or for Donald Trump,
or just anyone you agree or disagree with. Is your
empathy any different? And if so, does that challenge your
view of yourself as an empathic person. If you felt
(52:51):
unequal responses in those two situations, a Biden rally or
a Trump rally, you're not alone. People generally assess their
own empathy by thinking about those in their in group.
I've always been struck by this in action adventure movies.
When we see a person get hurt, if it's the protagonist,
we really WinCE. But if it's the antagonist and he's
(53:13):
falling off one hundred foot cliff to his death, we
feel just fine about that, possibly happy about that. So
what this means is that we have the capacity to
feel someone else's pain in different ways, depending on whether
they're a member of our tribe or not. And the
tribal tendencies of humans, this can incite murder and torture,
(53:35):
from the Spanish Inquisition to the Rwandan genocide. This can
buoy the appeal of nationalist visions, from Hitler's Final Solution
to Mao's Cultural Revolution. So, given how deeply our biases
are ingrained, the question is, are we doomed to repeat
(53:55):
these kinds of atrocities forever? So I'm going to suggest
perhaps not. I'm going to give five strategies here
to narrow the empathic divide between people. The first thing
has to do with just understanding our own biases. We
can increase our awareness of our own internal thought patterns
(54:17):
so that we recognize our partisanship as we experience it.
For example, in our social echo chambers, we tend to
accept the logic of our in group and we reject
the logic of outgroups. And we're also predisposed to help
those in our in groups rather than those a little
farther away who might need help.
Speaker 2 (54:38):
More.
Speaker 1 (54:39):
Understanding the biases behind our actions in this way can
help lead us to more altruistic behavior. The second strategy
for narrowing the empathic divide has to do with building
a better model of other people. So instead of concluding
that your brother or a coworker is an idiot,
(55:01):
just try taking a crack at understanding his point of view.
It's not the same as agreeing with his point of view,
but it's trying to step into that person's world to
avoid the oversimplifications that we typically accept. And by the way,
this is often accomplished through art and literature, which has
for a long time waged a behind-the-scenes battle
(55:23):
against dehumanization. Theater and books and movies. This lets people
step into the shoes of other people, and in the
fourteen forties, when the printing press was invented, this allowed
stories to spread widely. So, for example, when Harriet Beecher
Stowe published the anti-slavery novel Uncle Tom's Cabin in
(55:43):
eighteen fifty two, readers stepped inside a shack that they
otherwise wouldn't have ever entered, and once inside, it was
no longer so easy to relegate the characters to an
out group. The third strategy is to learn and resist
the tactics of dehumanization. There are a lot of tricks
(56:04):
that governments and propagandists employ and I'm going to do
a different episode on that, but I'll mention here that
one common ploy is what's called moral pollution, in which
a group is socially smeared by association with something repulsive
like vermin or insects, or anything that envelops them in
(56:24):
a negative emotional cloud. Once you have a negative emotional
reaction to a group, it becomes harder to hear their perspectives.
So when you can recognize that a person is being
attacked for his identity rather than his arguments, you can
better defend yourself against this trick. The fourth strategy has
to do with blinding your biases, so design processes and
(56:49):
organizations that remove the chance that prejudices interfere with your judgment.
For example, a lot of software companies here in Silicon Valley,
they'll ask job candidates to submit code rather than to
show up in person. And many orchestras have blind auditions,
which means they audition people behind a curtain, so you
(57:09):
can't see the gender or the race of the person
who's looking for the job. You're just listening to the music.
And in the same way, many universities have a need
blind application process so they can separate intelligence from financial considerations.
So the idea is, wherever biases can be subconsciously triggered,
(57:30):
it's best if you just remove the opportunity. And the
fifth strategy I think is the least intuitive, and that
is to entangle group memberships. So what I mean is
work to ensure that communities are intertwined. So to see
how this would work in practice, consider the five tribes
(57:53):
of the Iroquois Native Americans, who fought intensely with each
other in the fifteenth century. So they had a new
leader come in named Deganawida, who came to be known
as the Great Peacemaker. And what he did is he
assigned each tribe member to one of nine different clans,
the Wolf clan or the Bear clan, or the Turtle
(58:14):
clan or the Sandpiper clan,
Speaker 2 (58:15):
the Deer clan, and so on.
Speaker 1 (58:17):
So members of each clan had representation from all the
different tribes, and these relationships were cross cutting. So I
say to you, hey, tribe member, let's go attack that
other tribe over there. And you say, oh, you know,
I would. But I'm a member of the Eagle clan
and so is he, and so I'm not really that
interested in attacking him anymore. So by emphasizing the overlapping
(58:42):
dual allegiances to tribe and to clan, Deganawida complicated
the notions of us and them, and in this way
he was able to defang the intertribal warfare. So what
we've seen in today's episode is how different we are
on the inside, and yet how strongly we believe our
(59:02):
own truths, even though our knowledge of everything is so impoverished.
And yet we all walk around with the impression that
if we could just sit down with another person on
the other side, we could show them the truth. So
if you have the same opinions as everyone else in
your life, great, But I hope you don't. I hope
(59:22):
you can take the opportunity to dig deep and find
out how the other folks in your life see the
world and listen to them. It's not the same as
agreeing with them or giving in to them, but it
is acknowledging that your point of view doesn't have a
lock on the absolute truth, and it's allowing that the
most important thing you can learn is an ability to
(59:45):
dialogue in conditions of disagreements and discomfort. It's the most
important thing that you can do for each other and
for your own brain. As Voltaire said, uncertainty is an
uncomfortable position, but certainty is an absurd position. So I've
given five directions for helping us to learn how to
(01:00:07):
bridge that gap, not by assuming we're right, but by
having the intellectual humility to realize that it's a big,
pluralistic world out there, and that everyone, including you, has
a model of the truth, and that only by adopting
the stance of genuine dialogue and understanding your own biases
(01:00:27):
and the possibility that we might be wrong, can we
hope to move things forward. If you're interested in learning more,
find further readings on these topics at Eagleman dot com
slash podcast, and if you have questions, comments, or thoughts, email
(01:00:48):
us at podcast at eagleman dot com, where we've been
getting great responses. Watch full video episodes and leave comments
on YouTube at Inner cosmospod. Until then, this is David
Eagleman, signing off from the Inner Cosmos.