
August 14, 2023 38 mins

Would you torture someone if you were commanded by an authority figure? To what degree are your decisions contextual, and what does this have to do with matching the length of a line, the Iroquois Native Americans, the banality of evil, soldiers posing with dead bodies of their enemies, propaganda, giving shocks to a stranger, or how we should educate our children? Join Eagleman for part 2 of the exploration into brains, dehumanization, and what we can do to improve our possible future.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Would you electrocute someone if you were told to by
an authority figure? Would you torture a prisoner just because
you were put in the uniform of a prison guard?
How much of your behavior is a function of your situation?
And what does any of this have to do with
matching the length of a line, or soldiers posing with

(00:28):
dead bodies of their enemies, or propaganda or dehumanization or
how we should educate our children. Welcome to Inner Cosmos
with me David Eagleman. I'm a neuroscientist and an author
at Stanford and in these episodes, I examine the intersection
between our brains and our lives, and we sail deeply

(00:52):
into our three pound universe to understand why and how
our lives look the way they do. In last week's episode,
we talked about dehumanization and how your brain can dial
up and down the degree to which you view another

(01:15):
person as human. And I gave you a particular example
of empathy, which is where you're simulating what it is
like to be someone else, and we saw how empathy
can be modulated based on whether those people are in
your in group or your outgroup. For today, I want
to drill down a little bit deeper into the heart

(01:35):
of a related issue, which is that when you look
at people who are behaving in these very violent acts
throughout history, the assumption has been for a long time
that it's something about the disposition of those people. In
other words, there's something really wrong with those people. But
this started coming into question some years ago because there

(01:58):
were so many hundreds of thousands, sometimes millions of people
participating in these violent acts, and it's a strange theory
to say that all of them had something wrong with
their brains. Instead, what researchers started thinking about is maybe
there are situational forces that make people behave in these

(02:20):
incredibly awful ways. So there's something to be understood here
about the social context that people find themselves in that
cause them to behave a certain way. And of course
this leads to the question of could you behave this
way if you found yourself in that situation? And this
makes us all very uncomfortable to even think about it,

(02:42):
because we know we're good people and we're not going
to behave in these terrible ways. But the reason it's
important to ask these questions is because social psychologists got
really interested in what was happening with what came to
be known as the banality of evil. So after World
War two, for example, Adolf Eichmann was on trial. He

(03:05):
was one of the main coordinators of the final solution
for the Jewish population. He personally had the blood of
hundreds of thousands, or maybe millions of people on his hands.
And the thing is, as the journalist Hannah Arendt put
it as she covered his trial, she called this the
banality of evil, because there was nothing particularly special about

(03:30):
Adolf Eichmann. He was on the stand and he said,
I was just doing my job. He was part of
this machinery. He had this opportunity to impress his wife
and the people around him. There were all kinds of
situational forces at play here. Now this is no defense
of his behavior, but what it does encourage us to

(03:50):
do is try to understand what are these situational forces
that steer whole populations of people to do incredibly awful
things that in other situations they wouldn't even consider. And this
is why the whole research question about situational forces came

(04:10):
to the forefront. So right after World War two there
was a research psychologist named Solomon Asch, and he decided,
I want to understand how it is that social forces
can change people's decision making. So he did a very
simple experiment. You come in to participate in the lab,

(04:31):
and you see there are seven other people that are
there to participate as well, just like you, And you're
all shown a line on the screen of a certain length,
and then you're shown three more lines marked A, B, and C,
and you're asked which one matches the original line in
terms of length. But it just so happens that you're

(04:52):
sitting in the eighth chair, and so the first person
registers his answer, and the next person calls out her answer,
and so on, and it goes down the line,
and it turns out that all these people are shills.
That means they are plants from the experimenter. They're not
just like you, even though they appear to be just
like you. And sometimes what they'll do is they'll all

(05:14):
say the wrong answer, but they'll say it confidently. They'll
maybe pick the shortest line over there, and they'll say, oh, yeah,
it's definitely line C, and you stare at it and
you think it's line B. But person number one, person
number two, number three, they all are picking line C
and so what do you do when it comes time

(05:36):
for your turn? Are you going to say, you know what,
you guys are all wrong, it's line B. Or do
you think, gosh, maybe there's something wrong with me in
the way that I'm seeing this? Now, Solomon Asch didn't
think this would work. He figured people would go ahead
and stick with what they thought was the right answer.
But the results were really surprising because what happened is

(05:59):
that almost everybody conformed to the group, whatever the
group was saying. The experimental subject was reluctant to say otherwise,
even though this was a clear, easy perceptual task with
a right and wrong answer. So this was a surprising result. Now,
as it turns out, Asch had a student in his lab,

(06:20):
a young man named Stanley Milgram, who watched this, and Milgram,
being Jewish just like Asch, and having just seen what
happened in World War two, was very interested in these issues.
And Milgram noticed something, which is there was no social
consequence to this line experiment. It wasn't a moral decision

(06:41):
of any sort. It was just a very simple perceptual decision.
So Milgram decided to launch an experiment that drilled a
lot deeper and it's one of the most famous experiments
in psychology, but not everyone knows the details. So let's
just walk through this. First, you see an ad that's
advertising for people to participate in a study about memory.

(07:02):
It says, we will pay five hundred men to help
us complete a scientific study of memory and learning. The
flyer says that you will get paid for one hour
of your time and there's no further obligation. So you
show up. It's this laboratory at Yale University, and you're
told that in this experiment you're going to play the
role of the teacher. And there's another volunteer over on

(07:24):
the other side of the wall, and you're told this
is a study on the effect of punishment on how
well people can learn. And so it's explained to you
that the learner will be asked to memorize arbitrary pairs
of words, and he's going to be continuously quizzed on it,
and he's just been strapped to a chair that can

(07:45):
give him a small electrical shock. So if he gets
an answer right, the experimenter moves on to the next problem.
But if the guy gets the answer wrong, your job
is to deliver a small electrical shock. Now you're sitting
in front of this device which has thirty switches
in a row on it, and these switches are marked
at the left side as slight shock, all the way

(08:07):
up to the right side where it says danger extreme shock.
And so you hear the experiment begin. The job of
the learner is to learn these associations between the words,
and he gets the first answer right, and then he
gets the second answer right and so on. But finally
he can't remember some pairing and he gets the answer wrong.

(08:28):
So now the experimenter and the white coat says to you,
I want you to press the button for the lowest
level of shock, level one. So that's all you need
to do. You press the button that reads slight shock.
And even though it seems like the kind of thing
you might not normally do, you're being instructed by a
professional researcher, and so you hit the button. You can't
see the learner and you don't hear anything, so the

(08:50):
whole thing is pretty straightforward. Okay, So he's going along
and he gets another wrong answer, and you're told to
give him a tiny shock again, and this goes on,
but the guy isn't performing that well at memorizing the
pairs of words, and so the experimenter tells you at
some point, okay, each time he gets another wrong answer,

(09:10):
I want you to increase the level of the shock.
So the learner gets it wrong and you're told to
hit the second button, and this goes on, and every
time he gets the wrong answer, you have to move
up in the level of shock. So you work your
way up along the buttons, and it changes from mild
shock up to extreme intensity shock and over on the

(09:33):
right side the buttons read danger severe shock, and in
fact the last few buttons are past that danger sign
and they're unlabeled. So you're getting a little worried about this,
And as the learner is going along, you're hoping that
he'll get the right answers so you won't have to
keep going up very high. But the guy gets a
wrong answer, and then another, and eventually another, and the

(09:56):
experimenter very calmly tells you to give him these higher
and higher shocks. And what's happening by midway up this
scale is that when you give a shock, you hear
the learner in the other room say ow. And as
you're going up higher than that, he says, ow, that
really hurt. And as you go up higher, he says,
let me out of here. I don't want to be

(10:17):
part of this experiment. So you look pleadingly to the experimenter,
and the experimenter in his white coat, says, keep going.
So maybe you keep going and the guy says, OW,
I want out of here, let me out, And the
experimenter says to you, keep going, and you say I
don't want to keep going. He's obviously in pain, and

(10:39):
the experimenter says, don't worry. I take full responsibility. You
are not responsible for any of this. You are just
participating in an experiment. So the question is how high
do you keep going? Because at some point, when you
reach a pretty high level, the learner stops responding. You

(11:03):
press the shock button, but you don't hear any cries anymore.
Is he unconscious? Is he dead? There's nothing but silence now,
And the question is will you keep going even beyond
that level? Will anybody go all the way to the top? Now,
As it turns out, the learner was a shill. He

(11:25):
was not a real volunteer. He was working with the experimenter.
You were the experimental subject. So Stanley Milgram talked to

(11:52):
a group of psychiatrists, and he said, what's your prediction.
How high will people go? How many people do you
think will go all the way to the top and
will give somebody the strongest electrical shock, even though the
learner seems to be unconscious at this point or maybe dead.
So the psychiatrists concluded that their prediction was one percent

(12:14):
of people will do this because psychiatrists are experts in
human nature, and they're thinking about this in a dispositional way,
in other words, who has that disposition? Who's the kind
of person that would allow them to do that? And
they figured, well, the only person who would do that
is somebody who is psychopathic, and psychopaths make up about

(12:36):
one percent of the population. But as it turned out,
sixty five percent of Milgram's volunteers went all the way
to the top. They delivered four hundred and fifty volt
shocks to the learner. Why? It's simply because they were
asked to. So Milgram wrote a famous book after this,

(12:59):
called Obedience to Authority, because he couldn't believe what he
had just found. He couldn't believe that people would listen
to him all the way up to the very top
where they were perhaps killing somebody, and somebody who was
a total stranger to them, who had done nothing to them.
And in his book he describes nineteen different versions of

(13:21):
the experiment that he did. He tweaked every possible parameter.
He figured out the details of our behavior, like if you,
the teacher, actually have to be close to the learner
where you can see him, then compliance goes down. Or similarly,
if the experimenter, the guy in the white coat, is
farther away from you, then compliance also goes down. And

(13:43):
in the extreme case, when the experimenter is just talking
to you on a telephone, compliance drops, but only to twenty percent. Now,
twenty percent is still really awfully high for delivering these
strong electrical shocks. And by the way, he ran the
experiment also with female participants instead of male and even

(14:04):
though the women expressed more stress about it, they still
did exactly as much shocking: sixty five percent of them
went up to the topmost switch. Now, I want
to make a quick note here about the thirty five
percent those people who did not go all the way
up to four hundred and fifty volts. Thank goodness, there's

(14:24):
that thirty five percent. Those are the people that we
need to socially model. Those are the people who were
perhaps raised in households where they were always asked to
question why they were doing something. And even though we
talked in the last episode about all these bloody events
in the past century, there were always people who hid

(14:45):
Jewish families in the Holocaust, or protected Chinese women during
the invasion of Nanking, or people who sheltered Tutsis
in Rwanda. So thank goodness those people exist. And our
job really is to take that thin radio signal to
the world and amplify it, to work every day to
cultivate that kind of bravery in ourselves and our children.

(15:09):
And I'm not talking about acting like we know what's
right and wrong and going and hurting people. I'm talking
about being the kind of people who don't allow that
to happen, no matter who is doing the hurting. Okay,
So back to Milgram's experiments. He wanted to make sure
that this had nothing to do, in particular with the
academic prestige of Yale University, where he was from. So

(15:32):
he rented some random office space in downtown New Haven,
Connecticut and just said he was a random researcher and
people still complied just as much. So Milgram's experiments were
a shocking illustration of how easy it is to get
people to listen to authority. Now, it turns out there

(15:52):
is another kind of social influence besides authority, and
that is the influence of your peers. It happens that Milgram
had a high school friend named Philip Zimbardo. Milgram ended
up at Yale and Zimbardo ended up working at Stanford.
Zimbardo was interested in how prison systems run and why
people behave the way they do in prisons, so he

(16:20):
recruited people for a two week study, and he made
sure to do full psychological tests on them to ensure
they were all in a normal range essentially random normal
research participants. Then he assigned them to the role of
a prisoner or the role of a guard. For the guards,
he gave them things like sunglasses to cover their eyes

(16:41):
and billy clubs to carry, and for the prisoners, he
stripped them of all their clothing except for a simple gown,
and he made things as realistic as possible, so he
actually picked up the prisoners in police cars and brought
them in and had them handcuffed and checked in and

(17:02):
he had three different shifts of guards who would switch
off every eight hours, whereas the prisoners actually lived there
in this basement, which was set up to be just
like a jail cell. And you may have heard about
the outcome of this experiment. The guards started acting like
bullies to the prisoners, making them do arbitrary things just

(17:26):
for the sake of punishment. So, for example, they had
to line up to count off in the morning, and that
was a perfect time for the guards to make up
arbitrary rules like okay, now you have to do it backwards.
Now you have to sing. Now you're not singing sweetly enough,
you have to do it again. And they would do
this kind of thing for hours just to persecute these guys.

(17:46):
But it just kept magnifying. The guards started becoming so
creatively evil in coming up with punishments and rules, and
this all happened quite quickly. The guards started taking away
food from them, taking their beds away, locking them in
solitary confinement, and everybody involved became psychologically a bit traumatized,

(18:08):
and the experiment had to shut down early because essentially,
right away the prisoners and guards fell into these roles
and what happened was the community was shocked by what
had just transpired in this experiment, because these were just
normal young men who just by dint of being put
in these roles, ended up behaving so differently. And Zimbardo

(18:34):
wrote a great book on this called The Lucifer Effect,
about how people can turn into such bad actors depending
on the situation in which they find themselves. And what
Zimbardo emphasized is the situational nature of our behavior. In
other words, it matters what role you're playing, and beyond
a particular situation, he said, you have to understand whole

(18:56):
systems to understand how humans behave. You have to understand
more than their disposition, in other words, the kind of
person that individual is, and you have to understand more
than the situation they're in right now. You have to
understand the whole system they're in. Because what happens in prisons,
for example, is not unique to the Stanford prison experiment.

(19:18):
It's typical of what happens in prisons where the system
is set up with guards and prisoners and they've each
got their roles, and the guards want total compliance and
the prisoners want to resist that, so the guards will
keep upping the arms race until they're certain that they
can get compliance from the prisoners. And in this light,
it's no real surprise what happened, for example, in Abu

(19:41):
Ghraib, where photographs emerged of torture and humiliation of the
insurgents that were being held there by the US forces.
It's exactly the character of interaction that happened in the
Stanford prison experiment. So you may have seen some years
ago where photos surfaced of prisoners being stripped naked and humiliated,

(20:01):
or another was of a man being terrorized by the
US guard dogs, or there's a photo of a man
standing on a small box and he's draped in a
sheet with his head covered, and he has two electrode
wires attached to his fingers, and he was told that
as soon as he loses his strength and falls off
the box, he's going to be electrocuted. Zimbardo's point is

(20:23):
it's not as easy as thinking about this as a
few bad apples in the system, which is how the
Army worked to portray this. The army said, we are
shocked at what happened in Abu Ghraib. There were obviously
some bad apples who behaved badly. Zimbardo's point is that
it's not really a few bad apples; it's a systemic problem.
It's a system that sets up particular situations like the

(20:46):
Stanford prison experiment. He wasn't making a defense of the
soldiers who behaved badly like this, torturing their prisoners, but
it's important to try to figure out how to build
or repair these systems so that it doesn't happen. And
shortly after that, eighteen pictures emerged of American soldiers who
are posing with dead members of the Taliban and making

(21:09):
it look like the bodies were doing things like putting
a hand on their shoulder. And so what I thought
was interesting is the headlines. The LA Times said, Photo
of US soldiers posing with Afghan corpses prompts condemnation, and
in the subtitle, American officials denounce the actions of troops
photographed with dead insurgents. But this condemnation is a little

(21:32):
hard to understand because the army tells you, look, we
want you to go over there, we want you to
kill these guys, to wreck their roads and burn their bridges,
but don't take any pictures with them, because that's disrespectful.
This condemnation is complicated because the soldier's behavior was part
of the system, part of the situational variables that were

(21:54):
set up. You get these young men and women to
have vim and vigor and go out to kill the enemy,
give them propaganda, you dehumanize the enemy, and then you say, hey,
we're outraged that you didn't treat this corpse respectfully. And
by the way, just as a side note, it turns
out that with modern communication channels, these photographs surfaced quickly

(22:15):
and everyone thought this was some awful new sign of
the times. But this is as old as war itself.
People have always posed with the dead bodies of their
enemies for as long as there have been cameras, and
before photography, they would do things like cut off people's
ears or take out teeth or stuff like that, and
make belts and necklaces out of them. So there's nothing

(22:37):
new going on here. Again, this is not a defense
of that behavior, but it is to say there's something
about the systemic variables in wartime that change the way
people make decisions in these situations. And I want to
be clear about one point, which is that when I'm
talking about these situational variables, this doesn't get rid

(22:58):
of individual responsibility, but it gives us the tools
to understand the variables that chronically cause situations like this.
People who behave badly in Abu Ghraib prison or any
place else, they still have to face punishment for many reasons,
because justice, it turns out, tries to accomplish many things

(23:19):
at the same time, such as slaking public bloodlust and
setting up examples for the next people. So the people
who commit bad acts still have to get punished. The
importance of studying all the variables that steer human behavior
is not to let people off the hook. It's to
prevent the next generation from ending up in the same

(23:40):
situation and performing as badly. Now. In this episode and
the last one, we talked about dehumanization and its neural underpinnings,
and in this episode we talked about the situational variables
which play a role in it. So the question is
what can we do about dehumanization As we come to

(24:01):
understand the neural basis and the psychological basis and the
contextual basis, how does that steer us? So I'm going
to propose three lessons that emerge for us. The first
is the unspeakably important role of education: education specifically about
propaganda and dehumanization and the social context that influences us.

(24:26):
It's critical for us to teach the Milgram experiments and
the Zimbardo experiments to our children and to ourselves, such
that this becomes part of our background knowledge. Everyone knows
and talks about these sorts of experiments, and they know
what to do about it, because, after all, in his
book Obedience to Authority, Milgram said, look, I'm going to

(24:47):
take what we've learned here and distill it down to
all the rules that people use when they want other
people to do their bidding. So he said, first of all,
if a person wants you to be obedient to them,
first they're going to prearrange some form of contractual obligation.
In this case, it was we're going to pay you

(25:08):
four bucks, and then you're going to participate in this experiment. Next,
they'll give you a meaningful role, like you're a teacher
or you're a guard, and those activate particular response scripts.
People feel like, oh, I know exactly what to do
with that. Then the person will present basic rules to

(25:28):
be followed, like when the student gets the wrong answer,
you move up to the next level of shock. And
these rules can be arbitrarily changed. Later, the person will
change the semantics of the act. Instead of calling it
hurting the victim, they'll call it helping the experimenter. The
person will allow for a diffusion of responsibility. So remember

(25:51):
in Milgram's setup, the experimenter said, don't worry, I'll be
responsible for anything that happens to him. Just keep going.
This way, when you're doing the act, you don't have
to take everything onto your own shoulders. The person will
start the path with small steps, like just give him
a little electrical shock. He'll barely feel it, and then
as things go on, they'll say just a little more,

(26:14):
just a little higher, until before you know it, you
are at the top of the scale, delivering dangerously high
levels to a person who might already be unconscious. It's
like the frog and the frying pan method. If the
heat gets turned up slowly, the frog doesn't jump out,
and humans are no different in this way. Finally, the

(26:35):
person will make exit costs high, but they'll allow verbal dissent.
In other words, they won't let you leave the experiment,
but they will allow you to express distress. They'll allow
you to complain because that'll make you feel better. A
complaining person will still go up to four hundred and
fifty volts, but they'll feel better about themselves if they say, oh,

(26:58):
I feel really uncomfortable, I don't want to do this,
but they still do it. And finally, the person will
offer larger goals, some ideology that your small contribution helps with.
In the case of the Milgram experiment, it was as
simple as we're trying to study the science of memory.
In the case of genocide, it's usually about pride or purity,

(27:21):
or economics or restoring dignity and opportunity or whatever story.
But the idea is that as you shock somebody or
perhaps shoot somebody, you are working in service of a
larger goal. So Milgram was able to distill all these
rules that we find whenever people blindly follow authority, and

(27:44):
you see these rules played out the same way across
place and time. And this is what we need to
teach our children, so they know the signs to look for,
so they know how not to get lured into the trap.
I mean, for God's sake, we teach all children how
to do long division by hand, and we teach them

(28:06):
how to play soccer and how to watercolor. But why
isn't it mandatory that we teach them lessons like this,
like how to know when they are getting manipulated, how
easy it is to get manipulated, how to develop immunity
against manipulation by simply knowing the signs to look for.

(28:28):
That would be an education worth having. Okay, So the
first thing we need is meaningful, universal education about these issues.

(28:53):
The second thing we need is social modeling. So in
the last episode, I talked about syndrome E, which is
where your neural circuitry for caring about other people gets
turned down or turned off, and people act like psychopaths,
performing actions like murdering mothers and their babies on camera,

(29:13):
things that would normally not even be thinkable or conscionable.
And I spoke about it as though everyone in wartime
can catch syndrome E, or that everyone in Milgram's experiments
showed inappropriate obedience to authority, But in fact there are
always heroes who stand up against authority. In Nazi Germany,

(29:35):
for example, there was a group of students known as
the White Rose. They put all their efforts into making
and disseminating flyers and pamphlets against the actions of the
Third Reich. Now tragically, they were eventually captured and rounded up,
and they were all executed by the Nazis. But this
is the kind of thing for us to teach our

(29:58):
children about and keep their names alive, celebrating heroes who
stand against authority when they see something going horribly wrong.
And I'm not talking about just being a pain in
the neck to authority, because that's trivial and not always useful.
I'm talking about seeing something that's actually really wrong. And
even though it appears that all the adults know what

(30:20):
they're doing and have good reasons and they're only asking
you to do something small and they'll take the responsibility
and so on, think about whether it's the kind of
action you would want to take if you thought about
it from your own first principles, would you feel that
it's conscionable to murder your neighbors and take their stuff.

(30:40):
If the answer is no, then it should remain no,
even if the world gets a little nutty. And this
is where social modeling helps. We learn about heroes who
stuck with their conscience. So if you know these stories,
then the next time you find yourself in some situation,
at least you've got a template that you can think about following.

(31:03):
So number two is about teaching our children and ourselves
about those who stand strongly against things that are asked
of them in a time of war and madness. The
third defense against dehumanization is clever social structuring. And I
talked about this a few episodes ago about the Iroquois
Native Americans who lived up around what's now upstate New York.

(31:29):
And they're known as the League of Peace and Power.
But they weren't always known as that, and certainly not
four hundred years ago. There used to be six tribes
who were always fighting with one another, real bloody battles.
But in the sixteen hundreds they were brought together by

(31:49):
a man who came to be known as the Great Peacemaker.
He combined them into one nation. By the way, combining
people is not enough. It turns out that if you
simply push people together, that can fall back apart easily.
He did something more clever. He structured clans such that

(32:13):
each tribe member ended up belonging to one of nine clans.
So I might be a member of the Seneca tribe,
but I'm a member of the Wolf clan, and you're
a member of the Mohawk tribe, but you're also a
member of the Wolf clan. And the key is that
the memberships to tribes and clans cross-cut. And

(32:36):
so how is the Seneca tribe going to fight against
the Mohawk tribe when I'm a wolf and you're a wolf.
And by the way, my Seneca friend is in the
Hawk clan and your Mohawk friend is in the Hawk
clan too, So when we all consider waging war, we think,
I don't know, I got friends over there, I've got
fellow clansmen in that tribe. So by cleverly structuring things

(32:58):
in a society, by cultivating cross cutting ties, you can
tamp down people's natural vigor to make easy outgroups. You
can complexify their allegiances. I think it's likely naive for
us to think about obtaining world peace by just getting

(33:19):
everyone to get along, because we're very hardwired for in
groups and outgroups. But we can structure things carefully like
the Iroquois chief did, so that things have counter balance,
so that it's not so easy for people to raise
arms against one another. So that's our third tool, is

(33:39):
social structuring to create or enhance cross cutting allegiances. So
let's wrap up in this episode and the last one,
I've told you about the way that the human brain
is so social and comes to understand other people as people,
but it's also really easy for brains to form in

(34:00):
groups and outgroups, and how the circuits in your brain
that understand other people can come to see them more
like objects. They're no longer human, they are dehumanized. And
once someone has become an object, it becomes much easier,
sometimes trivial, to do what you need to make them

(34:21):
not be a problem to you anymore, and pushing people
into the outgroup is not hard to do. We saw
last week the tools of the trade, of propaganda and
other techniques of dehumanization, and in this week's episode we
saw how the situation you're in can influence decision making
as well. All it takes sometimes is an authority figure

(34:43):
telling you there's a contract and you have to do this,
and this is for a greater purpose, and don't worry,
the responsibility will be diffused off of you. Just press
the button, or in the case of Zimbardo's experiment, systems
have an inherent structure such that guards and prisoners have
their own implicit scripts, and it's very easy to find

(35:05):
that you know those scripts, and in that situation you
play out those scripts. All of these situations make it
much easier to take dehumanization on board, to treat another
person not like someone just like you, but instead like
an object. And although we're a massively social species capable
of such empathy, it's just not difficult to set up

(35:28):
situations where we're capable of such violence. And I suggested
three ways that we might take our knowledge of this
and reorganize ourselves. The first is education of our community
about the tricks of the trade, of propaganda and obedience
to authority, because the truth is, once you know this stuff,

(35:49):
it becomes so obvious what people are trying to accomplish,
and you have a meaningful immunity to it. But without
education on it, the youth of each new generation
is at risk. So let's get this information into schools
and communities. The second way is social modeling, that is,

(36:09):
looking at people who stood up before us, like the
people in Milgram's experiments who said, sorry, I'm not going
to deliver that next shock, and the guy says, but
you have to. That's the experiment. I will take responsibility
and you say, no, I'm not going to do it.
It takes courage to be that kind of person, and
we'd all be better off if we saw lots of

(36:31):
examples of that kind of behavior. Then it wouldn't seem
so foreign to us, and we would find it easier
to discover that courage when we need it. And the
third thing is figuring out ways to surface the ties
between people that perhaps they weren't aware of before, to
establish ties that bind across all the typical boundaries. And

(36:55):
I'll talk in a different episode about how we might
do this by leveraging the power of social media. Social
media is not going away, so let's see if we
can leverage it for unity instead of division. So those
are three strategies we can take. And the reason this
matters is because we have evolved for eusociality. We're

(37:16):
not independent contributors. We have succeeded as a species because
we behave as a super organism as a group. And
the reason I think it's so absolutely critical to study
all this is because this is what is going to
define our future. I mean, we pour billions of dollars
into working to understand the science of Alzheimer's and cancer

(37:40):
and diabetes, as we should, but these issues of dehumanization
affect our species in an even deeper way, and there's
comparatively little research about this. The important thing for our
future is understanding why and how people can behave so
badly towards one another. This may be the single most

(38:04):
important question in terms of our legislation, the education of
our children, and the future of our species. Go to
Eagleman dot com slash podcast for more information and to
find further reading. Send me an email at podcasts at

(38:27):
eagleman dot com with questions or discussion, and I'll be
making an episode soon in which I address those. Until
next time, I'm David Eagleman, and this is Inner Cosmos.