Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Humans disagree on many political fronts, but we're finding ourselves globally in a period of higher polarization, and the question is: is there anything to be done about that? Presumably we're not going to get humans to not have conflict, but is there any such thing as better conflict? What
(00:25):
would that mean? And what does any of that have to do with brains, or with the fact that modern humans only spread out across the globe sixty thousand years ago, or with social media recommender algorithms, or the Iroquois Native Americans?
Welcome to Inner Cosmos with me David Eagleman. I'm a
(00:46):
neuroscientist and an author at Stanford and in these episodes,
I examine the intersection of our brains and our lives.
And today's episode is about something that, you know, I'm obsessed with from the point of view of brain science, which is issues of conflict and empathy and in-groups and out-groups.
(01:06):
We find ourselves in a time of conflict. Now, the things on everyone's mind are the war in Ukraine and
the war between Israel and Hamas. And even though those
have sucked up most of the media's attention, it's critical
to note that we have lots of other conflicts also
going on around the globe right now. There's ongoing internal
(01:30):
conflict in Myanmar. There's an Islamist insurgency in the Maghreb
region of Africa. There's the Boko Haram insurgency in Nigeria.
There's a civil war between rival factions in Sudan. There's
a multi sided conflict of the Syrian Civil War. There's
the ongoing civil wars in Somalia and Ethiopia, and Afghanistan
(01:53):
has been in near continuous armed conflict for essentially as
long as I've been alive. Now, if you've been listening
to this podcast for a while, you know that I'm
very interested in why conflict happens between humans and why
it happens so readily and consistently. So in this light,
(02:13):
there are three main things to note from the neuroscience
point of view. The first thing is that we all
have the same brains on the inside. So despite cultural
differences and geographic differences, your pink brain looks exactly the same as anyone else's, anywhere around the globe.
(02:35):
And I don't mean that as a feel good statement.
That's just a biological fact. Anatomically modern humans radiated out
of Africa only about sixty thousand years ago, and some
turned left and became European, and others turned right and
became Asian, and those who stayed remained African. But the
timescale that we're talking about is some tens of thousands of years.
(02:57):
That is simply not enough time on the evolutionary timescale for brains to change their operation or their algorithms in any meaningful way. So we're all running the
same version of the biological operating system, despite superficial differences
in size and shape and culture and the costumes that
(03:19):
we wear and how much melanin we have in our
skin and so on. We can end up seeming pretty
different as a result of these surface properties, but we
have the same cognitive and emotional engine on the inside. Now, unfortunately,
one of the things that operating system does is it
(03:40):
gives us all a very strong drive toward having in
groups and outgroups. This tribalism is something that characterizes our species.
We belong to groups based.
Speaker 2 (03:53):
On our neighborhood, or the way we look, or our religion, or our family, or whatever it is, and we tend to feel closer to our in-groups and treat them better and listen to their opinions more strongly.
Speaker 1 (04:07):
Our outgroups not so much. When it comes to someone
from the other side of the railroad tracks, or who
happens to have grown up under a different deity, or
who sings different songs or uses different language sounds to
transmit information, or whatever it is, we tend to treat
them with more suspicion. And in decades of neuroscience experiments,
(04:31):
including those from my lab, we've been able to show
that your brain simply does not care as much about
people in your outgroup. At the extreme, your brain will
view somebody in an outgroup the same way it will
view an object, not like a fellow human the way
it does with people in your in group, but instead
(04:52):
as something that's not particularly worthy of empathy, something that
you have to deal with in the same way that you would deal with a virus, or a cockroach, or a rat. And if you're interested in more detail
on this point, I dove deep into examples of this
in episode twenty and one of the things I talked
about was an experiment that we did in my lab
(05:14):
that got at this in group outgroup difference, where all
we needed were single word labels. So I'll just tell
you the short version here of the experiment. We put
you in brain imaging in fMRI, and you're looking at
six hands on the screen and one of them gets
stabbed by a syringe needle. Now, seeing that activates a
(05:38):
network in your brain that we summarize as the pain matrix.
Your brain is showing a big spike, and what that
represents is you are feeling the other person's pain even
though it's not your hand. The areas in your brain that care about pain are coming online. This is the basis of empathy, of feeling someone else's pain. But
(06:01):
now we label the six hands with one-word labels: Christian, Jewish, Muslim, Hindu, Scientologist, atheist.
And now we measure people of all religions in the
scanner as these different hands get stabbed. And what we
found is that everyone has a larger brain response when
(06:21):
it's their own group that's getting stabbed, and their brain
shows a smaller response when it's any of the other
five out groups. And this was true across religions and
even for the atheist group, which cared more when the
atheist hand gets stabbed than the other hands. So this
is not an indictment of religion, but more broadly, it
(06:43):
highlights your brain's exquisite skill at distinguishing your in-group from your out-groups, and the fact that your brain simply does not spend as much energy on simulating what it is like to be someone who is in your out-group. If you want more details on that, please listen to episode twenty,
(07:04):
which is called Why Does Your Brain Care About Some People More Than Others? And beyond the fact that we
are naturally tribalistic, there's a third point to note, which
is that every one of us feels like we know
the truth, and it's not clear to us why other
people don't see the truth as clearly as we do.
(07:24):
They must be stupid or ill informed, or trolls or
stubborn or who knows why they don't simply admit to
what is so obviously the truth. And deep down most
people have an intuition that if you could just simply
shout loudly enough in all caps on social media, that
(07:46):
everyone would agree with you, everyone would see the light.
And this is true of everything, not just political positions.
Take religion. There are two thousand active religions on this
small planet, and many people feel that if they could
just share with all the infidels why their religion is
right and everyone else is wrong, that everyone would agree
(08:09):
with them. Everyone would come to know that the religion
you happened to grow up in is the correct one. Now,
if there actually were one correct religion, we might expect
it to spread everywhere equally, But obviously that's not the case.
Religions remain clustered around where they started, and you're not
going to see a blossoming of Islam in Boise, Idaho,
(08:32):
just like you won't expect to see a blossoming of
Protestantism in Mecca. But here's the interesting part. Whether we're
examining one's political preferences or religious affiliation or whatever, people are generally reluctant to change, or even to consider the perspectives of other people. So, given the deep wiring that we have
(08:55):
for having our in groups and for feeling we know
the right answer because of our very limited internal models, this leads to the critical question of whether there is any way for eight billion brains to find peaceful coexistence. Now,
if you've been a regular listener to the podcast, you
know that I'm generally highly optimistic about everything, but I
(09:18):
would suggest the answer to the question of whether we
will achieve peace is probably not. We're probably not going
to find ourselves ever in some magical moment where all
the newspapers say, Wow, nobody is fighting, we're taking a
day off today, and we're not going to find some
extraordinary presidential candidate where everyone says, yeah, that person's great.
(09:41):
I think we can all agree presidential elections are always
fairly close to fifty-fifty. Why? This results from the enormous variety of internal models inside everyone's heads. There is no perfect candidate, because no one can fit all the constraints of all the different models at once. Okay,
(10:02):
so if we take this all as our starting point
for today, the question is what does this all mean
for the future of conflict? Are we facing a future
where wars and polarization will stay as they are or
even grow worse. Well, that's certainly a reasonable fear. But
a little while ago I discovered there was a movement
(10:24):
of political scholars around the idea of how to have
better conflict. Now, what does better conflict mean? We'll get
into that in a second, but first I just want
to say I've been moved and inspired by this movement.
It feels like it's the opposite of what we see
online every day, where people pick a side and scream and
(10:47):
yell for it or fight and die for it at
the cost of not meaningfully considering the complexity of human situations.
Just take what's happening in the Middle East. Typically, if
someone points out that the history there is extraordinarily multi
layered and complex, people will often say, no, this is
(11:10):
actually quite simple, and then they will give their version
of the history. But the extraordinary part is that people
on both sides will deliver that same line. They will say it's actually very simple. And to me, that's a barometer
that tells me the situation is not so simple. But instead,
like most human conflict, it's full of different paths that
(11:30):
you can take through the history to come to a
variety of conclusions. And we witness this sort of complexity
whenever you have millions of people of all personalities involved
in conflict on all sides. You have sinners and saints,
and you have psychopaths and peace seekers, and no one
(11:52):
can hope to derive a meaningful solution without doing the
hard work of digging in and trying to understand what
happens between groups of millions of humans, rather than imagining
there's a good guy side and a bad guy side. So,
as I said, I was very moved to discover that
(12:12):
some people really take on this sort of political intellectual
humility and try to figure out what do we do
with humans? What do we do about the fact that
they're never going to agree, There's always going to be conflict.
How do we make conflict more productive rather than simply
being about bloodshed? So to explore this, I called up
(12:36):
my colleague Jonathan Stray. Jonathan is a former journalist who
is now a senior scientist at the Berkeley Center for
Human Compatible AI, where he works on the algorithms that feed content to us online and how their operation affects
things like polarization. And he writes a terrific newsletter called
(12:57):
The Better Conflict Bulletin. So here's Jonathan. So you
are a conflict researcher. Tell us about what that means.
Speaker 3 (13:09):
Well, I try to study what causes people to disagree
and then how that disagreement transforms into polarization and eventually
violence if unchecked, And how to build computer controlled media
systems like social media that will prevent that.
Speaker 1 (13:26):
I know that you think about conflict in terms of
good and bad conflict. So what does that mean?
Speaker 3 (13:33):
Well, it's not that we want no conflict in society. Conflict, first of all, is inevitable. We all disagree, we want different things, unavoidably. But sometimes when we disagree, it comes to something productive, you know, we talk about it, we figure something out, and sometimes it doesn't, right? Sometimes
it escalates to some sort of zero sum game or
(13:54):
even violence. The idea is, when we think about conflict,
we're not trying to prevent people from fighting. We're trying
to have them fight in some sort of productive way.
Speaker 1 (14:03):
So what does that look like? What does that mean?
Speaker 3 (14:06):
Well, the simplest example might be violent versus nonviolent, right,
so we can all understand the difference between you know,
people trying to get their way by using force or
getting their way by discussion, yes, but also non violent
conflict tactics, you know, protests. You know, there are famous historical figures like Dr. Martin Luther King or Gandhi who really
(14:29):
developed these techniques and showed that you could achieve you know,
large political victories without using physical force. So that's maybe the most obvious way, but there's a bunch of
other stuff when you start thinking about this.
Speaker 4 (14:43):
So maybe you think about this at a personal level.
Speaker 3 (14:45):
You know, we've all had the experience of an argument
where you know, we said some ugly things that can't
be unsaid, and you feel worse at the end of it.
And then I think most of us have had the
experience of an argument where, you know, there were some hard truths, but it was honest, it was caring, it was empathetic, and even if we couldn't solve the problem
at the end of it, we still have a positive
regard for each other.
Speaker 4 (15:06):
And so that would be an example of good versus
bad conflict.
Speaker 1 (15:09):
So one of the things that has always intrigued me
is that we all have this illusion that we know
the truth, that we have a complete story of what's
going on, and therefore other people that disagree with us
seem like they're trolls, or they're uninformed, or they're disingenuous.
And so the question is, how do we address that
(15:30):
illusion that we have. How do we get ourselves a
little bit outside our own fish bowls to see the
other side.
Speaker 3 (15:37):
Yeah, that's a great question. There's a bunch of ways of answering that. I was at a conference yesterday on the topic of intellectual humility, the
idea that actually we don't know the answer, there's always
something we're missing. And this is important not just as
an epistemological practice, that is, to be more correct in
what we believe, but as a relational practice, meaning it
(15:59):
changes how we relate, and how we can relate, to other people. And I think part of this is
we always have a partial picture, and we tend to,
in conflict situations, believe that the other person actually holds
more extreme views than they do. So there's really good
evidence of that. This is the case in the American
Culture War, where if you ask one side what the
(16:21):
other side thinks, they will imagine the other side is
actually more extreme than they are. So conflict in particular
is prone to these misperceptions of the other.
Speaker 1 (16:32):
So give us an example of how we know that each side thinks the other side is more extreme.
Speaker 4 (16:38):
Yeah. Yeah, there's tons of work on this.
Speaker 3 (16:39):
So there's a group called the Perception Gap, which has done a bunch of research on this. And so, for example,
if you ask Republicans how many Democrats would support completely
open borders, they'll say something like seventy five percent. If
you ask Democrats how many of them would support completely
open borders, the actual number is much lower, so they're twenty or twenty
(17:03):
five percentage points off. And this pattern is consistent across
a range of issues, and it's bidirectional as well. So
you know, if you ask Democrats something like, you know, how many Republicans would say that everyone should have access to as many guns as they want, Democrats will guess that number a lot higher than what Republicans actually say.
Speaker 1 (17:24):
So, speaking of Democrats and Republicans, here's the question. Are
we more polarized currently in America? Is that just an
impression or is that backed up by statistics?
Speaker 3 (17:36):
No, unfortunately, it's real, and you can measure it a
lot of different ways. You can look at where people
are on political issues, you can look at how people
feel about each other. You can look at congressional voting patterns,
whether you know members of Congress will cross the aisle
to vote on each other's builts, and all of these
measures show the same basic pattern, which is that polarization
(17:57):
has been increasing since the mid-seventies or maybe
the early eighties and is now at historically high levels.
And it's not just America, it's several other countries in
the world as well.
Speaker 1 (18:08):
Right. Now, I know you and I have independently made arguments about why this is not entirely about social media, and of course one good thing to point to is the fact that this started in the seventies. But tell us your sense of the involvement and the
non involvement of social media in this polarization.
Speaker 4 (18:27):
Yeah, well, there's definitely some involvement.
Speaker 3 (18:30):
There are a number of ways that social media, in
particular social media algorithms, that is to say, the systems
that decide what we see.
Speaker 4 (18:40):
And in what order, can have some involvement.
Speaker 3 (18:43):
So probably a lot of your listeners have heard of
the idea of the filter bubble, the idea that, you know,
we're each trapped in this sort of algorithmically produced reality
where we don't see the other side. The evidence tends
to be against that. We actually do see quite a
lot of cross cutting content, you know, even if it's
you know, rage clicking on an article that has a
different viewpoint, but there's a bunch of other things going on. So,
(19:06):
for example, most social media and other systems which select
content for us, like news recommenders and so forth, rank
things based on, basically, the number of clicks they get: engagement.
And we really do pay more attention to things that
are valuable to us, but also we pay more attention
to things that are threatening, offensive, fearful. So this approach
(19:31):
of ranking things by how much attention we give them
also tends to amplify more extreme or scarier items.
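To make that dynamic concrete, here is a minimal, purely illustrative simulation of engagement-based ranking: posts that draw more clicks get shown more, and if more extreme posts draw somewhat more clicks, the top of the feed drifts toward them. Every name and number here is an assumption for illustration, not a description of any real platform's code.

```python
import random

random.seed(42)

# Toy catalog: each post has an intrinsic "extremity" score in [0, 1].
posts = [{"extremity": random.random(), "clicks": 0} for _ in range(200)]

def click_prob(post):
    # Assumed attention model, echoing the conversation: threatening or
    # more extreme content draws somewhat more clicks than neutral content.
    return 0.1 + 0.4 * post["extremity"]

for _ in range(20_000):
    # Engagement ranking modeled as a sampling weight: a post's chance of
    # being shown grows with the clicks it has already accumulated.
    weights = [1 + p["clicks"] for p in posts]
    post = random.choices(posts, weights=weights, k=1)[0]
    if random.random() < click_prob(post):
        post["clicks"] += 1

top = sorted(posts, key=lambda p: p["clicks"], reverse=True)[:10]
avg_top = sum(p["extremity"] for p in top) / len(top)
avg_all = sum(p["extremity"] for p in posts) / len(posts)
print(f"mean extremity: top of feed {avg_top:.2f} vs whole catalog {avg_all:.2f}")
```

Even this tiny feedback loop pushes the mean extremity of the top-ranked posts above that of the catalog as a whole, which is the amplification effect being described.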
Speaker 1 (19:57):
Now, I did an episode a few episodes ago where I mentioned the Iroquois Native Americans, up in essentially Wisconsin and Canada. Some hundreds of years ago, they
were having bloody battles between these different tribes all the time,
and they had a new leader come in who came
to be known as the Great Peacemaker because one of
(20:19):
the things he did is assigned everybody in the tribes
to different clans. So you might be a member of
the Turtle clan and I'm a member of the Heron clan,
even though we're members of the same tribe. And it
turns out these allegiances were cross cutting, so that each
person had their allegiance to their tribe and also to
their clan. And these weren't equivalent, and that ended up
(20:41):
bringing peace because now it wasn't so simple to have
a clear in group and out group because they were mixed.
And I know that you in your work with social
media have been looking at something similar to this, which
is this issue of how an algorithm should rank posts.
So tell us about that.
Speaker 3 (21:02):
Yeah, so that is a great example. I've heard of this Iroquois example as well. And what is happening is
conflict is tied up with identity. We fight basically against
people who are different than us. And in the last
few decades in this country, identity has become more and
more collapsed into a single dimension of you know, left
(21:24):
versus right, red versus blue, Republicans versus Democrats.
Speaker 4 (21:28):
It didn't actually used to be this way.
Speaker 3 (21:30):
People didn't identify with their political parties in the same
way a generation ago. And also there was much more
mixed identification at the national versus local level. So you know,
you would have some town where, you know, you had
a Republican mayor, but you had, you know, progressive politics
because that's what the people there believed.
Speaker 4 (21:52):
In and wanted.
Speaker 3 (21:53):
Politics has become very nationalized. There's less local news, there's
less split ticket voting.
Speaker 4 (22:00):
Every issue takes place.
Speaker 3 (22:02):
At this huge scale, where there are only two ends of the spectrum.
So how you would have to reverse this is you
would have to have some sort of sense of not
necessarily local, but community level values.
Speaker 4 (22:17):
Right, it would have to be.
Speaker 3 (22:18):
Okay for, you know, specific groups, online or perhaps in discussion, to have opinions that just don't quite fall along this left-right axis.
Speaker 1 (22:29):
And what's the reason that we have seen more sorting into these groups? What's the reason that particular issues go together that don't necessarily belong together?
Speaker 3 (22:40):
Well, again, it's an identity thing. One of the things that makes all of these issues sort of one big blob, where you only have two choices, is mere exposure to people who are far away from us, either physically or socially. So, you know, maybe you only talk
to your neighbors in your town, or you only read
(23:01):
the local newspaper, and so you could have this sort
of quixotic local politics in a way that was healthy.
But now we are exposed to people all across the
country and indeed the world on a daily basis. So
now we're comparing our point of view to someone we will never meet, who's far away from us.
And there are simulations that show you don't need any antagonism
(23:25):
for people to sort themselves into two big tribes.
Speaker 4 (23:27):
All you need is when you talk.
Speaker 3 (23:29):
To someone and they're, you know, near enough like you, you get a little bit closer to them, right, which is a natural human thing. When two people talk, they tend to, you know, converge a little on their views and identity. But because those interactions are now happening all across the country, we're coalescing into sort of two big groups.
Speaker 4 (23:51):
It's kind of a paradox, right.
Speaker 3 (23:52):
We want to be able to talk to people who
are very far away from us, you know, that's the
promise of these online networks, and.
Speaker 4 (23:58):
Yet it seems to make our conflicts global.
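The kind of simulation mentioned here, where mild attraction between like-minded people is the only rule, can be sketched as a bounded-confidence opinion model. This is a generic illustrative sketch; the agent count, threshold, and step size are assumptions, not values from any specific study.

```python
import random

random.seed(1)

# 400 agents with opinions spread uniformly across [0, 1].
opinions = [random.random() for _ in range(400)]

CONFIDENCE = 0.2  # assumed: people only engage with views this close to theirs
PULL = 0.3        # assumed: how far each conversation moves both parties

for _ in range(200_000):
    i = random.randrange(len(opinions))
    j = random.randrange(len(opinions))
    a, b = opinions[i], opinions[j]
    # The only rule: when two people talk and are "near enough" alike,
    # each moves a little toward the other. No antagonism, no repulsion.
    if abs(a - b) < CONFIDENCE:
        opinions[i] = a + PULL * (b - a)
        opinions[j] = b + PULL * (a - b)

# Count the opinion clusters that survive: sort, then count gaps > 0.1.
final = sorted(opinions)
clusters = 1 + sum(1 for x, y in zip(final, final[1:]) if y - x > 0.1)
print(f"uniform spread collapsed into {clusters} opinion cluster(s)")
```

Run it with different seeds and the uniform spread reliably collapses into a small number of camps, with nothing in the rule but mild convergence between similar people.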
Speaker 1 (24:01):
And there are network science models on this right that
demonstrate how groups end up dividing like this. And what
is the evidence that these network science models actually cash
out in real life?
Speaker 3 (24:14):
Yeah, So these models developed actually from models built in
the nineteen sixties to study physical racial segregation. And what
they showed is that you don't need racism in the
sense of not wanting to live near people who are
a different race than you. All you need is a
slight preference, you know, maybe a few percent to live
(24:34):
near people who are more like you, and that is
enough for entire neighborhoods to eventually segregate. And so those
same models are now being applied to online networks and
they show very similar results. People, if they have a
slight preference for people who are maybe a little more
like them in their politics, will spend.
Speaker 4 (24:51):
More time, you know, in a bunch of ways, right,
hanging out.
Speaker 3 (24:55):
In the same groups, or maybe, you know, friending them or following them, that sort of thing. And so we see this sort of global splitting.
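The nineteen-sixties models referenced here are in the spirit of Schelling's segregation model. Below is a minimal one-dimensional sketch, with all parameters chosen purely for illustration: agents are content with an evenly mixed neighborhood, yet clustering still emerges.

```python
import random

random.seed(7)

# A ring of 100 residents of two types, initially well mixed.
agents = [random.choice("AB") for _ in range(100)]

def happy_at(pos, kind):
    # Mild preference (Schelling's point): an agent is content if just
    # 2 of its 4 nearest neighbors are its own type. No hostility needed.
    n = len(agents)
    neighbors = [agents[(pos + d) % n] for d in (-2, -1, 1, 2)]
    return neighbors.count(kind) >= 2

def same_type_fraction():
    # Fraction of adjacent pairs that match (about 0.5 when well mixed).
    n = len(agents)
    return sum(agents[i] == agents[(i + 1) % n] for i in range(n)) / n

before = same_type_fraction()
for _ in range(20_000):
    i = random.randrange(len(agents))
    if not happy_at(i, agents[i]):
        # An unhappy agent relocates, swapping with a random spot where
        # it would be content (a stand-in for moving to a vacancy).
        j = random.randrange(len(agents))
        if happy_at(j, agents[i]):
            agents[i], agents[j] = agents[j], agents[i]
after = same_type_fraction()
print(f"same-type neighbor fraction: {before:.2f} -> {after:.2f}")
```

The same-type neighbor fraction climbs well above its mixed starting point, which is the "slight preference is enough" result the conversation describes, here applied spatially rather than to online networks.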
Speaker 1 (25:04):
Right. So before we drop this topic, I just want to return to this issue of what social media does not have to do with the current polarization. And it certainly seems like one of the issues is this, you know, network segregation. What else do you see where you think
that social media is maybe not the sole culprit as
opposed to human behavior.
Speaker 4 (25:25):
Yeah.
Speaker 3 (25:25):
Well, one of the things I like to say is that,
you know, if Facebook could fix our democracy by changing
one hundred lines of code, they would have done it already, right,
if only our problem were that simple. But polarization is one of these problems that requires an all-of-society approach.
So it's, you know, social media, yes, but it's also the media. Journalists are going to have to learn to
(25:48):
cover politics in different ways, and it's the politicians themselves.
There is always an advantage to taking a view that
is more extreme because it motivates people, and so it's
kind of a devil's bargain, right. You can get people
to turn out and show up for your cause by
pressing on a divisive issue, but doing that divides society further.
(26:11):
So we actually need a whole bunch of sectors of
society to move in the same direction. And there's a pair of conflict scholars called Guy and Heidi Burgess, and they have a wonderful article on forty different roles that people have to play to create a healthy and peaceful democracy. And
(26:31):
what do they say, Well, some of them are exactly
what you'd expect, right, So we already talked about journalists
and so forth. But they also talk about issue analysts, people who have to go deep on particular issues.
They talk about healers, people who present a vision where
we have emotionally healthy relationships with each other. They talk
about they call them democracy firsters, people who are concerned
(26:54):
first and foremost with the correct functioning of elections and
rule of law. Once you start thinking this way, you
realize there's a lot of different roles to play, and
a lot of people have to be pulling in the
same direction. And they say that what we need is a generation of people interested in conflict, in much the same way that we now have a generation of people who are working on climate change.
Speaker 1 (27:16):
Specifically, who are interested in doing conflict right. Exactly, because we certainly have a generation interested in conflict. What do you see when you look around at college campuses, Jonathan? You're located at Berkeley, and I know that you are
a very wise person who keeps a foot in both
camps and tries to see things from all sides. That's
(27:38):
not the reputation that Berkeley has in particular. So when
you look around at the campus, what do you see?
And is anything different about this generation than previous ones?
Speaker 4 (27:48):
Yeah?
Speaker 3 (27:48):
I mean, you know, that's a loaded question, right, the kids today and so forth.
Speaker 4 (27:51):
But yeah, I mean, of course Berkeley.
Speaker 3 (27:52):
Is a liberal campus, right, you know, it's a public
university in one of the most liberal places in America.
Speaker 4 (27:58):
So it has its own politics.
Speaker 3 (28:01):
I do see, and of course it's not just Berkeley, that the current generation of students is, I would say, more involved in politics, which I would say is a good thing, but also less open, or less compromising, which
troubles me in particular when it leads to the suppression
(28:21):
of alternative views. Now, I don't want to say that,
you know, anybody should be able to say anything without
any consequences. I think there are real issues here, for example, inclusion.
You don't want to make people feel like they don't belong.
Speaker 4 (28:36):
At an academic institution.
Speaker 3 (28:38):
However, where it sort of crosses a line for me
is where saying something a little unorthodox or a little
challenging becomes impossible. People are scared to do it. And
I'm not talking about, you know, the haters here. I'm
talking about people who are trying to engage in good
faith and maybe saying something that is, you know, a
(28:58):
little controversial, and there's just less.
Speaker 4 (29:01):
Tolerance for that than there used to be.
Speaker 3 (29:04):
And I see this both just anecdotally, you know, talking to the faculty, but also there's a bunch of data on this. Ironically, Berkeley, which is associated with free
speech very closely (right, we have the Free Speech Cafe on campus, and this was the heart of the free speech movement in the nineteen sixties), is now a place
where people are often criticizing free speech as being too permissive,
(29:29):
and I find that very troubling.
Speaker 1 (29:31):
So how do you think about dealing with that as
you think about training the next generation, for example, with
all these different roles that we might need to have
better conflict, how do you do that?
Speaker 3 (29:42):
Yeah, well, you know, I tried to do this in my classes yesterday. I mean, I think it requires two things, right. One is you have to create a sense of psychological safety in some way. Right, in my classes, you know, we discuss algorithm design, we discuss racially biased algorithms, we discuss all this stuff.
And one of the things I say at the top
(30:03):
of those classes is like, okay, so you know, we're
going to talk about some issues which are very charged.
I understand that some of you are going to have,
you know, real upsetting personal experience with this stuff. But
you know, I ask that you give it a shot,
and that you know, you try to engage in a
spirit of curiosity. And we're not always going to get
(30:26):
things right, but I don't want anyone to ever accuse us of not being thoughtful or careful or empathetic.
And I think that's the spirit you have to approach this with, this spirit of curiosity. So, for example, why do
we disagree about this? What is it about, you know, this student's experience and that student's experience that leads to such dramatically different conclusions on, say, affirmative action, or
(30:51):
you know, media bias, or how we should deal with crime, or what we should do about immigration? That came from somewhere. And very often what we find when
we have the conversation that way is either there's some
intense personal experience, family background. You know, my mother came
from El Salvador, my father was deported, you know, I
(31:12):
had to grow up in economic uncertainty, you know, something
like this. Or we find that people shaped their ideas
from their social network and never really thought about it.
And this is part of what polarization is, is that
all of our political ideas end up sort of collapsing onto this left-right axis. There's no logical reason why
(31:35):
if you are pro choice you should also be concerned
about climate change and yet here we are. So there's
some sort of social process that makes everybody split into camps in this way, and we can counteract that by
thinking carefully for ourselves and having discussions with people who
might have a different view.
Speaker 1 (31:56):
So you address this in your class, but how do
you think about taking this to a larger world? And before
we get into social media algorithms, which I want to do,
you know, here's one of the things that struck me. Let's
take the Israel-Hamas conflict going on right now. Whenever it comes up, okay, which side launched that rocket, it turns out that no amount of evidence sways anybody
(32:19):
on that. People have their side and they generally stick
with it, as far as I can tell. This is just, you know, a view from surfing a lot of social media on this stuff. The question is how do
you expand beyond your class to get people to ask
these questions about changing their point of view, not even
necessarily changing it, but just being willing to examine other
(32:40):
pieces of evidence.
Speaker 3 (32:41):
First, let's talk about facts. I believe that facts matter.
I believe that they're deeply important. I used to be
an investigative journalist. We can't know everything, but we can
know some things. The problem is what facts mean depends
on who you ask. So in the you know Israel
Palestine conflict, the fact that Palestinians were living there before
nineteen forty seven is a fact that matters deeply to
(33:04):
a lot of people. The fact that Jews were living
there three thousand years ago is a fact that matters
deeply to a lot of people. So these facts are
not in dispute. What is in dispute is the meaning
of these facts, which points to the problem being fundamentally
relational in many cases rather than factual, and people can
be misinformed. I'm not disputing that, but often what you
(33:25):
find is it is about how people feel about each
other and about how they are relating to each other.
Speaker 4 (33:30):
Now, so you ask what we can do.
Speaker 3 (33:33):
So this is probably the moment to mention that I
write a newsletter called the Better Conflict Bulletin, and we
are news and analysis for.
Speaker 4 (33:40):
A better culture war.
Speaker 3 (33:41):
And the idea here is that many people have this
gut sense that we're fighting ugly, and we try to
explore in this newsletter what would it be to fight better?
So we cover conflict research, the science of how people
misunderstand and misperceive each other, and we cover people who
(34:05):
are successfully navigating the culture in a more productive way.
So there are people exploring these topics, there's not a
lot of coverage for it because, as they say in
the peace building field, sometimes peace has no natural constituency.
It's very easy to get people excited about winning. It's
harder to get people excited about living together harmoniously.
Speaker 1 (34:27):
That's right. That's why when I met I just met
you very recently, Jonathan, just a few weeks ago, and
I was so excited by the kind of work you're doing.
Because I come from a neuroscience angle. I study a
lot about in groups and outgroups and empathy and how
we so easily relegate others to a different group. I
was so excited to learn that you and others are
(34:47):
doing the boots on the ground work of trying to
bring groups of people together. And so I still want
to drill in on this point though. So you've got
the newsletter, which, by the way, I'm a paid subscriber to.
What are the things that you think about though, besides
social media, which we'll get to in a second, And
that's obviously a big leverage point. But are there any
(35:09):
other things you can do when you think about how
do we get people to fight less ugly.
Speaker 3 (35:14):
Yeah, so there's a bunch of sort of immediate things
that you can do. So, first of all, there's this
perception gap, so misperceptions, so we tend to be both
misinformed about what the other side actually believes and we
tend to stereotype them, meaning you know, if a Republican
thinks about a Democrat, or a Democrat thinks about a Republican,
(35:36):
they think about the most extreme version of that right.
And you can see this in the data when you
ask people, you know, what is the distribution of how
many of the opposite side would support, let's say, violence
if they don't get their way in an election, right, and
you get these very extreme distributions, where in reality there's
(35:56):
actually much more overlap. So yeah, first, you can inform yourself
about what other people actually believe.
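The perception gap Jonathan describes can be put in toy numbers. This is only an illustrative sketch; the function and the figures below are invented for the example, not drawn from any actual survey.

```python
def perception_gap(estimated_share: float, actual_share: float) -> float:
    """How much we overestimate the fraction of the other side
    that holds an extreme view (shares are fractions in 0..1)."""
    return estimated_share - actual_share

# Hypothetical numbers: people might guess 60% of the other party
# would condone election violence, while surveys put it far lower.
gap = perception_gap(estimated_share=0.60, actual_share=0.15)
assert gap > 0  # a positive gap means a stereotyped overestimate
```

Informing yourself about what the other side actually believes amounts to shrinking that gap toward zero.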
Speaker 4 (36:02):
Honest in its way.
Speaker 3 (36:04):
Second, there is a set of techniques for how to
have conversations with someone, not just, like, you know,
a polite conversation, but with someone who genuinely has different values than you.
Speaker 4 (36:19):
And it's got to start with curiosity. It's got to
start with listening.
Speaker 3 (36:23):
And I would suggest don't go into those conversations with
the goal of changing someone's mind.
Speaker 4 (36:27):
You wouldn't want them to do that with you.
Speaker 3 (36:29):
Go into those conversations with the goal of trying to
understand how they got to that place. Because when we're
talking about the American conflict, we're talking about disagreeing
with half of America. Well, half of America, you know,
isn't stupid. You know, they're not like fundamentally broken or
evil or something. They're going to be pretty average, just
(36:51):
like the other half of America. So how is it
that smart and kind people can end up believing something
completely different than you. That's the thing to get curious about.
And the third thing I would say is watch for
your own emotional reactions. Watch for where you know, you
get, you know, angry or, let's just say, uptight, or
(37:14):
you know, really have this feeling that you have to
defend something or protect something, because those reactions will lead
you to, first, what is it that you're scared of,
what are you worried about?
Speaker 4 (37:25):
Where are your fears and concerns?
Speaker 3 (37:27):
And second, if you can notice those and not be
overtaken by them, you can have much better relationships with
people who disagree with you. And actually, one of the
practices that we teach is speak them aloud, right, don't
project your anger onto the person across from you,
who you're probably stereotyping, who you know, never did you
(37:49):
personally any wrong. Just say, you know, when you say that,
I get very upset, I have a reaction. I feel anger
thinking about that, without directing it towards the other person.
We can't hide our emotions. If we're going to relate better,
they have to be on the table.
Speaker 1 (38:20):
If you were suddenly assigned to be the education czar
for the nation, how would you build junior high or
high school classes to teach kids about this, about how
to relate, about how to, let's say, steelman each other's
arguments so that they can understand them better.
Speaker 3 (38:38):
Yeah, so what we need to do is give young
people the ability and the experience of relating successfully across
value divides. And there's actually a bunch of organizations that
do this both at the K through twelve and the
university level. You know, these programs are some of them
(38:59):
are in-class, teacher-led, where they ask students,
you know, what is the thing you
feel very strongly about, and either try to find students
who have a different opinion and teach them to constructively
talk about that, or do things like there was a
recent New York Times piece about a writing teacher who says,
(39:19):
come up with a character who you think is a
terrible human being or something that no one should absolutely
ever say, and write a story where they say that
thing in context in a way which is sympathetic or
humanizing towards them. And I think this is the
fundamental skill to be able to see the humanity in people,
(39:40):
even in moments of profound disagreement. And I don't mean
to minimize the actual stakes of these types of conversations, right.
You know, if you are an immigrant who's talking to
someone who says, you know, we shouldn't allow any more
immigration to this country, that has a deep personal impact
on you because it may mean you never get to
(40:02):
see your family again.
Speaker 4 (40:03):
Right.
Speaker 3 (40:04):
So I'm not saying that, you know, we should all
talk until we get along. I'm saying we have to
see our political adversaries as human. And there are a
bunch of organizations who try to train people to do
this and also give people the experience of I had
a conversation with someone who disagrees with me and it went okay.
We are aversive to these conversations. They're hard, they're emotionally taxing,
(40:28):
there's no guarantee they're going to come out well. So
we have to give people the confidence to engage in
this way.
Speaker 1 (40:36):
So I'm so glad to hear you use the word humanizing,
because that's really the key from a neuroscience point of view.
The issue is that people in our out group, we
actually analyze them with our brains in a different way,
such that they are more like an object than a
fellow human. And that opens the door to
(40:56):
you know, genocide of various sorts, or, at a lower level,
you know, violence, or even just insulting, or whatever the
thing is. We just don't care about them in the
way that we care about the people that we consider
in our in group. And yet obviously we're all made
of the same biology. We all come about from our
genetics and our experience, and it feels like humanization is
(41:17):
really the key. So you were just telling me about
what could be done with high school kids. Tell me
what you and others are doing with adults in terms
of helping them bridge the gap.
Speaker 3 (41:28):
So there's a bunch of bridge building organizations. There's a
big one called Braver Angels. And what this is is
an organization which organizes they call them Red Blue Conversations.
There's a bunch of local chapters, plus they do it
online, so you can actually sign up and say, you know,
I'm in you know, this town in Indiana, and I
want to have a conversation with people who.
Speaker 4 (41:48):
Disagree with me.
Speaker 3 (41:50):
And they have a particular way that they mediate and
facilitate these conversations to make them productive. And so if
you want to have that encounter, you can have it.
They are a group that will set that up for
you and show you how to do it.
Speaker 1 (42:02):
The people who sign up for that, are
some of them just trying to win? Do they think, yes,
I want to meet people from the other side, so
I can convince them.
Speaker 3 (42:10):
Possibly, but the moderation format is designed to prevent that.
They don't let people bludgeon each other. And they
say right up front, the goal here is not to
change someone's mind. The goal is to increase understanding.
Speaker 1 (42:25):
Oh that's lovely, Okay, great, So you're about to tell
me about another organization.
Speaker 3 (42:28):
Yeah, so there's a bunch of organizations doing this kind
of bridging work. I'm also involved in an organization called
the Dignity Index, And what this is is they've created
a scale from contempt to dignity to rate political speech.
So when a politician talks about their opponent, are they saying,
you know, those people are evil, we have to destroy
(42:50):
them to save America. Or are they saying, I respect
what they believe. I believe something different. I think my
plan is better and if I win, we're going to
work together. Those are really different things. And the scale,
it's an eight-point scale, it goes from
literally calling for genocide to literally saying that you see
no difference between self and other. And so what
(43:11):
they're using this for is a couple different things. First
is they're putting together sort of scorecards for the upcoming election,
to rate different candidates, especially at the local level, on
whether they speak with contempt or dignity.
Speaker 4 (43:25):
And the second thing.
Speaker 3 (43:25):
They're doing is organizing student groups in universities to train
people to rate political speech on this scale, which is
less about producing the ratings and more about getting people
to think in this way and to notice when people
are engaging in let's say, constructive versus destructive disagreement.
Speaker 1 (43:47):
Excellent. Okay, so those are things on an individual level
that can be done on a societal level. Tell me
how you think about, for example, journalism and what might
be done there.
Speaker 4 (44:00):
Yeah.
Speaker 3 (44:00):
So, as I mentioned, I used to be a professional journalist.
I was an editor at the AP, I was an
investigative journalist at ProPublica. So I've seen the machine from
the inside. And what I will say about journalists is,
first of all, I have enormous respect for my colleagues
in journalism, and they are some of the most deeply
idealistic people that I know. Right, this isn't
(44:22):
a conspiracy. But most of them are pretty politically liberal,
and that's not a conspiracy either. That is because, especially
with the decline of local news, most of them work
for national outlets in big cities on.
Speaker 4 (44:37):
The East Coast.
Speaker 3 (44:39):
The social context in which they exist is pretty liberal,
and in particular, much farther to the left than the
median American.
Speaker 4 (44:47):
You know what that.
Speaker 3 (44:48):
Means is they will have less contact with and less
understanding of conservatives, which means that conservatives will not see
their views or identity reflected in the coverage of mainstream journalism.
Speaker 1 (45:04):
And this is because conservatives tend to live in more
rural areas.
Speaker 3 (45:08):
Yes, yeah, it's because demographically journalists are not like conservatives, right.
And it's not again, this isn't a conspiracy. This just
it's pure demographics.
Speaker 4 (45:19):
Right. They are educated, urban people.
Speaker 3 (45:24):
And you know, one of the strongest correlates of political
identity is population density.
Speaker 4 (45:29):
Right, It's that simple in many ways. That's another way
you can talk about the divides in this country is
between urban and rural.
Speaker 3 (45:36):
So anyway, given that that is the state of affairs,
what you get is there are only a few media outlets,
notably Fox, which speak in a language and a value
system which resonates with conservatives, and that leaves a sort
of vacuum where people who want that kind of coverage
(45:56):
have to end up going to let's say, less credible
or even fringe news sites. Right, so you start to
get to your Newsmax or your One America, and whatever
you can say about their politics from a pure sort
of journalism quality perspective, they're just not very good. And
so that's part of why we see, you know, higher
(46:19):
rates of misinformation and so forth on the political right
in the US, And my suggestion, which is somewhat controversial,
is that we need more conservative journalists. We need more
well trained people who understand the values of the people
who most journalists don't cover well.
Speaker 1 (46:41):
Okay, So, by the way, you just pointed out this
difference between left and right wing journalism because of the
distribution in the country between rural and urban. But generally,
one of the things I've found so important is this issue.
I know this is something you've looked at, for example,
conspiracy theories on the left and the right, and essentially
(47:02):
that they are equal. In other words, both sides, all
parties are just as subject to this sort of thinking.
And yet both parties accuse the other of this, just
in the way that both parties accuse the other of
doing book banning when both are guilty of this. Where
else do you see this sort of thing and looking
(47:22):
for other examples where the left and the right accuse
each other of things that they are equally guilty of.
Speaker 3 (47:28):
Yeah, so you're raising the issue of symmetry versus asymmetry
and conflict, and this is a big issue, right, So
you have broadly speaking, sort of two schools of thought
or ways that people talk about conflict. One is, you know,
their side is obviously worse, that they want to destroy democracy.
They're the oppressor, they're the abuser. And you have this
(47:51):
other way of talking about it, which is, look, we're
all human, we are all contributing to being locked in
this escalating conflict spiral. Nobody's immune from misperceptions or mistakes.
Both of these things can be true, right. There really
are cases where one side is doing heinous things and
(48:12):
the other side is not, or at least some sort
of difference between the two. And I think part of
thinking about constructive conflict is bringing in issues of justice,
so you know, sometimes it really is on one side
to change their behavior, and that's where we require accountability
in various forms. On the other hand, conflict is a case,
(48:36):
especially conflict escalation is a case where it really does
take two to tango. I tend to think about the
symmetries more than the asymmetries in the American conflict, largely
because everyone else is focusing on the asymmetries. And so
you mentioned conspiracy theories, so that's a great example. The
sort of media narrative generally is that there's much more
(48:59):
sort of conspiratorial thinking on the right in the US.
And if you make a huge list of conspiracy theories,
you know, everything from you know, secret Jewish cabal's controlling
the world to Holocaust denial to chemtrails, what you find
is that it's pretty bipartisan there. You know, about the
(49:20):
same number of conspiracy theories are more commonly believed.
Speaker 4 (49:24):
On the left as opposed to the right.
Speaker 3 (49:27):
However, if you look at misinformation consumption, you find that
it is definitely more of a right wing thing. And
I want to put a sort of big asterisk
here and say, well, you know, doesn't this depend on
who's defining misinformation? And what we find is that when
you ask bipartisan panels, so you get a bunch of
(49:49):
Democrats and Republicans together and you put a news article in
front of them, or a let's say, a purported news article,
and you say, you know, is this true or not?
You know, take your time, you can use any reference
materials you want. You know, let's, you know, look it
up online. You find that there is generally strong agreement
between bipartisan panels and professional fact checkers. This is the
(50:12):
level of evidence that I want to see to say
that there really is an asymmetry.
Speaker 4 (50:16):
And I do think it's true.
Speaker 3 (50:18):
There's just much more low quality information circulating in right
wing spaces.
Speaker 1 (50:24):
And this is because of the journalism issue that you
were mentioning.
Speaker 3 (50:28):
Yeah, I think there's a number of things going on
right One of them is there's sort of a news
void right. There just isn't a lot of right wing journalism.
So if people have a demand for that, then, you know,
that creates an incentive for people who don't really care
about the journalism to publish things that are going to
get attention because there isn't anything else in
(50:49):
that political space that is well done. So
I think that is real. But when
people say there's an asymmetry, right, it's
really those people who are the problem, I think we
should have a high bar for evidence. We should have
a high standard for saying, yes, this is real. It
isn't just a cudgel that you're using to try to
(51:10):
win the culture war, because the culture war is not winnable.
That's a fantasy. You can't exclude half of the population
from politics forevermore So, we have to find some other
way to approach each other.
Speaker 1 (51:24):
And so one of the things that you are really
concentrating on is social media as a leverage point. So
again we talked about individual ways to help with conflict resolution,
We've talked about societal ways, as we were just talking
about journalism, but as far as social media goes, I
know that the way you think about this is there's
this interplay between human psychology, which cares about threats and
(51:47):
recommender algorithms at the social media companies in terms
of what they're serving up to you, and then the
content producers, who are going to do the things that
get them the views. And so human psychology we probably
can't change too much, and the content producers were probably
not going to dissuade them from producing things that get views.
(52:08):
So really it's the recommender algorithms that are up for
grabs there. So we touched on this before, but let's
return to that. What do you see as the possibilities there?
Speaker 3 (52:19):
Yeah, so one of the reasons that I study
recommender algorithms, which by the way, isn't just social media.
It's you know, news recommenders, it's job recommendations, it's Amazon products,
it's Netflix, it's music, it's podcasts, it's everything. Right,
all of this stuff is picked for us by machines now,
and potentially all of it has political content. You may
(52:40):
not think that you know, who cares what Spotify's recommender
is doing. Well, you know this podcast is on Spotify, right,
so that that matters too. So broader than social media,
there's two reasons I think focusing on social media is interesting.
One is the direct effects and the other is the indirect
effects via incentives for producers. So, direct effects. So what
(53:02):
I would like to see is less use of engagement
signals in content ranking. So in other words, how much
somebody clicked on something, you know, how many seconds they
spent watching that TikTok video, et cetera, should have less
of an influence on whether it is shown to other people.
And to some extent, this change is already starting to happen.
(53:25):
So there are at least three platforms, of which Facebook
is the only one who's said this publicly, that basically
don't use resharing as a signal for civic and health content.
So maybe for entertainment, whatever catches your attention is fine,
But maybe for civic and health and politics and these
(53:45):
types of critical information sources we shouldn't use, you know,
whether it went viral as a signal for whether it's
any good, and so that is starting to happen.
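The change Jonathan describes, downweighting engagement signals like reshares for civic and health content, can be sketched in a few lines. This is a toy illustration only; the topic labels, signal names, and weights are invented for the example and are not any platform's actual ranking formula.

```python
from dataclasses import dataclass

@dataclass
class Post:
    topic: str                   # e.g. "entertainment", "civic", "health"
    predicted_clicks: float      # engagement signals, scaled 0..1
    predicted_reshares: float    # virality signal, scaled 0..1
    quality_survey_score: float  # non-engagement signal, e.g. "worth your time?" surveys

def rank_score(post: Post) -> float:
    """Toy ranking score: for civic and health content, give virality
    (reshares) zero weight and lean on a non-engagement quality signal."""
    if post.topic in ("civic", "health"):
        return 0.2 * post.predicted_clicks + 0.8 * post.quality_survey_score
    # For entertainment, whatever catches attention is fine as a signal.
    return (0.5 * post.predicted_clicks
            + 0.3 * post.predicted_reshares
            + 0.2 * post.quality_survey_score)

viral_rumor = Post("civic", predicted_clicks=0.9, predicted_reshares=0.95,
                   quality_survey_score=0.2)
solid_report = Post("civic", predicted_clicks=0.4, predicted_reshares=0.1,
                    quality_survey_score=0.9)
assert rank_score(solid_report) > rank_score(viral_rumor)
```

The point of the sketch is only the structural move: for critical topics, "did it go viral" drops out of the score entirely, so a well-rated but unviral report outranks a viral rumor.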
Speaker 4 (53:56):
I'd like to see more of that.
Speaker 3 (53:58):
I recently published a paper with a bunch of collaborators
where we cataloged all of the alternatives to using engagement
as a content ranking signal. It's called What We Know
About Using Non-Engagement Signals in Content Ranking, to just
try to get this knowledge out there and to socialize
it because a lot of this stuff is sort of
very diffuse across industry. People in industry know it but
(54:21):
can't talk about it because.
Speaker 4 (54:22):
It's all private.
Speaker 3 (54:23):
So what we did is we got together people from
eight platforms for an off the record discussion about what
can we say about how to do this better, and
then we reconstructed their conclusions from public sources scattered academic literature,
old company blog posts, but also many references from the
Facebook Files, which were the leaks that Frances Haugen brought
(54:45):
out in twenty twenty one. We sort of learned what
to look for in those files. So that's the first
thing I think social media can be better. We can
build it not to optimize for outrage, And in fact
the frontier is something called bridging based ranking, and the
idea there is you find content that both sides agree
(55:06):
is good. So, you know, think about this, do you
want the inflammatory news article that appeals to Democrats? Do
you want the inflammatory news article that appeals to Republicans?
Or do you want the article that everybody reads and says, yeah,
that's kind of good. Now, maybe you know, psychologically you're
much more likely to click on the inflammatory headline, but
that doesn't mean our better selves actually want that. And
(55:29):
so I'm involved with a bunch of experiments trying to
find this bridging content and promote it. So that's the
sort of direct changes, right, There's a bunch of algorithmic
changes that you know, I and many of my colleagues
are exploring and trying.
Speaker 4 (55:45):
To advocate for.
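The bridging-based ranking idea described above, promoting content that both sides agree is good, can be sketched with a simple agreement score. This is a hedged toy version, not the actual algorithm used in any experiment: real bridging systems (and the ratings they use) are far more sophisticated, and the numbers below are invented.

```python
def bridging_score(ratings_by_group: dict[str, list[float]]) -> float:
    """Toy bridging score: an item ranks highly only if every group
    rates it well, so take the minimum of the per-group average
    ratings (each rating scaled 0..1)."""
    group_means = [sum(r) / len(r) for r in ratings_by_group.values()]
    return min(group_means)

# Hypothetical ratings from two partisan groups of readers.
inflammatory = {"left": [0.9, 0.8], "right": [0.1, 0.2]}  # thrills one side only
broadly_good = {"left": [0.7, 0.6], "right": [0.6, 0.7]}  # both say "kind of good"
assert bridging_score(broadly_good) > bridging_score(inflammatory)
```

Taking the minimum across groups is one simple way to encode the idea that an article everybody reads and says "yeah, that's kind of good" should beat an article that is inflammatory for either side.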
Speaker 3 (55:46):
But then one of the really interesting things about doing
this in the algorithm space is that, precisely because these
algorithms decide what everybody sees, changing them can change the
incentive for producers. So if inflammatory material is downranked
and less popular, that means it's less profitable to produce,
(56:07):
and therefore that changes the kind of content that journalists, politicians,
you know, think tanks, et cetera, find successful, find reaches
an audience. And so this second order or indirect effect
is very interesting because it says that, you know, maybe
if you can get ten platforms to change their algorithms
(56:28):
to use bridging based ranking, that could have an effect
on a much broader media ecosystem.
Speaker 4 (56:34):
So it's a leverage point.
Speaker 1 (56:36):
Yes, how do you convince the social media platforms to change?
Speaker 3 (56:40):
Well, I think there are sort of three cases here,
and this is why we're testing these algorithms. So
I am running something called the pro Social Ranking Challenge.
It is an open competition for better social media algorithms
where teams from around the world are competing, first of
all for a cash prize, but mostly we're going to
(57:02):
take the winning algorithms, as judged by a panel of scientists,
and we're going to test them on Facebook, Twitter, and
Reddit using a custom browser extension. And so we're actually
going to look to see if it changes polarization, wellbeing,
and other types of attitudes and outcomes, including engagement. Crucially,
(57:27):
we are testing whether it changes both short term and
long term use of these products.
Speaker 4 (57:32):
And so from that.
Speaker 3 (57:33):
We will learn which of three worlds we live in.
If the universe is kind, we will discover that producing
a better product that reduces polarization also increases long term retention.
So you make a higher quality product, people stay on
the platform, maybe not in the short term, but certainly
in the long term. And then you can do well
by doing good right, And then it's just sort of
(57:55):
getting the word out. Or we could live in a
world where, you know, it has neutral or maybe
slightly negative effects to use algorithms that reduce polarization. And
it's not unheard of for platforms to make, you know,
slightly revenue reducing changes to their algorithms in the interest
(58:15):
of public good. You know, I collect examples of this.
So then it's an advocacy campaign, right, this is the
right thing, you should do it. There's lots of groups
that exist to put pressure on companies to do the
right things. Or perhaps in some sense, the worst outcome
is we discover that making algorithms which produce better conflict
outcomes tends to reduce usage of the product in a
(58:39):
way that is meaningful from a business perspective. And then
what we have is a collective action problem. You have
a first-mover disadvantage, in that whoever changes their
algorithm to be better first loses money relative to everyone else.
And then you have to look at regulation because that
can level the playing field, in very much the same
(59:00):
way that environmental regulation prevents a race to the bottom.
You know, nobody wants to be the first to use
a more expensive process that results in less pollution, because
then they would lose their market share. But if everybody
has to do it, then it's okay. So those are
the three outcomes, right, Either it's just a matter of
spreading the good word, or it's advocacy, or it's regulation,
(59:20):
depending on where the science takes us.
Speaker 1 (59:22):
What else, in conclusion, would you like to say?
Speaker 3 (59:26):
If we care about relating to each other better, and
I don't just mean like kumbay oh why can't we
all get along? But actually having a politics that functions better,
where we get to fight what we believe in without
dehumanizing the other side, without misperceiving what they're actually about,
without things turning ugly and violent. That's something that we
(59:49):
can do. There's many many ways to have better political conflict,
but it's going to take a fundamentally different attitude. As
I said before, the culture war is not winnable.
There is no world in which you get to exclude
your political opponents from politics indefinitely. That's what a democracy is, right.
(01:00:11):
We accept someone winning an election because the next election
they might lose. So there are ways to get involved,
there are things you can do.
Speaker 4 (01:00:20):
The situation is not hopeless.
Speaker 1 (01:00:27):
That was Jonathan Stray at Berkeley. So let's take the
work that Jonathan and others are doing to address our
illusions that people who disagree with us are misinformed trolls.
It's a very useful exercise to figure out how we
can look at somebody who disagrees with us as not
being cold and incompetent, but possibly someone who is kind
(01:00:50):
and generous and has a different opinion. This starts with
intellectual humility, understanding that we don't know it all. And
I don't mean that from a philosophical point of view,
but from a neuroscience point of view. Because of brain plasticity,
we each form an internal model of the world based
on our very thin trajectory of space and time, and
(01:01:14):
we form our political opinions based on just the little
bit that we're exposed to. We shape our ideas from
the social networks we happen to be embedded in, and
we're not consciously aware that we're doing this. So at
the heart of all of this is a need for
intellectual humility, and we're going to need a lot of
(01:01:34):
this to address the kind of polarization, the kind of
fear and loathing that we're seeing across the globe. Meaningful
dialogues are a great start, but also there's a need
for scaling. How do we build this into our educational systems,
How do we build this into the fundamental algorithms underlying
(01:01:54):
our social media. How do we build what some scholars
like Heidi and Guy Burgess call massively parallel peacebuilding? There's
still a lot of work to be done on this front,
especially as we're moving from communicating via soapbox speeches and
hand delivered pamphlets to instant communication that allows you to
(01:02:16):
deliver your speech or your pamphlet to everyone's mobile rectangle
around the planet. So I suggest that an important angle
on all this is to understand the neuroscience at the
base of everything, why we think the way we do,
why we behave the way we behave, and then work
to build our societal structures like our media, our dialogue,
(01:02:38):
our education. With that in mind, we can no longer
make the romantic assumption that we each are just objective
holders of truth and that ours is this single logical
argument or position that should convince everyone. And this is
because our internal models of the world give us different
(01:02:58):
biases towards different groups. We have different sensitivities towards issues,
we have different levels of knowledge, we have different emotional
affiliations to things happening in the world. So the first
step to better conflict is to have a more realistic
understanding that we are each living on our own planet
mentally and emotionally, and the important goal is to bridge planets,
(01:03:23):
to set up some signaling across the vast reaches of
space between us. It's easy to say that the people
who disagree with you are ugly trolls, but as we discussed,
it can make slightly more sense to ask how someone
who is smart and kind can end up believing something
different than you do. This is not a plea to
(01:03:46):
agree with the other side, but to better understand their
motivation and their philosophy, and fundamentally, to better understand our
shared biology and therefore our shared humanity. Go to eagleman
dot com slash podcast for more information and to find
further reading. Send me an email at podcasts at eagleman
(01:04:10):
dot com with any questions or discussions, and check out
and subscribe to Inner Cosmos on YouTube for videos of
each episode and to leave comments. Until next time, I'm
David Eagleman, and this is Inner Cosmos.