Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
Welcome to Stuff to Blow your Mind from how Stuff
Works dot com. Hey, welcome to Stuff to Blow
your Mind. My name is Robert Lamb and I'm Joe McCormick.
And today we're going to do an episode following up
on a panel Robert saw this year at the World
Science Festival in New York. So we're gonna talk about
(00:27):
topics having to do with bias, belief, public opinion, and
science communication. And I think we should start in some
territory that's sure to annoy at least a few listeners
right from the get go. So Robert, I got a
pop quiz for you. Don't look at the numbers. If
you had to guess, how many Americans do you think
accept the scientific consensus on global warming? Oh, well, of
(00:50):
course that's a big one, because when you go to answer that question, you think about your immediate, you know, sphere of influence and the people you know, or perhaps you think of media representations on the question. So it depends on the reporting, on the panels of experts you're presented with. Just off
(01:11):
the top of my head, I'd want to say a majority of people accept the scientific consensus. You're not too far off, but that's optimistic. So, Gallup has been
tracking Americans' beliefs about global warming for over a decade now, and some of the questions they tend to ask people are more subjective, things like: do you worry a
(01:34):
great deal about global warming? In the latest poll, forty-five percent of people said yes. That's up from thirty-seven percent in twenty sixteen and thirty-two percent the year before. But technically, I mean, it's worth pointing out that there's no objective fact about whether you should be worried or not. Maybe you don't care.
(01:55):
Oh yeah, it comes down to worry, and to what extent are you worrying a great deal about something? Right? Like, you can realize something is a vital threat to the human race and just not care anyway. Right, you can say, well, if it happens, it happens, or, well, that's a problem for the next generation to figure out. There are various ways of calculating that question in your head. Right. It also
(02:16):
hinges on the word worry, like maybe you do care
about fighting climate change, but you wouldn't characterize your feeling
as worried. You're invigorated by the idea of trying to
do something about it. Right. Are you worried about it, versus do you think this is a problem that governments should work together to address? Right. Now, some of the other questions have straightforwardly right or wrong answers.
(02:39):
For example, in twenty seventeen, seventy-one percent of Americans said they agree that most scientists believe global warming is occurring. Right. So the question is, do you think most scientists believe global warming is occurring? Seventy-one percent said yes. That's up from sixty-five percent in twenty sixteen and lower figures in the years before, so there's a climb
(03:03):
in that number. More people are saying yes, I think
most scientists believe that the Earth is warming. There is just an objective fact of the matter about whether most scientists, or most climate scientists, believe the planet is warming.
They do. There's no debate about that. Now, what has
been reasonably debated is the exact figure of the agreement,
(03:25):
because it's not necessarily easy to calculate exactly what numbers
of scientists agree with a certain proposition. Right. Yeah, I mean, if you just give yourself this assignment and start hitting Wikipedia, you're gonna find lists of scientists who are either opponents or proponents. But then when you start trying to peel back and figure out who these
(03:45):
people are and what their fields of expertise are, it
just gets increasingly complicated. Right, But there are studies that
look into this. They try to impose a methodology and say, Okay,
what do scientists think, or what has the published literature said? Now,
one study like this was published in two thousand nine
in EOS, Transactions of the American Geophysical Union, a good
(04:07):
professional publication title there, and it's called Examining the
Scientific Consensus on Climate Change. And so what they did
is they sent invitations out to more than ten thousand
Earth scientists, basically all of the geoscientists they could find
at universities and public research institutions, with two survey questions.
And these were the two questions. First question, when compared
(04:30):
with pre eighteen hundreds levels, do you think that mean
global temperatures have generally risen, fallen, or remained relatively constant?
And then the second question, do you think human activity
is a significant contributing factor in changing mean global temperatures?
So of the people they pinged with this survey, three thousand,
(04:52):
one hundred forty-six geoscientists responded. They said that's about a standard survey response rate over all of the earth sciences. Ninety percent of participants answered risen to the first question. Now, that's geoscientists. That's people who study the Earth in any way, so geologists, oceanographers, hydrologists, meteorologists, economic geologists. What about
(05:14):
people who study the climate specifically? Well, of the subset of respondents who were experts in climate science and had published more than half of their recent papers weighing in on the subject of climate change, ninety-six point two percent, or seventy-six of seventy-nine, answered risen to the first question. That's a high percentage
(05:35):
right there, right? Yeah.
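[Transcript note: a minimal back-of-the-envelope sketch in Python to recompute those figures. The 10,257 invitation count is taken from the Doran and Zimmerman paper itself; the episode only says "more than ten thousand," so treat that constant as sourced from the study rather than from the audio.]

# Response rate and the climate-specialist percentage quoted above.
invited = 10257        # invitations sent, per the 2009 EOS paper
responded = 3146       # geoscientists who answered
print(f"response rate: {100 * responded / invited:.1f}%")  # ~30.7%

# Climate specialists answering "risen" to question one.
risen, specialists = 76, 79
print(f"specialists answering 'risen': {100 * risen / specialists:.1f}%")  # 96.2%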
And there have been multiple other studies that use different methodologies to ask slightly different questions, but all of them have found overwhelming agreement among scientists in general, and especially among climate scientists in particular, that the planet is rapidly warming. So those seventy-one percent
(05:57):
of Americans who say that most scientists believe global warming is happening are factually correct. Those who disagreed are incorrect.
Though it is worth saying that a large share of
the people who didn't agree with that said they were unsure.
So if you're unsure, you're unsure. But if you didn't agree with that, you are incorrect. But then, of course, you get into questions in the global warming debate that are not quite as cut and dried
(06:18):
as whether the majority of scientists agree that the Earth
is warming. For example, what's causing the warming? Is global
warming caused by human activity primarily greenhouse gas emissions, or
by natural causes? Well, it shouldn't be surprising that there
have been plenty of attempts to study the opinion of
scientists on this question as well. So, for example, in
(06:39):
that same survey from two thousand nine we just mentioned, eighty-two percent of all Earth scientists said yes, that humans are a major contributing factor, and ninety-seven point four percent of active climate researchers said yes. Again,
these are high percentages. If these were the experts telling
me that I should cut something out of my diet
(07:02):
or maybe, you know, make some other sort of major change in my life, I would be seriously inclined to listen to them. Okay, sure, but maybe you say, well,
that's just one study, but has anybody else studied this?
Actually yes. So a commonly cited figure is from a
study in the journal Environmental Research Letters by John Cook et al. What they did is looked at abstracts
(07:23):
of published papers on the subject. They looked at about twelve thousand research papers published over the previous two decades, and they found, quote, sixty-six point four percent of abstracts expressed no position on anthropogenic global warming, meaning human-caused global warming; thirty-two point six percent endorsed it,
and zero point seven percent rejected it, and zero point
(07:46):
three percent were uncertain about the cause of global warming.
So this means that among papers that expressed a view
on the cause of global warming, ninety seven point one
percent endorsed the consensus. Now, I want to just cut in real quick and say: if it sounds like we're just hitting you over the head with a bunch of figures and numbers to drive home the fact that
(08:06):
climate change is caused by human activity, the vast majority of this episode is dealing not with those facts,
but with how we process those facts. Right. We just want to establish clearly what the scientific consensus is, beyond any reasonable doubt. We will get to science communication and public consumption of the information in a bit. Right. So you've got this Cook paper, and obviously there are a lot
(08:28):
of people in the general public who don't agree with climate change.
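[Transcript note: the 97.1 percent figure falls straight out of the abstract breakdown quoted a moment ago. A minimal Python sketch using the rounded percentages from the episode; the paper itself works from raw abstract counts, which is why it reports 97.1 rather than the 97.0 you get from rounded inputs.]

# Share of position-taking abstracts that endorsed human-caused warming.
no_position, endorsed, rejected, uncertain = 66.4, 32.6, 0.7, 0.3

took_position = endorsed + rejected + uncertain   # 33.6% of all abstracts
consensus = 100 * endorsed / took_position        # ~97.0%
print(f"consensus among position-taking abstracts: {consensus:.1f}%")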
So there have been plenty of criticisms of the Cook paper's methodology. For example, there was a Dutch-born economist named Richard Tol who criticized the Cook study and tried to revise the estimates down. Tol is often cited as a critic of the consensus on global warming. But even when he revised the numbers down, he recrunched them and said, no,
(08:52):
actually it's not as high as they said. He found ninety-one percent agreement instead of ninety-seven percent. And I should add that Cook also defended the original figure in a response article that attributed Tol's lower figure to a math error.
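[Transcript note: a toy sensitivity check on that dispute, sketched in Python. The raw abstract counts (3,896 endorsing, 78 rejecting, 40 uncertain) come from the Cook et al. paper; the flip-counting exercise is a back-of-the-envelope illustration for this transcript, not Tol's actual recalculation, which worked from the study's rating data.]

# How many endorsing abstracts would have to flip to "reject" before the
# consensus share dropped from ~97 percent to Tol's 91 percent?
endorse, reject, uncertain = 3896, 78, 40   # counts from Cook et al.
total = endorse + reject + uncertain        # 4,014 position-taking abstracts
print(f"as published: {100 * endorse / total:.1f}%")   # ~97.1%

flips = 0
while 100 * endorse / total > 91.0:
    endorse -= 1
    reject += 1
    flips += 1
print(f"flips needed to reach 91%: {flips}")  # 244 with these counts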
One last study before we move on:
Anderegg et al., Expert Credibility in Climate Change, Proceedings of
(09:15):
the National Academy of Sciences, two thousand ten. They used a data set of one thousand three hundred seventy-two climate researchers to determine that the large majority of climate scientists in general were convinced that human activity was the main cause of global warming, and that of the climate scientists who were actively publishing in the field, between ninety-seven and ninety-eight percent of them agreed with
(09:38):
the findings of the Intergovernmental Panel on Climate Change, the IPCC, which concluded that the main reason the climate is changing is human activity, primarily greenhouse gas emissions.
So this is pretty unambiguous any way you cut it. The large majority of scientists, especially those that study the climate directly and publish in the field, agree that the Earth
(10:00):
is warming and that human activity is the main cause. But let's go back to those Gallup results. Only seventy-one percent of Americans agree with the objective fact that the majority of
scientists believe the Earth is warming, and only sixty eight
percent of Americans agree with the clear consensus that human
activity is a significant cause of this warming. And this
(10:22):
gap between the expert consensus and public opinion is sometimes maddening to scientists and science communicators. Like, if you are
a dissenting climate researcher and you've got some reason of
your own to disagree, sure, but why would non experts
like you and me disagree with the overwhelming majority of
(10:42):
people who know what they're talking about. And the ungenerous
question that comes out of frustration with this situation is
why don't they believe in science? And is that really
what's going on? So that's the question we wanted to ask today: Is the situation really that they don't believe in science, or is it that there's something
(11:03):
particular to this issue where their ability to judge science has been corrupted? Now, obviously you're listening to a science podcast,
so we're not gonna try and belabor the point here too much. But of course, science is the true path: a systematic exploration of the universe and
(11:24):
the properties that govern it. I mean, this has allowed humanity a path out of darkness, ignorance, and disease. It underlies all the marvels of modern technology and provides us with a chance for humanity's long-term survival on and potentially beyond Earth. I mean, if you listen to the podcast, you know I'm not saying that science will answer all of life's questions, especially teleological questions
(11:47):
like why do I exist? What's my
purpose in life? Right? But if you're trying to answer questions about the material universe with reliable accuracy? I mean, we're science advocates around here, right. You have the science, you have the scientists, and here's the thing: For the most part, we trust it, or at the very least we claim to trust it. That's right. As pointed
(12:10):
out in an often-cited Pew Research study from two thousand sixteen, Americans in particular trust only the military over scientists. So there's a whole ranking. Yeah. So they
repeatedly do this study where they'll put up public institutions
and they'll ask you how much trust you have in them,
like a great deal of trust, a fair amount of trust,
(12:31):
not too much, and so on. And people will rank these different institutions, including things like the media. You can guess where that goes. But how does the ranking work out? Well, based on this two thousand sixteen ranking, it goes as follows: the military at the top, then medical scientists, scientists, K-through-twelve principals, religious leaders, the news media, business leaders, and
(12:51):
finally elected officials. Yeah. Now, one of the main people
we're gonna be talking about from this twenty seventeen World Science Festival panel, Dan Kahan, made this point in the panel that I thought was great. He said even religious people say on this survey that they generally trust science more than they trust religious leaders. And
(13:14):
because, like you said, there are different levels in the questions: the percentages for trusting these groups a great deal range from thirty-three percent for the military, and twenty-four and twenty-one percent for medical scientists and scientists, respectively, down to a mere three percent for elected officials. Just
(13:34):
to let you know where politicians stand. Yeah, I don't know why we don't trust them more. But particularly noteworthy here is the fact that, of the Americans polled, the participants expressed a fair amount of confidence in medical scientists and scientists. So that alone, I think, is telling about our overall cultural trust in scientific expertise.
(13:59):
And of course, that's not even getting into the fact that if you're talking about the military, aren't you also talking about military scientists? But that's kind of a separate question. Yeah, and so that's interesting. Generally, people say they put at least a fair amount of trust in scientists. A lot of people put a great deal of trust in scientists. People claim to believe in science, people claim to say that scientists are working with the
(14:20):
public's best interests at heart. But there are particular issues
where for some reason, the public's understanding of the scientific
consensus gets very out of whack. Yeah. We have this polarizing effect, or we seem polarized based on what we see and what we hear in the media. How
(14:40):
can we hold such high opinions of scientists and science in general and disagree on the clear scientific consensus regarding, for instance, human-driven climate change? Why are so many of us scared away by the prospect of Frankenstein food? And then other topics, of course, are vaccine safety, evolution, and whether your kids should watch
(15:02):
Dinosaur Train. What is Dinosaur Train? Dinosaur Train is a lovely show by the Jim Henson Company where dinosaurs travel in time on a train through a wormhole and your child memorizes all of these complex dinosaur names. That's wonderful. Yeah.
And then of course another big one. This is one I hadn't really thought about much, but I understand you've covered it in the past for Forward Thinking. Yeah. It's
(15:24):
the deep geological isolation of nuclear waste. Yeah, so, high-level radioactive waste. I mean, there's pretty clear scientific consensus that the best thing we can do with it is bury it deep underground. But when you poll the general public, even the educated public, people who say they're into science, you do not get the high levels of agreement with the scientific consensus on this. And what
(15:47):
some of these things should indicate is, I mean, it's no secret that the consensus on climate change has been particularly identified with one half of the political spectrum, at least in the United States. But this is not to criticize conservatives, because there are plenty of these issues where, apparently, if you poll people, liberals are more
(16:09):
out of tune with the scientific consensus than conservatives are. Like, last time I saw, more liberals were out of tune with the scientific consensus on vaccine safety, namely that vaccines are safe, than conservatives. Now, we've already mentioned
the World Science Festival panel that I attended and then you watched online, and then you too, listener,
(16:29):
can watch online. I'll include either a link or embedded video of it on the landing page for this episode at Stuff to Blow Your Mind dot com. The title of the discussion was Science in a Polarized World. It was moderated by author and journalist John Donvan, and the panelists included astrophysicist France Cordova, physicist and World Science
(16:50):
Festival co-founder Brian Greene, geneticist Sir Paul Nurse, and, most notably, the individual we are probably gonna spend the most time with here today, Yale law professor and science communication expert Dan Kahan. He's the Elizabeth K.
Dollard Professor of Law and Professor of Psychology at Yale
Law School, and his primary research interests are risk perception,
(17:14):
science communication, and the application of decision science to law
and policy making. Yeah, so I was looking at some
of his research in preparing for this episode, and it's
an interesting thing he's doing. Obviously he's not the only
person doing it, but he's trying to apply, for example, psychological science to law. So he's published things in the Harvard Law Review that are saying, hey, you know, judges
(17:37):
should be aware that human brains tend to work like X, Y, and Z. For example, one thing that judges might really benefit from being aware of is that research shows that when you tell people to be rational and objective, they do not become more rational and objective; they get more entrenched. All right. So in this discussion panel, everybody
(18:00):
had some great commentary on polarization regarding scientific consensus. Sir Paul Nurse was wonderful. Brian Greene,
the physicist, was really fired up for this one, and it was just, you know, a joy to watch and listen to. But one of the things that Brian
(18:21):
Greene was saying in the panel was, you know, we've got this problem of polarization of public opinion on scientific issues. There are some issues where the public, as we talked about at the beginning of the episode, just doesn't line up with what scientists are saying. Why is that? And I think Brian Greene was coming at it from the point of view that if we could just make them understand science better, if we
(18:45):
could educate people better in the scientific process and what
science is about, then they would agree with the scientific consensus.
And Kahan had a really interesting response to this. I
think his answer was: don't get ahead of yourself. That's not necessarily the case. Right, because Brian Greene's question makes sense, right? Because you think, well,
(19:08):
you want to say: don't you realize that science is, to quote Sir Paul Nurse, tentative knowledge? It's not a complete, prepackaged understanding of the universe. It's a continued exploration. You want to say: don't you understand the mistakes are part of it? This was definitely Brian Greene's argument, that if you want to talk about people who are skeptical of climate change, talk to the
(19:29):
climate scientists who have studied it, because that's what science is about. If you have a hypothesis and you're studying it, you are skeptical about it every step of the way. Yeah, if you're not being skeptical about the
theory you advocate, then you're not doing science right, right,
You're not being very good at your job. So that entire argument makes sense, and it does lead one to think, all right, it's just a lack of
(19:50):
scientific literacy or reasoning that's being employed here. But Kahan is saying that you'll find plenty of people resisting scientific consensus who are highly literate in science and highly logical, and they just wind up applying their cognitive resources to fit their beliefs and worldview. In fact,
it gets crazier than this, because it's not just that
(20:13):
people who are highly educated in science, who apparently, if
you test them, they understand how science works. It's not
just that they can disagree with the scientific consensus, but
that people who have more rational capacity and who have
greater cognitive resources, in fact, tend to apply those more strongly in entrenching themselves against the scientific consensus when they
(20:37):
disagree with it. It's like people who are better educated
in science are better at coming up with reasons to
explain why they don't agree with scientists or why they
don't agree. Not just don't agree with scientists, but don't agree about even objective facts, like the fact that the majority of scientists do endorse the finding that the Earth
(20:57):
is warming. Yeah, and it is worth noting here that when we're talking about these dissenting individuals, generally it's not across the board. They're not dissenting on all scientific consensus. It's about a particular topic, be that topic climate change, or be that topic, you know, vaccines or genetically modified organisms. Yeah, and
(21:20):
it's also important, Kahan points out, that you also see this sort of thing outside of science. You see the same process involved with, say, abortion or military recruitment, any issue in which protest becomes a badge of identity. And this plays specifically into one paper that Kahan refers to, which is this paper, They Saw a Protest.
(21:41):
So this comes up in the conversation: you can show people video of a protest taking place, right, and you ask them just objective facts about what they see in the video. Did you see the protesters screaming in someone's face? Did you see the protesters blocking someone's path? Did you see the protesters doing this and that?
(22:03):
All these sorts of negative behaviors that would render the
protest in a bad light. And it turns out people
claim to see different things in the video depending on
whether they think the politics of the protesters line up
with their own. So if you show a person with
certain politics a video and you say that it's people
(22:24):
protesting outside an abortion clinic, they'll have a very different
report of what they see in the video than if
you tell them that it's protesters outside a military recruiting
center protesting Don't Ask, Don't Tell. And this is a
specific example of motivated reasoning. Maybe we can talk about
motivated reasoning more later on, but it's the fact that we just do not process the facts of reality
(22:47):
and the evidence of our senses with perfect objectivity. We in fact process them in a highly goal-oriented way,
and a lot of times that goal is I don't
want people like me or the people in my group
to look bad. All right, we're gonna take a quick break, and when we come back, we're gonna jump right back into Kahan's research. All right, we're back. So
(23:17):
one of the things that people obviously do when they
are motivated to arrive at a certain conclusion is that
they cherry-pick facts. Right, you can always find stuff that makes your worldview look better or stuff that makes the other person's worldview look worse. And
it is in fact, incredibly easy and causes almost no
(23:38):
cognitive dissonance whatsoever for people to just say: okay, this fact that supports what I already believe, that's a good fact, that's legit and real and should be included. And a fact I encounter that doesn't support my point of view? Well, that's a bunch of bunk. You know, why would anybody believe that? And it extends to experts as well,
(23:59):
the same way. It's like: here's this expert, and I'm using expert in quotation marks because to what degree they're an expert also depends on your cherry-picking. Let's say this individual with
some sort of scientific background. Uh, they're making a statement.
I will consider them more of an expert based on
how their opinion matches up with my preconceived beliefs and worldview. Yeah,
(24:22):
and this is another point Kahan makes; it comes straight out of that. So he says it's not that people who don't, for example, accept the consensus on climate change or on vaccine safety don't believe in scientific expertise. They do. They do believe in scientific expertise generally; statistically,
(24:43):
they do. But they don't think that people who disagree with them are legitimate experts. Now, Kahan wrote about this in a very recent paper, like this month: Misconceptions, Misinformation, and the Logic of Identity-Protective Cognition. When we're talking about your views as a badge of identity, that's what
(25:04):
we're getting to here. This came out in June for
the Cultural Cognition Project. So in this paper, Kahan tackles what he refers to as the public irrationality thesis, or PIT. So this is something he's setting up to be in opposition to. Right. This is the idea that we touched on earlier, the idea that the general public largely
(25:25):
quote display only modest familiarity with fundamental scientific findings and
lack proficiency in the forms of critical reasoning essential to
science comprehension, unquote, and they're therefore easily swayed
by special interest groups who muddy the waters with non
scientific information. Yeah, I think this is a common thesis among people on both sides of a contentious issue
(25:49):
of fact in the public debate sphere. They just tend to think that, well, people on the other side are just ignorant, they just don't understand, and they're just being swayed by propaganda. Yeah, you listen to the wrong news channel, you listen to the wrong radio station, and now you have a faulty understanding of the facts. So Kahan argues that PIT reflects a misconception
(26:13):
of science communication, like a basic misconception. Controversy over so-called decision-relevant science is increasingly tied to identity-protective cognition.
This is the quote tendency to selectively credit and discredit
evidence in patterns that reflect people's commitments to competing cultural groups.
And that's a concept that, he says, is rooted in the
(26:35):
two thousand to two thousand sixteen work of D. K.
Sherman and G. L. Cohen. Right, so maybe we should
try to go a little bit deeper into where this
idea of identity-protective cognition comes from. So obviously there are a lot of ways to be wrong. Right. You can be mistaken due to pure error, right. But,
(26:55):
as we've already shown, you can also be mistaken for a reason. Our brains are not so made as to perceive and judge the world objectively; your reasoning and perceptions are skewed by a desire, conscious or unconscious,
to reach particular conclusions. This is what we call motivated reasoning.
And Kahan, in an article he did for the Harvard
(27:17):
Law Review, an excerpt of which he reproduced on his blog, said, quote, motivated reasoning refers to the unconscious tendency of individuals to process information in a manner that suits some end or goal extrinsic to the formation of accurate beliefs. That unconscious part is very
(27:38):
critical, because the argument here is not that someone is saying, well, I don't like this climate change, so this is the expert for me. Or, vice versa, someone saying, oh, I don't really like the idea of these GMO foods, so I'm gonna listen to this expert right here. This is taking place in the unconscious. Yeah.
(27:59):
You don't even realize when it's going on.
And so there is a classic, highly cited paper in the history of psychology that he goes back to to talk about early examples of motivated reasoning, and this is a precedent, I guess, for his They Saw a Protest paper. The original one was this paper called They Saw a Game: A Case Study, and it goes to stuff that has nothing to do with politics, absolutely nothing.
(28:22):
You can take the politics and you can take the science completely out and you still get the exact same effects. And what this is: there was a football game between Dartmouth and Princeton in nineteen fifty-one that had some highly controversial behavior, leading to injuries for a few players. Players were hurting each other in the game. Researchers in this study recruited
(28:45):
Dartmouth and Princeton students to review footage of what happened
and answer questions about what they saw, and it turns
out what they saw depended on their school allegiance. Dartmouth
students claimed to see things favorable to Dartmouth's reputation. Princeton
students claimed to see things favorable to Princeton's reputation. They
(29:06):
didn't just have different opinions about the game, they apparently
perceived a different reality based on institutional allegiance. They were
not reasoning impartially, but in a motivated way. And lots of studies over the years have reflected other versions of these findings. It's totally clear: when people have a goal, when they consciously or unconsciously want things to be a
(29:27):
certain way, they're usually not capable of reasoning and perceiving reality impartially.
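[Transcript note: a toy illustration of identity-protective evidence weighting, sketched in Python. This is not Kahan's actual model; every constant here is an assumption chosen just to make the mechanism visible. Two groups see the same stream of pro-proposition evidence, but each agent discounts messengers from the out-group, so the groups drift apart on identical information.]

import random
random.seed(42)

IN_GROUP_TRUST = 0.10    # assumed weight given to in-group messengers
OUT_GROUP_TRUST = 0.01   # assumed weight given to out-group messengers

beliefs = {"A": 0.5, "B": 0.5}   # both groups start undecided

for _ in range(200):
    # Suppose most messengers carrying this evidence read, culturally,
    # as group "A" (say, scientists coded as one side's experts).
    messenger = "A" if random.random() < 0.9 else "B"
    for group in beliefs:
        trust = IN_GROUP_TRUST if messenger == group else OUT_GROUP_TRUST
        # Nudge belief toward the evidence, scaled by trust in the source.
        beliefs[group] += trust * 0.05 * (1.0 - beliefs[group])

print(beliefs)   # group A ends far more persuaded (~0.8) than group B (~0.6)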
And to get back to the main example we came in with, the idea of identity-protective cognition: we want to affirm our membership in reference groups because we're social creatures, right. I mean,
(29:48):
one of the main things that's been hypothesized that our brains evolved to do is to manage social relationships. We were just talking about the social brain hypothesis in another episode. Yeah. One of the main things we appear
to be optimized for is for group membership and group
solidarity and understanding group dynamics. Yeah, I mean, survival
(30:10):
almost has a different definition when you're talking about an individual versus a larger, especially a global, culture. We didn't evolve to save the planet from human-caused climate change or meteorites. We evolved to survive social dynamics, to adapt our thinking to
(30:31):
fit in with the group that has access to the fire, that has access to the food and the shelter that is necessary for survival. Yeah. And
so we deeply, deeply want, we're highly motivated, to affirm our membership in reference groups and the character and the reputation of those groups. When those things are at stake,
(30:52):
we are highly motivated to defend them. So the idea here is: culture comes before fact. Perceptions of what the facts even are are shaped by values. So many of these
individual members of the public simply have a quote bigger
personal stake in fitting in with important affinity groups than
in forming correct perceptions of scientific evidence. Yet again, this is
(31:15):
not necessarily done consciously. In fact, it's almost never done consciously.
You don't think I'm sacrificing knowing the truth for fitting
in with my group. That's just what your brain does
and doesn't really let you in on the fact that
that's what it's doing. Yeah, and the members of the public that are most polarized over a topic are the ones that have the highest degree of scientific comprehension, which
(31:36):
is the nature of the dissenting expert we discussed. The problem, then, Kahan points out, is not a gullible public, not this PIT scenario, but, quote, a polluted science communication environment. Now,
he referred to a two thousand eleven study that he himself worked on with Jenkins-Smith and Braman, in which a scientist's headshot and credentials were presented along with
(32:00):
attributed quotes about climate change, and whether this individual was a true expert in the eyes of the subject depended entirely on the subject's particulars and views. So people
are simply quote, using the consistency of new evidence with
their group's positions to determine whether the evidence should be
given any weight at all. And this is how deniers
(32:22):
of scientific consensus become stuck in their opinions. Right, So,
if somebody presents you an alternative opinion, the scientist comes
on TV or writes a book or something like that
and says, look, here's what the science says. It's pretty clear.
This is why scientists agree. This is where the consensus
comes from, and here's why the public should agree with
(32:42):
it too. If you are part of a group that
is culturally polarized against that scientific position, you don't think
I'm being anti science. You just think this person isn't
a real expert. Why should I trust what they say? Indeed,
and I also want to point out that Kahan touches on disinformation. He says that disinformation doesn't seem
(33:03):
to have as much impact as you might think. And bear in mind that there are several flavors of misinformation. There's self-misinformation, there's motivated consumption of misinformation, there's straight-up fake news. Kahan states that while such misinformation certainly does have an impact on the world, the reality is a little bit different. He says what
these individuals do with misinformation in most circumstances will not
(33:27):
differ from what they would have done without it. So
I find this whole scenario very, very illuminating. You know, it's a helpful model not only in understanding, or trying to understand, individuals who have a differing opinion from your own on scientific consensus, but also a way to self-reflect and try and think, well, how do I think about scientific consensus? Yeah. Well, one of
(33:48):
the things that you should really take away from this,
and we should emphasize this very strongly, is that this
applies to you too. It applies to me and to you. It's not so much surprising that motivated reasoning happens, or
that identity protective cognition happens, but it's surprising that it
applies to you because it doesn't feel like it does. Yeah. Yeah,
(34:10):
when it just feels like I'm being objective, I'm trying to figure out what's true, it's those other people who are reasoning from their cultural point of view. Right. Yeah, I mean, that is how it feels. I mean, that's one of the tricky parts about this: you can't simply hold up the mirror and say, look how you're thinking. Look at the way you're processing
(34:31):
your information. Yeah. And so this actually leads to problems, and Kahan writes about this in that piece in the Harvard Law Review. He points out how this leads to really bad cultural situations, where you've got your group and another group who are both motivated to perceive facts differently for reasons having nothing to do with
(34:51):
forming accurate beliefs. You know, you're both using motivated reasoning. Each group correctly perceives that the other group is using motivated reasoning, but each group incorrectly believes that it is just looking at the plain, obvious, objective truth. And of course, when you feel like, well, I'm just looking at the plain, obvious, objective truth and this other group is deluding themselves,
(35:15):
that can lead to feelings of disgust and polarization. You're like,
why won't they accept reality? Why are they being so dishonest? Yeah,
And in this, the divide deepens even more. Right. Yeah, and of course it leads to these partisanship situations. And of course this makes the problem even worse, because once you get entrenched partisanship on an issue in the
(35:37):
public conversation, this provides even more incentive to align with your group. Right. And so it reinforces the motivated reasoning that caused you to divide in the first place. Now,
there are some things that you might think you could
do to solve the problem. For one thing, you could say, hey,
what if we just tell people: no, don't think
(35:59):
with your culture, don't think with your identity, be rational and be objective. Does that solve the problem? Well, Kahan says research says no. When people use motivated reasoning,
they tend to believe they're already being objective. They think, yes,
I am being objective. And this is due to what
he calls naive realism. This is just the belief that, well,
(36:20):
what I'm looking at is a clear and accurate perception of reality. So we're all correctly perceiving that other people are reasoning with motivation, but we're buying into naive realism about our own points of view, saying, well, I'm just looking at the facts. And this leads to that horrible state of affairs of cultural cognition, where
(36:40):
partisanship rules, these certain issues that have been
infected with the toxic sludge of culture bleeding into questions
of fact. I always end up coming back to Dr. Seuss when thinking about these issues, and not only the Sneetches, the star-bellied Sneetches, who are so caught up in the identity of their groups that
(37:02):
they're only cured of it due to sheer catastrophe. And then there's a shorter story in that same book where we have the north-going Zax and the south-going Zax, these two individuals that meet in the desert going in a straight line, and neither one budges. They can't move through each other, but neither one is going to go around. And over time a city is built around them while they're just frozen in
(37:25):
their unshakable inability to either compromise or to understand each other. Yeah. Well, so this situation
can really induce feelings of despair. I mean, there are
multiple problems here, one of which is that some issues
are becoming infected with this motivated reasoning, this cultural cognition. A toxin is how Kahan referred to it,
(37:49):
as a pollutant. Yeah, it's a pollutant that just infects
certain issues and then makes it impossible to have a
clear discussion on them, because you get people entrenched in their positions who don't budge. But then the entrenchment leads to the general worsening of the situation. It's a self-reinforcing cycle that just gets worse and worse.
(38:09):
It's like everybody's identity and their politics have all just drained out into this body of water. How
do you unpollute that enough that you can have the
unpolluted discussion again? Now, maybe that's what we should turn
to next. If you're hearing this and you're following along with us, like, if you agree that these are valid ways of examining what's going on in these
(38:31):
public conversations, you might be feeling despair, right?
How do we ever get out of this? If we
all use motivated reasoning and there are these horrible situations
where issues of fact and scientific questions are just polluted
by cultural partisanship, how do we get out of it?
We'll take a quick break and when we come back,
we can discuss why it's not necessarily always time to despair. Alright,
(39:01):
we're back. Okay, So we were saying, it can feel
like it's time to despair once you look at the
situation of partisanship, partisan reasoning, cultural cognition. But it's not
necessarily time to despair. First of all, if you're just
thinking about motivated reasoning and you accept the fact that
you use it too, it's not just those other people,
it's me, it's you. We all use it. How can
(39:23):
we ever know anything is true? Well, i'd say two
things to that. First of all, not every question is
settled through motivated reasoning, right, There are plenty of questions
where we actually do have the primary motivation of just getting an accurate answer. People do show identity splitting on whether climate change is dangerous, but they don't show
(39:44):
identity-based splitting on issues like whether X-rays are harmful to the human body. If you poll people based on their ideology and political affiliation and all the other stuff you'd be looking for, you know, liberals and conservatives, or these other groups that are often cited, like the hierarchical individualist versus the egalitarian communitarian, these groups are in
(40:06):
agreement about how harmful X-rays are to the human body. Yeah, because, as Kahan points out again, these
instances of polarization over scientific consensus are pathological in the sense that they're harmful, but they're also rare. Yeah. So it's just these certain issues that we're reasoning this way about. Not everything suffers from this problem. Many
(40:27):
issues are uncontroversial. We generally approach them with no real motivation other than just knowing what's true. The problem is that even though you're not always using motivated reasoning, you're probably not going to know it when you are. Using motivated reasoning apparently feels similar to using actual objective reasoning.
(40:49):
You can know this firsthand by the fact that you don't ever think you're using motivated reasoning. You think you're just honestly judging things. But you also know you're not right about everything. Surely some of those things you believe you're definitely wrong about, even though it feels like you're just clearly judging what's true. So is there
any way to know what's true when issues are controversial
(41:11):
and when we're motivated to reason one way or another? Well, I'd say this is where we come back to our starting principle: going with science. Right. Science is exactly a
way of getting around motivated reasoning and bias if you're
doing it right. I mean, of course it's possible to
be really bad at science, but if you're following the
norms of science, what it is supposed to do is
(41:34):
make it really hard to get away with motivated reasoning
for an extended period of time. You've got obstacles built
into science that are specifically designed to kill motivated reasoning.
So you've got rigorous empirical method using objective measurement criteria,
trying to take your own subjective judgments out of things.
You've got blinding and double blinding of experiments where you know,
(41:57):
you get people who don't even know what's going on
to perform the experiment. People in the experiment don't necessarily
know what's going on. You've got peer review by critical experts,
you've got replication attempts, you've got professional competition. This is
the thing that often doesn't get emphasized enough, is that
there's professional and career based incentive in science to disprove
(42:17):
the consensus. Right. Yeah, and again, like we said,
skepticism is built into the recipe, right, So if you're
doing science in a motivated way, your science number one
is not going to look very strong to begin with,
and number two, you're not going to get away with
it for very long. People are going to figure out
what you're up to. And we've seen examples of this
(42:37):
when people get caught doing scientific fraud. It seems to be fairly rare, but they get caught. Maybe people can't replicate your results; people start noticing irregularities in your data. I mean, it's a system that is just not very forgiving to this kind of nonsense. Yeah, I mean, I don't know. It depends in each case, I guess, whether it falls under fraud or bad science. But artificial
(43:00):
gravity, or gravity-repelling technology, is one example where you
do see studies that have come out where someone claims
to have developed a means of achieving this. Yeah, but the results can't be replicated. It doesn't work. It doesn't pass the tests that are built into the scientific process. Right. And so this is why science is a good way of arriving at correct conclusions about
(43:20):
the world. I mean, if you go with the scientific consensus, you might not be right every time, but it's your best bet for being right the most times of anything you could go with. So
the problem is, of course we can't all be scientists,
and even scientists themselves can't use all the tools of
science to solve every controversial question they encounter. Right, So,
(43:43):
even if you're a scientist, there's tons of stuff in
your life where you can't bring to bear all of
that machinery of skepticism and empiricism and impartiality, where you've
just got to work like everybody else. You've got to
decide on some issue of public substance what you think
about it without having the most impartial method possible. So
(44:03):
the question there, I guess, is how can we avoid deluding ourselves on issues where identity-protective cognition comes into play, where we can't use the scientific method? Yeah. Like, one of the points Kahan brings up is: how do you avoid these scenarios in the future? Because it's one thing to figure out how we unpollute this pool of scientific communication, but then how do
(44:25):
we avoid polluting the next one? How do we avoid polluting pools that don't really exist yet as
a matter of, like, public consideration? Yeah. In fact, he mentions in the panel that we should have, quote, a science of science communication, meaning that science communicators should have some experiments they can draw on that show them how to predict when an issue, some
(44:47):
innocuous question of fact, will become politicized where people suddenly
take cultural positions on it. I mean, there are a
lot of variables involved here. It depends on, you know, who's relaying the information, what their identity is, what ideals they're pushing on everybody, and how that
ends up polluting the message. It also depends on a
(45:08):
number of cultural factors. I mean, there are certain polarizing issues that are issues here in the United States that are not so in Europe, such as climate change, and then the reverse is also true. You see stuff like genetically modified organisms being more of a hot topic in, say, England than in the States. Totally. Yeah,
(45:29):
in the UK, there's way more controversy over GM crops than there is in the United States. Not to say there's not some controversy here. And so, yeah, how do you predict it? I mean, Kahan mentions the possibility of, well, maybe you can run simulations, if there's some sort of simulation system you could employ, which I love, because I instantly get this sort of Star Trek holodeck scenario where we're running simulations and
(45:52):
trying to catch these polarization points, these confusions, these pollution points before they occur, and figuring out how to communicate ahead of them. Yeah, and some
things are going to be more predictable than others, Like
there are some facts of science that, if true, tend
to be unfriendly to the world view that certain people hold,
(46:13):
tend to be unfriendly to their values. A couple of examples Kahan gives: if you're generally more of an individualist and opposed to communitarian action, this may make you inherently resistant to the idea of climate change, because really the only way that you can do anything about climate change is with organized communitarian action. Likewise,
(46:35):
if you are a person whose values are sort of
anti big business, that might predispose you to be against
GMOs, because you see them as, like, a tool that's being used by large agribusiness to get their profits, to drive other people out of business, and, you know, whip the environment into the shape they want. And I mean, it'd be worth pointing
(46:57):
out that like you could, for example, use GMOs if
you're a big business in a way that would be
very unethical, very damaging to the environment. I mean, the
whole thing about the scientific consensus on GMOs is that
there's nothing inherently dangerous about GMOs as a rule, But
any individual genetically modified organism could be dangerous, just as
(47:19):
any other organism could. Yeah, the process is not the problem. The potential problem is in the product that's created with it, which can be said of most processes. Yeah, it can be said just as equally of products that are created through traditional agriculture. It's not the gene in the lab that makes the problem. But on the other hand, Kahan points out, you know, there
(47:42):
can be other things that are not nearly as determined by our core values. It's not necessarily that conservative values or liberal values, or whatever other kind of dichotomy you want to establish in the culture, determine how your opinion comes out on them. Some things are much
more accidental. It can just be that some prominent figure
on one side of the political spectrum just sort of
(48:04):
declares for one side of a factual disagreement, and because
of group affiliation and identity, the groups just start lining
up accordingly, even though it's not determined by anything inherent
to their values. Yeah, you know what I mean? Yeah. That's a good point, because it's easier to see a polarization effect coming together, occurring
(48:25):
when either the problem or the solution disagrees or lines up with your worldview. Such as: well, the solution is for us all to come together as a nation and have some sort of top-down governmental fix; that's going to disagree with some people's worldviews.
If the solution is we're all going to eat, you know, plants that grow naturally in harmony with
mother Earth, that's going to fit another worldview more than another.
(48:48):
But when it occurs outside of those parameters, yeah, it becomes increasingly difficult to predict. It's like, what is it, a slow news week? Is this just a topic that happened to be out there during a particular politician's campaign and they just took it up and ran with it, perhaps to distract from something else? I mean, I think
it looks very very possible that there can be issues
(49:08):
where there is cultural cognition going on, where society divides
on a question of scientific fact along cultural lines in
a way that really just doesn't have very much to
do with ideology or values at all. It's just: one side picked a side arbitrarily, and then the other side,
because they know they always disagree, picks the other side.
(49:29):
All right. So in terms of solutions here, or possible solutions, I mean, on one hand, I feel like there is validity in the idea that yes, we have to continue to trust science as a process and trust scientists that are speaking on behalf of it. Yeah. I mean, if you're not yourself a scientist, if you're not yourself an expert, and you don't have
(49:50):
a good reason based in expertise in the subject matter
for disagreeing with the consensus, I'd say it's usually your
best bet to go with what most of the people
who know what they're talking about are saying. Yeah. And then beyond that, I agree with what Kahan is saying, that we do need a science of science communication. We need the ability to predict and maneuver around potential
(50:11):
pollution points in our communication of science. Yeah.
How can you preemptively defend contentious facts about reality from
becoming politicized or becoming subject to cultural cognition? That would
be a really good thing. One thing that Kahan offers
that I think is very interesting is that I get
(50:31):
the impression that he subscribes to what he calls the law of social proof, meaning that if you want to convince somebody to agree on a point of
fact that they are resistant to for cultural reasons, don't
try to keep giving them more evidence that you're right
and they're wrong, because that's just not what works on us.
(50:52):
I mean, that's what should work. On one hand,
we feel like we should do that because that would
be the logical thing to do, but psychologically that is
not what changes people's positions. What would probably be more
effective is if they simply see people who are culturally
like them and part of their in group agreeing with
this fact. But then that can also come back into
(51:14):
science communication, like, who are the science communicators, then, that are reaching out to these groups that have a certain amount of polarization present within them? Well, it makes me think that if what science communicators want to do is try to get everybody across different cultural groups on the same page, one thing that should really be encouraged is cultural diversity of science communication,
(51:36):
that there should be people from all different cultural groups
within a society, all communicating like, hey, here's what the
science says. So at least when it's a question of science,
you can be on the same page and not bring
in these cultural issues because it's not just people from
that other cultural group telling you what the science says.
You're hearing about it from people like you. So there you
(51:59):
have it. Hopefully we gave you some new tools to illuminate not only your understanding of others, but your own thinking as well, and also to better understand how science communication is working, how these blockages, these science communication breakdowns, are occurring, and how we might even treat them. Yeah, I hope,
I hope today maybe we said something of substance that
will help people. Uh, I don't know, bridge the partisan
(52:22):
divide and come to some agreement about the things we
should be able to agree on. But yeah, it's tough, man, the partisan divide. It's the thing I often think about in how we communicate stuff like this, and it can get you down at times, but we shouldn't despair. We should try to find ways to get around it. Come together, have one of those big happy,
(52:43):
what were you talking about, grow-food-together moments? Oh yes, yes, in a nice communal Kumbaya moment. Yeah, you might have just alienated somebody. Oh yeah, probably did. So, hey,
did we alienate you? Did we illuminate anything for you? We of course love to hear from you. Check out the podcast at Stuff to Blow Your Mind dot com, where you'll find all the episodes, videos,
(53:05):
blog posts and links out to our various social media accounts, and certainly you can always contact us there: Facebook, Twitter, Tumblr, Instagram, you name it. And if
you want to get in touch with us directly to
let us know feedback on this episode or any other,
or to suggest a future episode topic, you can email
us at blow the mind at how stuff works dot com. For more on this and thousands of other topics,
(53:36):
visit how stuff works dot com.