
August 24, 2019 73 mins

If a lie is repeated often enough, are we more likely to believe it? Sadly, the answer is yes. Psychologists call it the illusory truth effect, and it influences both our daily lives and the larger movements of politics and culture. Join Robert and Joe for a two-part discussion of untruths, the human mind, and just what you can do to fight the big lies at work in your world. (Originally published July 10, 2018)




Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick, and it's Saturday. Time to go into the old vault. This episode originally aired July 10, 2018, and this was part one of our discussion of the illusory truth effect, one of the many biases that, unfortunately, affects all of our brains and makes it harder for us to know what's true. Yeah, this

(00:28):
is one that's, uh, you know, gonna shake some of your foundation stones. I think, you know, this one's gonna make you rethink the way you interact with the world and how you interact with truth. So, uh, you know, it's maybe a harrowing journey at times, but I think you're gonna emerge on the other end stronger. And part two will come a week from today. Your today, not our today, as we record this. Welcome to Stuff

(00:54):
to Blow Your Mind from HowStuffWorks.com. Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick. And today we're gonna be talking about one of our favorite subjects: our tendency to believe things that aren't true. Now, Robert, I

(01:14):
wonder, is there a false factoid or claim that you
just always find yourself recalling as true even though you've
checked it before and discovered it to be false in
the past. Yeah, this is an interesting question because I
feel like there are things that come up in research
all the time, certainly key things over the years. Uh,

(01:34):
you know, as we research and write on different topics, where I have to correct, you know, where I think I knew something, and then I'm like, oh well, now that I actually do the research, it's not a fact. And then, you know, the same goes for false beliefs, beliefs that creep in, the sort of Mandela effect type of scenario.
For instance, there was a time when I thought Gene

(01:55):
Wilder was dead prior to his actual death, and then, uh, he's actually dead now. He is actually, really dead now. So I thought he was dead before he was dead. Exactly. Because, and I think it was just a combination
of he was not as active anymore, and I wasn't
really keeping up with the Gene Wilder filmography and like
current events related to Gene Wilder, and something maybe I

(02:15):
picked up on some news piece at some point, and
somehow he got clicked to the dead category. And then
when I found out he was alive, it was really like he came back to life. And I had the same thing happen literally just the other day with the stand-up comedian Larry Miller. I don't think I remember who that is. Oh, he had a kind of, you know, dry observational comedy. I think

(02:37):
he still acts, but he would show up on, say, Night Court, I think. Oh, wait a minute, I've seen him in some Christopher Guest movies. He may have been, yeah. But
for some reason years ago, I got into my head
that he had passed away, and so occasionally I would
think of Larry Miller and was like, oh, yeah, I
remember Larry Miller. Too bad he passed. And then I
actually looked him up the other day and it turns

(02:59):
out he had not passed away. He's still very much alive and active. And I was just living in this fantasy world of dead Larry Millers. You know, I
have false beliefs that recur with much more significance, like
I keep remembering. Yeah, maybe it's just because I
was told this all the time when I was a
kid that vitamin C supplements will ward off colds. That

(03:22):
is not experimentally proven. That is, like, not a finding of science. And yet, if I haven't checked in a while, it just seeps right back in, like, yes, that is true, vitamin C, it'll keep colds away. Well,
it's easy to fall into the trap. I do this
all the time with various vitamins and some supplements
where I'm like, I don't know if it works, probably

(03:42):
doesn't work, but I'm going to go and take it just in case, because it's vitamin C, you know. What's the harm there? It's kind of like believing in God just in case he exists, believing in vitamins. Yeah, but then you end up with, like, a weird sort of vitamin tentacle going out of your neck. And you didn't see that coming, did you? That's fake news. No, Joe, vitamin C will not cause a

(04:02):
tentacle to grow out of your neck. But now you've
heard it, it's true. Um, you know, I feel like there are things that have popped up where I'll think, well, I've always heard X, but I've never actually looked it up. Um, and then that's where the problem seeps in, you know, where I just think I know something, but I'm not sure, but I don't care enough to actually investigate. Um.

(04:25):
There is one possible example that comes up, and that is, of course, the idea that George Washington Carver invented peanut butter. You know, he had something to do with peanuts. He did. So yeah, he was, he is a famous inventor, an important African American inventor. And I just, I didn't know a lot

(04:45):
about him, and I had always heard the peanut butter thing,
but I didn't actually research until I helped my son
with a class project about him earlier this year, and
then I was able to definitively, you know, check that one off on the mental list, like, okay, this is, this is false. He did not invent peanut butter. But he did do stuff with peanuts. He did do stuff with peanuts, but not peanut butter. Okay.

(05:08):
You know, a huge place where you can see false beliefs persisting, um, is in people's beliefs about, sort of, like, political facts or sociological data. A very, very common one is people's beliefs about crime. I think it's because, like, crime is one of those, like, sensational types of

(05:29):
subjects that makes people think about violence, images of blood they see on the news, and stuff like that. You know, a poll conducted by Pew in the fall of twenty sixteen found, get this, fifty-seven percent, so a majority, of people who had voted or planned to vote in twenty sixteen said that crime had gotten worse in the United States since two thousand eight. By every objective measure, exactly the opposite is true.

(05:53):
That's just not true. FBI statistics, based off of, like, nationwide police reports, found that violent crime and property crime in the United States fell nineteen percent and twenty-three percent, respectively, between two thousand eight and twenty fifteen. And so you think, okay, well, maybe, if that's just police reports, maybe fewer people are reporting crimes to the police, right?

(06:15):
But also, the U.S. Department of Justice's Bureau of Justice Statistics does direct annual surveys of more than ninety thousand households to ask about rates of crime that might not be reported to police. And, quote, the BJS data show that violent crime and property crime rates fell twenty-six percent and twenty-two percent, respectively, between two thousand eight and twenty fifteen. So a majority of people

(06:38):
are believing something that by every measure we know, is
not true. Crime has gone down, and yet a majority
of people believe it has gone up. And it's not
hard to see why that might be true when you consider, like, the political messaging of certain politicians. You could also, of course, think about just people's negativity bias, right,

(06:58):
the tendency to believe things are worse than they are in the broad sense, or mean world syndrome, looking at dangerous stuff happening on the news and thus having an
overrepresentation of it in your mind. But I think we
would be wrong to ignore the effects of hearing specific politicians. Well, for example, in twenty sixteen specifically, it was Donald

(07:20):
Trump, a lot, talking about how crime is through the roof, right? And, to your point, we can look to statistics on this. This is not something that is, uh, just, you know, in the ether. We have hard data. It's not a matter of opinion. It's just, like, every measure we have says that's not correct. But what about other beliefs? I mean, that's certainly not
in isolation. There are lots of cases where there are

(07:42):
widespread beliefs in things that are just simply factually not true. Yeah,
I'll run through a few here that range in topic. For instance, here's a nice science-related one to kick off with. In a two thousand fifteen Pew survey, only thirty percent of Americans knew that water boils at a lower temperature at higher altitudes. Thirty-nine percent said it

(08:04):
would boil at the same temperature in Denver and in L.A., again, Denver being at a far higher altitude, and others had it reversed. So the majority, something like two-thirds of people, were just flat wrong. Yes, and, uh, yeah. And
to put that in perspective with another science fact, most
Americans in this two thousand fifteen survey correctly identified the

(08:27):
Earth's inner layer, the core, as its hottest part, and
nearly as many, too, knew that uranium was needed to make nuclear energy and nuclear weapons. Well, should we be comforted by the fact that that's what people know? I don't know, like, they don't know much about water, but they know about nuclear weapons. Well, I mean, on one hand, nuclear weapons

(08:50):
is and was more in the news. Uh, and then, on the other hand, like, the inside of the earth is more engaging, and also completely unpoliticized. Well, water is not political; the boiling point of water is not politicized. But it's also not very sexy. I guess it's one of those things where, unless

(09:10):
you're actively moving from low to high altitudes, or, you know, living part of your time in Denver and part of your time in L.A., I guess it's very possible to live your entire life without really having any real-world experience with the difference. Though I do feel like if you read enough baking manuals, it comes up. Yeah, but maybe you just don't remember which way it goes.
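[Editor's note: a back-of-the-envelope sketch of the altitude claim, not from the episode. It combines the isothermal barometric formula with the Antoine equation for water's vapor pressure; Denver's altitude of roughly 1,609 meters and the physical constants are assumed textbook values.]

```python
import math

def boiling_point_c(altitude_m: float) -> float:
    """Approximate boiling point of water (deg C) at a given altitude."""
    # Isothermal barometric formula: ambient pressure falls off exponentially
    # with altitude (standard constants, 288.15 K reference temperature).
    p_pa = 101325.0 * math.exp(
        -0.0289644 * 9.80665 * altitude_m / (8.31447 * 288.15)
    )
    p_mmhg = p_pa / 133.322
    # Antoine equation for water (valid ~1-100 deg C), solved for temperature:
    # log10(P_mmHg) = 8.07131 - 1730.63 / (233.426 + T_C)
    return 1730.63 / (8.07131 - math.log10(p_mmhg)) - 233.426

print(f"L.A. (sea level): ~{boiling_point_c(0):.0f} C")     # ~100 C
print(f"Denver (~1609 m): ~{boiling_point_c(1609):.0f} C")  # ~95 C
```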

(09:31):
I guess. You know, one of the ones you've got here that has come up many times in my life, it's come up enough that I know the right answer now, is the misconception that you can see the Great Wall of China from space. Oh yeah, this is one that, I have to admit, I think I used to adhere to, again, without really referencing it, because it just had that kind of truthiness to it, right?

(09:53):
And you want it to be real, the idea that this epic structure, created, uh, you know, long ago, is visible from space. But it's actually been disproven multiple times. It's only visible from low orbit under very ideal conditions, and it's not visible

(10:14):
from the Moon at all, because that's another version of it, that the Great Wall of China can be seen from the moon. Yeah, I guess there are nuances to the word visible is what I mean. But in the normal sense that you would mean, it's not visible from space. Correct. Now, a two thousand sixteen YouGov poll, this is a UK group, they looked at how belief in Pizzagate, that thing. Yeah,

(10:38):
how that shakes out across different voter groups. Now, that was this vast conspiracy theory people had about how there was a pizza restaurant in Washington, D.C., that was, like, running child slavery rings, that was linked to the Democratic Party. Yeah, it had to do with an idea that Clinton campaign emails supposedly talked about human trafficking and pedophilia.

(11:00):
And according to this particular poll, seventeen percent of polled Clinton voters believed that this was the case, that this was a reality, and forty-six percent of Trump voters did. Um. And then there's another classic they looked at, to put this in perspective: the idea that President Barack Obama was born in Kenya. Kind of alarmingly enough, across both groups of voters,

(11:24):
both the Clinton voters and the Trump voters, they found thirty-six percent believed it, despite the fact that that, too, has been debunked time and time again. Yeah, I mean,
it's crazy that these types of beliefs can catch on so well. Especially, like, we understand very well the way that political ideology and tribal thinking affect the way

(11:45):
we form opinions. Obviously, our opinions are deeply informed by what people we view as our in-group believe, and so we want to be in line with the in-group, and stuff like that. But you also can't really ignore the fact that these are things that, if you pay attention to certain sources, you're going to be hearing over and over and over again. And what

(12:07):
effect that might have. Because it's, you know, widely accepted folk wisdom that if you repeat a lie enough, people start to believe that it's the truth. Right. That's one of the things. I mean, I don't know, you've said it enough times that I'm already convinced. Exactly. I mean, almost going along with this, uh, there's a quote that often gets sourced to the Nazi propaganda minister, Joseph Goebbels. Uh,

(12:29):
some versions of the quote say something like: if you repeat a lie often enough, people will believe it, and you will even come to believe it yourself. I couldn't find any evidence that Goebbels actually said that. It seems to be a misattribution, but it's sort of a paraphrase of similar ideas that are, you know, within that frame of thinking. Like, Adolf Hitler himself wrote in

(12:51):
Mein Kampf, quote: the most brilliant propagandist technique will yield no success unless one fundamental principle is borne in mind constantly and with unflagging attention. It must confine itself to a few points and repeat them over and over. Here, as so often in this world, persistence is the first and most important requirement for success. So just the fact

(13:13):
that Hitler said it obviously shouldn't make us think, well, you know, he's right. With Hitler, though, if Hitler was good at anything, it was getting lots of people to believe lies. Certainly. So I think, because this is such an important issue, and because widespread misconceptions are so common, and because they can, in fact, especially in some political circumstances, be so destructive, and because the repetition of lies and

(13:36):
false statements, at every scale of existence, you know, in mass media and in our personal private lives, is so common, I think it's worth looking at the actual empirical case. Is this true, the idea that repeating statements over and over, does that actually change what we believe? It's one of those things that, you know, sounds

(13:57):
so commonsensical, you just assume it's true. But according to the logic we're using now, those are exactly the kinds of statements that maybe we should be careful about. Yeah,
and I think this is an important topic for everybody. I don't care who you voted for in any previous elections, or which political party in your given system you adhere to. I think if you're listening

(14:19):
to this show especially, you want to think for yourself. You want to reduce the amount of manipulation that's going on with your own view of reality. And that's what we're going to discuss here today. We're gonna discuss the degree to which false information can manipulate

(14:39):
our view of reality, and, ultimately, what are some of the things we can do to hold onto our individuality in all of this. Exactly. So this is going to be the first of a two-part episode where we explore the liar's best trick: the question of repetition and exposure in forming our beliefs and changing our attitudes. So that's going to be the jumping-off

(15:01):
point for today's episode. Does exposure and repetition, hearing a claim and hearing it repeated, actually have the power to change our beliefs? Or is that just unverified folk wisdom? Yeah.
And of course, it goes well beyond politics. It also gets into marketing. You know, we've touched on the manipulative nature of marketing and advertisement on the

(15:23):
show before, and it's one of the things you always come back to. It's always about messaging, right? Like, what is the message of the product? What's the message of the ad campaign? And how they just continue to hammer that home. Why do brands have slogans? Yeah, why don't they just tell you a positive message about the brand that's different every time? Why do they tell you

(15:43):
the same message in the same words in every commercial? Yeah, why did those fabulous horror trailers from the nineteen seventies say the name of the film eighteen times? Don't go in the basement! Don't go in the basement! No one under seventeen will be admitted. Yeah, it's all kind of part of the same situation. All right, well,
we're going to take a quick break and when we

(16:04):
get back, we will dive into the research and the history of psychology about repetition and exposure. Thank you. Thank you. All right, we're back. So the first question we're going to be looking at today is whether anyone has actually studied this question, this question of whether exposing people to a claim and then repeating the claim makes them believe it,

(16:27):
whether anybody has studied that in a controlled scientific context. And the answer is a resounding yes. There are, I think, dozens of studies on this subject in various forms. Probably the flagship study on this, the first big one that everybody cites, that really got people into the subject, that got the ball rolling on it, was from nineteen seventy-seven, and it was by Lynn Hasher, David Goldstein, and Thomas Toppino,

(16:51):
and it was called Frequency and the Conference of Referential Validity, in the Journal of Verbal Learning and Verbal Behavior. And that was, as I said, in nineteen seventy-seven. So the authors start out in the study by talking about how most studies of memory that take place in the lab involve useless or meaningless information units. So researchers, for example,

(17:13):
might try to see how well subjects remember a phrase like, I just made this up: the purple donkey was made of soft-spoken muscular elves. Like, can you remember that word for word? The purple donkey was made by muscular elves. Now, now you're close. Not made by. Yeah, it makes a lot more sense to say made by, wow.

(17:34):
But see, I've already messed up the origin story
of the purple donkey. Statements like this have no importance
in the real world, and part of what they were
talking about is that we're testing memory on things that don't have any validity to reality. Um. So the
authors write that they're curious about what kind of processing
subjects do with information units that might have validity in

(17:56):
the real world. For example, factual statements like, quote, the total population of Greenland is about fifty thousand, which at the time of the study it was; I checked. Though that seems like a lot more people than should be in Greenland, right? I was surprised by that. Well, I would agree, based on, albeit, a limited amount of information that I've read and viewed

(18:18):
about Greenland. You know, typically, in my experience, Greenland shows up in nature documentaries, and of course you're going to see rather barren, uh, locations in those films. Well, Greenlanders out there in the audience, let us know if you're listening: what's life like up there in Greenland? I'm interested now. But anyway, back to it. So, yeah, the population of Greenland

(18:39):
at the time was about fifty thousand. And so statements like this
both refer to something that could be true or false
in the real world, and there are also things that
people are probably uncertain about, like do you know what
the actual population of Greenland is? I didn't know before
I looked it up, And so we know that the
statement is either true or false, but we aren't sure
whether it's true or false. And of course, to go

(19:00):
back to a previous episode, you're kind of anchoring my expectations by throwing that out; without any population data in my head about Greenland, that's suddenly all I have to go on. Oh yeah, that's interesting. So, like, it could be three thousand, or it could be, like, a million, and either way, you're sort of moving your guess range toward fifty thousand.

(19:22):
So the thing they point out is, even though most
people don't know what the population of Greenland is, we're
often willing, and to some extent able, to make guesses as
to whether statements like this are true. So where does
this semantic knowledge come from? When we feel like we
have knowledge to offer a guess about what the population
of Greenland is, even when we don't really know. What

(19:43):
is it that allows us to judge these questions? And
the authors note that frequency is a really powerful variable
in all kinds of judgments we make about the world.
So they hypothesize that, quote, frequency might also serve as
the major access route that plausible statements have into our

(20:04):
pool of general knowledge. So the idea is that we
build our knowledge base based on how frequently we are
exposed to ideas. You hear an idea a lot, and
that gets reinforced in the knowledge base. You've never heard
an idea before, or you don't hear it a lot,
it doesn't get reinforced, and it doesn't exist in the
knowledge base. So here's the experimental part. Researchers came up

(20:26):
with a list of a hundred and forty true statements
and false statements, crafted so they all sound plausible. Yeah,
they could be true, you know, the average person would
be unsure whether or not they're true. And the statements
were on all kinds of subjects like geography, arts and literature, history, sports,
current events, science. A few examples of true statements included

(20:47):
things like: Cairo, Egypt, has a larger population than Chicago, Illinois; and: French horn players get cash bonuses to stay in the U.S. Army. True. Well, I'll be. I should have joined the army after all. See, I was a French horn player, really, in high school. Yeah, I didn't know that. What's it like playing the French horn?
It's just, you know, there's a lot of spit and

(21:08):
a lot of shoving your hand up the horn. That's it. Otherwise, it's like playing a trumpet. Now, see, I actually played trumpet, and it's a lot less fun than what you're describing. Yeah, I mean, I don't know. There is
an elegance to the way you hold it, and, again, you have your hand inside the horn. I don't know. Being a trumpet player, to me, always felt

(21:28):
like being a person who's complaining at full volume. Yeah, yeah. There's more of an outward stance with the trumpet, right? You're blasting outward. But with the French horn, it's more like you're playing music into yourself. That's quite beautiful. More beautiful than any music I ever played on the French horn. Uh, so we've got to get back to

(21:51):
the study. Okay. So that's supposedly true: French horn players at the time got cash bonuses to stay in the U.S. Army. Examples of false statements were things like: the People's Republic of China was founded in nineteen forty-seven. It was actually nineteen forty-nine. Or: the capybara is the largest of the marsupials. That's not true. The largest marsupial is the red kangaroo. The largest known in

(22:11):
the fossil record is this thing called the extinct Diprotodon. Now, the capybara is a rodent, right? It's not a marsupial. I didn't even look it up; I only know that because I go to a lot of zoos these days. But it is a mammal. It's a rodent. It is a mammal, and it is a rodent, not a marsupial. Well, there we go. So it's wrong in multiple ways. So

(22:32):
you've got this big list they came up with of
true and false statements, and all of them should sound
plausible to the average person, But most people are not
going to be likely to know for sure whether they're
true unless they just happen to have some special random
knowledge or expertise. And so researchers held three sessions with participants,
each separated by two weeks, and on each of the sessions,

(22:53):
the participants were played back a tape of a selection
of sixty recorded statements from that list, and the subjects
were asked to judge how confident they were that the
statements were true. And this was on a scale of
one to seven, with like four being uncertain, five being
possibly true, six being probably true, seven being definitely true.

(23:14):
And in each session, some of the statements were true,
some were false. But here's where the real magic happened.
At the second session and the third session, each time
subjects got a mix of new true and false statements
that they'd never seen before, plus true and false statements
that they had already seen in the previous sessions. So

(23:34):
while most of the claims they saw were new, a
minority got repeated each time. And what the researchers found
was that whether a statement was true or false, the
more times the students saw it, the more they believed it.
So, again, this would be the principle in action. The more they're hearing this, uh,

(23:55):
this false fact, the more they're coming to believe that it is true. Yeah, even in this constrained, kind
of weird experimental context where they're aware that some of
these facts are going to be false, it's not like
they're being told this persuasively by a person trying to
convince them. They're just reading this from a list of
statements that are known to be either true or false.

(24:17):
I mean, there's no persuasive aspect to this at all, right? Right. These are not politically charged, or really charged by worldview at all. They're just plain, neutral statements that really have very little interest to most people, also, probably. But what happened was, whether the statement was true or false, people believed it more if they saw

(24:39):
it more times. So I've got a little chart in here of what happened with the false statements. You can have a look at it, Robert. As you can see, the new false statements, the false statements people saw for the very first time, hovered around, you know, like four or four point one across all three sessions. That would correspond to people saying they're uncertain: I don't know. I don't know whether French horn players get a cash bonus

(25:00):
for staying in the army and playing into the self and sticking the hand up the horn. But in the second session, the repeated false statements jumped up from about four point oh or four point one to about four point five, and then, in the third session, up again to about four point seven. And we only saw what happened with two sessions. Who knows what might have

(25:21):
happened if you had continued adding more sessions. So just
seeing a statement more than once appeared to make it
more believable even though it wasn't true. And the pattern
was roughly the same for true statements, which isn't all
that surprising since the experiment was based on, you know,
statements that people didn't know whether they were true or
false to begin with. So, the authors wrote in conclusion, quote,

(25:42):
the present research has demonstrated that the repetition of a
plausible statement increases a person's belief in the referential validity
or truth of that statement. I don't know why they
had to say referential validity. They could have just said truth. And that's some science writing for you. Uh, in
the truth of that statement. And indeed, the present experiment

(26:03):
appears to lend empirical support to the idea that quote,
if people are told something often enough, they'll believe it.
So yeah, this is the first real study to find this. And a few other things the authors thought were worth considering. Uh, the fact that this effect was displayed in statements from a big, broad pool of different types of subject matter suggests this is

(26:24):
not extremely context dependent. Right, It's not just going to
be political beliefs that are subject to this. It seems
to be all different kinds of statements in all different
kinds of domains. Another thing they noted was that the
effect was present for true statements and false statements. Either way,
if students saw the claims more often, they believed them
with greater confidence. But another takeaway is that the effect

(26:46):
is not huge for false statements. Three exposures was roughly
enough to get you from I'm uncertain to it's possibly true.
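[Editor's note: a minimal sketch, not from the paper, just tabulating the approximate mean ratings quoted above for repeated false statements on the one-to-seven scale, to make the size of the drift concrete.]

```python
# Approximate means read informally off the chart discussed above (Hasher,
# Goldstein, and Toppino, 1977); treat the exact decimals as illustrative only.
repeated_false = {1: 4.1, 2: 4.5, 3: 4.7}  # session -> mean rating, repeated items
new_false = 4.1                            # false statements seen for the first time

for session, mean in repeated_false.items():
    print(f"Session {session}: mean {mean:.1f} ({mean - new_false:+.1f} vs. new)")
# Ratings drift from "uncertain" (4) toward "possibly true" (5) with repetition.
```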
But as we mentioned before, this is just two or three sessions, right? Right. Who knows what would have happened if maybe you had done this more times in a row, or if there had been other

(27:07):
factors affecting whether people were likely to believe these things
to begin with, say, if they had valences to the person's political identity or something like that. Yeah, you know, as we were researching this and discussing it, I couldn't help but think of notable examples of, uh, false stories about, generally, like, celebrities from the past. And I'm

(27:29):
not going to mention any of them specifically. Why not? Well, because, you know, they all tend to be a bit crude. There are several of them about, like, tearing down various sort of, you know, pretty-boy rockers or actors from the past. The interesting thing about them is these are generally, like, pre-internet, um, stories that had to circulate by word of mouth,

(27:52):
or, I think in one case, there was talk of, like, a whole bunch of faxes going out in Hollywood, where someone basically just wanted to take somebody down because they didn't like them. I remember, I think, like, ninth grade, here I am starting to hear this bizarre story about Richard Gere. Yeah, that's the main one I'm thinking of. And I think it basically comes down to: Richard Gere is a handsome, successful

(28:13):
guy, and for a lot of people, you want to, like, really, you know, knock him down a notch. Yeah. And so when you encounter a bit of slander or libel like that, or just a ridiculous story, you're going to be more inclined to believe it if you kind of want it to be true, right? Or you're like, yeah,

(28:35):
let's screw that guy, I'm gonna go ahead and believe this. Or even if I don't believe it, I'm going to pass it on. But either way, whether or not you're predisposed to believe it's true, it looks like this initial study at least provides evidence that you would be more disposed to believe it's true in either case. Like, whatever your starting point is, it's gonna nudge you up. Like, if nothing else, it becomes word association. Like, if you're

(28:58):
not really a fan of, say, Richard Gere's work, and you can't name your favorite Richard Gere film off the top of your head, that might be the primary keyword that pops up when you hear his name. Yeah, it could be. Uh, yeah. So this bizarre effect
that we're talking about where hearing a fact repeated, even
if you've got no good reason to believe it's true,

(29:21):
just hearing it repeated causes you to be more likely
to believe it. This came to be known first as
the truth effect, and then, later on, probably a better title, the illusory truth effect. I think we should use the second one, because otherwise it makes it sound true. Yeah, it just makes it sound like, yeah, if you repeat something, then it is true. There's no question anymore. So, the basic version of

(29:42):
the illusory truth effect is quote. People are more likely
to judge repeated statements as true compared to new statements,
And another way of putting it is that all other
things being equal, you're more likely to believe a claim
if you've heard it before than one you haven't heard before,
and the more times you hear the claim, the more
likely you are to believe it. But so far we've

(30:04):
just talked about one study, right? This one nineteen seventy-seven study, uh, what we can call the Star Wars study. If you want, the Star Wars study, here. It's a fairly small sample, just one study, if you want to be skeptical and rigorous, maybe especially because this backs up folk wisdom, which is always something you should be careful about. We should see if the effect has been replicated by other researchers, and boy howdy, it has.

(30:27):
That's right. This next one comes to us from the nineteen seventy-nine Journal of Experimental Psychology: Human Learning and Memory, the work of Frederick T. Bacon. Yeah, this was called The Credibility of Repeated Statements: Memory for Trivia. So Bacon, he was trying to replicate this effect. He performed additional experiments to test the previous team's conclusions and

(30:49):
add some nuance. So, his first experiment: you got ninety-eight undergrads, and they had two sessions in which they were asked to rate sentences as true or false, with three weeks between the two sessions. And Bacon found that the repetition, illusory truth, effect was modulated by whether the subjects consciously believed that a sentence had been repeated. That is,

(31:10):
if they remembered that they had seen the sentence last time,
they were more inclined to believe it. If they believed
they were seeing a sentence for the first time, they
were less likely to believe it. And this was true
regardless of the statements themselves. I can't help but think of our modern version of this with the Facebook feed, right? Because, inevitably, if you're a Facebook user, or perhaps

(31:32):
if you're a Twitter user or on some other social media, you're scrolling down, right, and there are a lot of sentences coming at you. Some you just kind of read in passing, some maybe you don't read at all. But are you actually stopping to really think about what a particular headline or, you know, paragraph is saying, or is it just kind of scrolling in the background of your mind? Yeah. And the result of this one

(31:54):
experiment here would seem to indicate, if it has validity, it would mean the ones you stop and pay attention to and make a memory about are the ones you're more likely to believe later on. But then also,
in another experiment, he had a group of sixty four
undergrads and he replicated the illusory truth effect and found
that students believed repeated statements to be more credible even

(32:15):
if the students were informed that the statements were being repeated.
So you can directly tell somebody, hey, I know, I
just asked you if it was true or false that
zebras could automatically detach their own tongues and fling the
tongues at attacking hyenas. I asked you that same thing
three weeks ago. It may or may not be true,
And even in this case, repeating the statement still makes

(32:38):
them judge it to be more true than statements they're
seeing for the first time. So you can warn people
that something fishy is going on and they still fall
for it. So you could straight up share a piece of just undeniably fake news on social media and say, hey guys, this is fake news. This has been totally debunked. Um, you can look it

(33:00):
up on Snopes, et cetera. And that's still not going to completely disarm the piece that you're sharing. Well, I would say yes; we will talk more about that in the second episode, where this kind of thing comes into conflict with real-world beliefs. And just to be clear, I made up that zebra thing. That wasn't from the Bacon study. I thought that would be clear. But that's, number one, not true. Number two,

(33:22):
as far as I know, not one of the examples Bacon used. Right. Well, I'm sorry you had to drag zebras into all this, Joe. Well, you know, I like the idea of a weaponized tongue. That's gonna go beyond the X-Men. Surely that exists in reality. Well, yes, but not with zebras. No, no, I guess that's amphibians and stuff. Okay, okay. So, back to the study. So Bacon says in his abstract, quote: it was further

(33:45):
determined that statements that contradicted earlier ones were rated as relatively true if misclassified as repetitions, but that statements judged to be changed were rated as relatively false. So even if you only misremember that you saw something before, you're more likely to believe it's true. It's kind of odd. That

(34:06):
makes you wonder: what's the initial stimulus that caused you to misremember that you had seen it before? Well, as we've discussed on the show before, I mean, there are multiple ways that false memories can be encoded. Oh yeah, absolutely. And so Bacon
concludes that basically, people are predisposed to believe statements that

(34:26):
affirm existing knowledge and to disbelieve statements that contradict existing knowledge.
That's not all that unusual, right? But it's specifically the repetition effect that seems to be playing a role here. Let's take a look at another study. How about nineteen eighty-two: Marian Schwartz, Repetition and Rated Truth Value of Statements, from the American Journal of Psychology. So Schwartz here has

(34:49):
conducted two experiments on what psychologists were, by this time, calling the truth effect, what we're calling the illusory truth effect, um. So, experiment one: you get a group of subjects and they rate claims on a seven-point truth-value scale, just like in the first study, the Star Wars study, the seventy-seven study, um. And a different group of subjects rated the same statements on a seven

(35:10):
point scale of how familiar they were with the statements before the experiment started: how familiar are you with this? Repetition increased both ratings. So both pre-experimental familiarity as well as the perceived truth value went up when people saw them more than once. That's not surprising.
Again, the replication. And then also, the fact that you

(35:33):
have seen something before will tend to make you more familiar with it. Then you've got another experiment here. The second one replicated the illusory truth effect again, and found that it didn't matter whether you mixed up repeated statements that people had seen before with new statements, or only showed them repeated statements. Either way, belief in repeated statements went up.

(35:53):
And this was done so that they could rule out
the possibility they're thinking, you know, maybe it's only by
contrast with new and unfamiliar statements that repeated ones seem
more credible. That is not the case. Either way you
do it, if you've seen it before, you believe it more.
And so this study is taken as evidence that the
feeling of familiarity with an idea might be an important part,
or even the most important part, of how we judge

(36:16):
something as true or plausible. But we should shift to asking the question of why. Why would increasing familiarity with a statement through repetition make it seem more true to us? It makes me think about this passage from Wittgenstein, in
his Philosophical Investigations, about how absurd it would be to
use repetition of a mental representation as evidence that the

(36:39):
representation is correct. He writes, quote: for example, I don't know if I've remembered the time of departure of a train right, and to check it, I call to mind how a page of the timetable looked. Is it the same here? No, for this process has got to produce
a memory which is actually correct. If the mental image

(36:59):
of the timetable could not itself be tested for correctness,
how could it confirm the correctness of the first memory?
As if someone were to buy several copies of the
morning paper to assure himself that what it said was true.
And that's kind of what we're doing. Like, he's talking about mental images, but the general point is a good one. We're essentially buying several copies of the same

(37:22):
newspaper to increase our belief that what the newspaper says is actually accurate. Now, one possible interpretation that comes to mind is just, like, the idea of, say, picking out stepping stones to cross a creek. Right? You step to one stone and it doesn't slip out

(37:42):
from underneath you, and so you use that to make your way across the other stones and hopefully make it across the entire creek without getting your feet wet, or falling in and being swept downstream to the waterfall. So to what extent are we just, like, trusting anything that hasn't resulted in catastrophe thus far? Well,

(38:02):
I would say that it would make more sense for that to be true with sort of embodied, physical, experiential knowledge about the world than it would for that to apply to semantic knowledge of things people tell us. Or maybe our brains just aren't good at differentiating with semantic knowledge that's imparted through words. You know, maybe somebody saying all those stones will hold

(38:25):
you up is encoded by the brain in sort of
the same way as testing out one stone at a time.
Uh, yeah, I don't know. So this is what we should explore for the rest of the episode, I think. Why should repeatedly exposing ourselves to the same information increase our confidence in it, if we didn't have good reasons to believe it the first time? It's clear that this

(38:45):
is what's happening, but why does it happen this way? All right, we'll take one more break, and when we come back, we'll jump into this. All right, we're back.
So we're asking this question of why repeatedly exposing ourselves
to the same information would increase our confidence if we
didn't have good reasons to believe the information the first time.

(39:07):
It's clear from several experiments that this is what happens in our brains: if a statement is repeated, we believe it more. But why do our brains work that way? It doesn't necessarily make sense. Yeah. And one possible interpretation that came to mind, of course we've touched on this before, is that we're all social animals. Yeah. So I've wondered if this is a byproduct of

(39:29):
the drive to fit in with a given group or tribe, that there's ultimately a survival advantage in getting along with the group. And so does that bleed over into highly repeated or highly circulated lies or untruths? So, basically, like, if there is a lie going around in the group, you'll get along with the group better if you just accept the lie. Yeah. And I'm not, you know, certainly,

(39:50):
after looking at more of the research, I'm not arguing that that is the core, um, mechanism involved here. This is worth exploring. But I do, like, wonder to what extent that's playing a role. Because we all have our groups that we are involved in, our friends, our family, our work groups, our social media groups, our sort of echo chambers that we find online. And, uh,

(40:12):
does it make you more susceptible to the lie, just because there is this ingrained need to fit in with that group, to share the same values, and, to put it in the prehistoric framework, to continue to have access to the fire and the feast? Yeah, I think that's a possibility

(40:32):
worth exploring. Let's take a look at it. Okay. Well, I started looking into this a little bit, and I ran across a paper titled The Evolution of Misbelief, from two thousand nine. This is published in Behavioral and Brain Sciences, and it was by Ryan T. McKay and Daniel Dennett. Daniel Dennett! All right, so they approached

(40:54):
the following, I guess you could call it, paradox in the paper. Given that we evolved to thrive in a fact-based world, what other kind of world could there be? Exactly. Yeah, I mean, we're dealing with actual reality here. But given that we've evolved to thrive in this world, shouldn't true beliefs be adaptive and misbeliefs be maladaptive? It's clear that in many cases, probably

(41:18):
most cases, that is the way things are. Right. Believing
that you are able to fly off the edge of
a cliff is not good for you. Believing that polar
bears want to cuddle with you is not advantageous. Holding
false beliefs like this doesn't work out well for people. Yeah,
they're reckless and dangerous misbeliefs. Clearly, like,

(41:38):
if you've reached the point where you're believing in that,
you're going to go extinct. So it's obvious that there
is going to be at least some kind of major
selection pressure in the brain for shaping brains that believe
mostly true things, unless there are cases where believing something
that's false outweighs the drawbacks, essentially. So here,

(42:00):
here's what they wrote, quote: on this assumption, our beliefs about the world are essentially tools that enable us to act effectively in the world. Moreover, to be reliable, such tools must be produced in us, it is assumed, by systems designed by evolution to be truth-aiming, and hence, barring miracles, these systems must be designed to generate grounded beliefs.

(42:24):
A system for generating ungrounded but mostly true beliefs would
be an oracle, as impossible as a perpetual motion machine.
I like that. Yeah, So there's got to be like
a grounding procedure through which we can discover true beliefs
if we're going to have them. Otherwise we're just talking
about magic. But we have to account for these varying
levels of misbelief and self deception in the human experience.

(42:46):
They write: if evolution has designed us to appraise the world accurately and to form true beliefs, how are we to account for the routine exceptions to this rule, instances of misbelief? Most of us, at times, believe propositions that end up being disproved. Many of us produce beliefs that others consider obviously false to begin with, and some of

(43:07):
us form beliefs that are not just manifestly but bizarrely false. How can this be? Are all these misbeliefs just accidents, incidences of pathology or breakdown, or, at best, undesirable but tolerable byproducts? Might some of them, contra the default presumption, be adaptive in and of themselves? I like this distinction

(43:29):
they're making. I think this is actually useful. So they're breaking misbeliefs down into two basic categories, right. Right. One: those resulting from a breakdown in the normal functioning of the belief-formation system. This would be delusions, malfunctions, so things like face blindness or Cotard syndrome. Okay, this is when the brain

(43:52):
is creating incorrect beliefs because it's not working right, it's
not doing what it's supposed to be doing. But then
the second category are those that arise in the normal course of that system's operations. So, beliefs based on incomplete or inaccurate information. This would be a case of manufacture. And we'll get into examples of

(44:12):
this in a second. There could be tons of examples. One that comes to my mind that would be an example of this would be optical illusions. When you witness an optical illusion, you have a false belief that has been generated by your brain. But it's not because your brain is doing anything wrong. It's just because, like, it's being exploited by a situation that's not part of what it normally needs to do. Right. Yeah,

(44:34):
they point out that it's easy to think of these in light of an artifact. Is it failing due to a limitation in the design, in a way that is culpable or tolerable? Examples here being, say, a clock that doesn't keep good time versus a toaster oven that doesn't keep time at all. You can't expect the toaster oven to keep time unless it's got a timer, right? Well, yes, so that's true. Yes,

(44:55):
I would have said a purple donkey built by muscular elves that doesn't keep time, because you wouldn't even expect it to. That's true. Yes. Now, but it gets more complicated when you go into the biological realm, because what counts as immune function or dysfunction? A pathogen, an infection? Ultimately, the immune system errs by defending the body against, say, a

(45:16):
transplant organ that may ensure its survival, because the body is going to attempt to reject that heart transplant, even though the heart transplant could save the patient, will save the patient. So it seems like, in order to understand this, you almost have to understand the context. Right, right. They invoke the work of Ruth Garrett Millikan,

(45:38):
who says that we can't look to an organ's current properties or disposition; we have to look to its history. That makes sense to me. Organ transplants, of course, are not part of our evolutionary history. So this is just the body functioning normally in rejecting the invader heart. Right. The body is not malfunctioning; it's doing what it's supposed
to do. We're just throwing a situation at it that
it's not prepared to deal with. Yeah, so that brings

(46:01):
us to the more human examples, you know, lies and, uh, so forth. Oh, that's interesting. So a lie could be like a thing that our bodies were not really prepared to deal with very well, which is weird to think of because of how common lies are. Yeah,
they write: however adaptive it may be for us to believe truly, it may be adaptive for other parties if

(46:23):
we believe falsely. Now, of course, just to interject here, I think this is something that we ultimately see holds true with other animals, like the role of deception, certainly in hunting and defense, and even in acquiring mates. They continue: an evolutionary arms race of deceptive ploys and

(46:44):
counterploys may thus ensue. In some cases, the other parties in question may not even be animate agents, but cultural traits or systems. Although such cases are interesting in their own right, the adaptive misbeliefs we pursue in this article are beneficial to their consumers. Misbeliefs that evolved to the detriment of their believers are not our quarry. So they

(47:07):
stress the difference between beliefs and what they refer to as aliefs. For instance, if I'm freaked out by tall buildings, I might not believe that I'm going to fall off, but I might alieve that I'm going to fall off. Alieve, as in, like, a-lieve? Yes. Yeah, and in this case, it seems to be something that is a

(47:28):
tolerated side effect of an imperfect system. But it's not McKay and Dennett that end up bringing up the illusory truth effect, but psychologist Pascal Boyer, in commentary on the paper. Uh, this particular paper from McKay and Dennett, by the way, is available online. I'll try to include a link to it on the landing page for this episode. But in his commentary, Boyer writes: dramatic memory

(47:52):
distortions seem to influence belief fixation. For instance, in the
illusory truth effect, statements read several times are more likely
rated as true than statements read only once. People who
repeatedly imagine performing a particular action may end up believing
they actually performed it. Oh yeah, this is something I've
read before. Yeah. So if you just, like, have people walk through a task in their mind and

(48:15):
then ask them later if they remember doing it, a lot of times they remember physically acting it out. Yeah, I've certainly had this occur with me. Like, there'll be something I need to do, and I'm thinking about doing it, and then I can't remember if I actually carried it out. And this is, uh, this is called imagination inflation. He writes:
Misinformation paradigms show that most people are vulnerable to memory

(48:37):
revision when plausible information is implied by experimenters. In social contagion protocols, people tend to believe they actually saw what is, in fact, suggested by the confederate with whom they watched a video. So he's just listing lots of the ways that we end up with false beliefs.
There's a plethora of examples of mechanisms for putting false

(48:59):
beliefs in our brain. Right. Yeah, I know there's a lot of territory covered in this paper and in the attached responses, but I keep coming back to the sort of key reason that I sought it out. Like, when is self-deception helpful? Is it necessary for the deception of others? It doesn't quite seem to be. Like,

(49:19):
you don't have to believe the lie yourself to tell someone else the lie, regardless of what telling the lie repeatedly might do to you. Well, so Boyer is skeptical of the idea, right? So is he basically saying, like, you don't want to overstate the adaptiveness of believing lies? But yeah, he writes something like: memory need only be

(49:39):
as good as the advantage in decision making it affords. Okay,
so he's essentially going for the byproduct thing for most beliefs. He's saying, like, look, you know, memory needs to do certain things, and in the course of doing those things, it may generate some false beliefs. We don't have to assume that those false beliefs themselves are beneficial. Right. Yeah,

(50:01):
And to come back to McKay and Dennett, they point out that natural selection doesn't seem to care about truth; it only cares about reproductive success. So there are various cases where a particular false belief or misbelief is seemingly adaptive. You believe in a nonexistent fire god, okay, but say that its laws inhibit overt selfish behavior that would get you in trouble and not work out for you

(50:23):
in the long run. So, in that case, you have an adaptive misbelief. Now, if the fire god were ever to actually appear, then this would be an adaptive belief. But then there are arguably a whole host of other false ideas that seem adaptive: positive self-deceptions about ability, the placebo effect, for instance. Um, they bring up the self-theories of intelligence, the entity and incremental views of intelligence. Um.

(50:47):
This being, like, am I born with a certain, uh, intellect, or do I develop it over time? And how those different core beliefs can affect your effectiveness in life. Like, does it mean, like, oh, I have to work really hard in order to stay on top of this, or is it a situation where, oh, I'm brilliant, I can accomplish anything? And, of course, I think

(51:08):
you can argue for pitfalls on both sides. And of course, there's always the optimal margin of illusion in play, which comes to us from Roy F. Baumeister. And, you know, ultimately, crazy overconfidence, as we discussed, is going to lead to extinction. Right, you don't want to cuddle the polar bear. Right. Cuddling the polar bear, thinking you can fly, uh, these are

(51:28):
going to lead to you falling off the side of a mountain or winding up in a polar bear's tummy. Yeah. Now, I could certainly understand the idea of socially adaptive misbeliefs. I think those things definitely do exist, and in some cases there might be some overlap with the types of things that get repeated so often. Like, reasons for believing untrue things can also compound each other. I mean,

(51:53):
I'm about to explain why I think false beliefs gained through exposure and repetition are not adaptive in themselves. Uh, but you can have more than one reason for believing something that's untrue. Think about objectively untrue statements that get repeated, as we were talking about earlier, in a political context. The evidence shows that we believe them partially because of how often they're repeated, but there's also social cognition and

(52:16):
also identity-protective cognition. In other words, we tend to believe things that members of our political tribe and social in-groups say, and for social cohesion reasons, that is adaptive for us. We also believe things that validate our sense of personal identity. But I think it's pretty clear that these types of effects can work in a nasty, perverse tag-team format, boosting and complementing one another.

(52:42):
But even if we put aside these complementary effects, put aside, uh, social and identity-protective cognition, put those aside and just focus on the explanation for the illusory truth effect and repetition, there's a really interesting thing that comes out. And this is based on the idea of processing fluency, which is a concept

(53:03):
that is way more interesting than the name would lead
you to believe. So the dominant explanation for the illusory truth
effect in the psychology literature, which we're about to get into,
fits into this byproduct category that we were just
talking about. Based on all I read, it seems the
informed majority opinion of psychologists is that the illusion of

(53:24):
truth that we get from exposure and repetition is an
unfortunate byproduct of a generally useful cognitive heuristic. Now, a heuristic,
as we've talked about before, is a mental shortcut. It's
a fast and cheap trick that the brain uses to
arrive at a judgment or produce some kind of result
without using too much effort. And it's worth driving

(53:46):
home that our brains need fast and cheap tricks. Brains
are very energy hungry. Yeah, there's only
so much power to go around, so it
kind of has to hold everything together with a
bunch of tricks. Yeah, so it works something like this.
Let's go with it. Assume that on balance,
true statements get uttered more often than lies. As cynical

(54:10):
as we like to be, that's probably true, right? True
statements are generally more useful to people. Also, there's a
sort of convergence effect where there's only one way for
a true statement to be true, but there are lots
of different ways to say a lie about the subject
of that statement. So like, true statements on a subject
are going to be more consistent usually than lies about

(54:32):
the subject, because a lie about the subject could be anything. Well,
and also, lies in large part have to
be believable. Like, think about the various true statements and
false statements that might be uttered during the
course of a given day at work. Yes, somebody asks, hey,
where's the bathroom? You know, it's a new building, say.

(54:52):
Then they're probably going to say, oh, it's over there,
and they're probably going to tell you the truth.
It generally does not serve people well to lie about
the location of the bathroom, right? Because you're gonna find
out, and then you're gonna say, hey, why did you
tell me the bathroom is over there and not over there?
Are you insane? But then some of the false statements
you're liable to hear might be, hey,

(55:14):
I don't know, let's see, have you started on that
report yet? It's due on Friday. And they'll say, oh yeah,
I've got it, I've got it taken care of. I'll
get it to you on Friday. You know, there
are a lot of statements like that
that ultimately you can't really check in on, like you're
just gonna have to take their word for it. That
kind of lie, yeah, you'll never find out. You know,
yeah, exactly. Or, I can't come into work today because

(55:34):
I'm sick. Well, all right, you know, we're not
going to ask for a doctor's note. You might be lying,
you might not, but it's just kind of a gimme
in that situation. Yeah, that's another reason that we're more
likely to be exposed to true statements generally, or at
least that we're more likely to detect true statements generally,
because false statements are harder to verify, usually by design

(55:55):
of the person making them. So you're liable to find
yourself in an environment that's mostly built out of
true statements and believable lies. Right. So, on this assumption,
you know, you're in a hurry, and your brain
is not designed to consume infinite energy. It wants
to try to be efficient. You don't have time to
evaluate all claims rigorously. I mean, no matter how

(56:18):
skeptical you want to be, and we can confirm this, eventually
you are just not going to have time to look
into everything you believe. You're just gonna have to take
somebody's word for it. It's not practical to try to
live by verifying every single belief. Oh yeah, I mean,
you've just got to have something firm underneath
your feet in order to proceed. Oh yeah, you've got

(56:39):
a bedrock. But then, I mean, you take somebody's word
on where the
bathroom is, like, you're not gonna try to fact check them.
You know, well, I guess you will by trying to
go there. But other things like that, mundane things
people tell you throughout the day, you're just gonna have
to believe them. It just doesn't make any
sense to try to verify all of it, because you

(57:00):
don't have time. So therefore, an easy shortcut for assuming
that a statement is more likely true is: have I
heard this statement before? Statements that get uttered more often
are more likely to be from that class of true statements.
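To put rough numbers on that shortcut, here is a toy Bayesian sketch in Python. Every probability in it is a made-up assumption for illustration, not a figure from any study discussed in this episode.

    # Toy Bayesian sketch of the "have I heard this before?" shortcut.
    # Every probability here is an illustrative assumption, not measured data.
    p_true = 0.8               # assume most statements in your environment are true
    p_heard_given_true = 0.5   # true statements tend to get repeated
    p_heard_given_false = 0.2  # lies are more often one-offs

    # Total probability that a statement feels familiar (law of total probability)
    p_heard = p_true * p_heard_given_true + (1 - p_true) * p_heard_given_false

    # Bayes' rule: how likely is a statement to be true, given it feels familiar?
    p_true_given_heard = (p_true * p_heard_given_true) / p_heard

    print(f"P(true), no cue:        {p_true:.2f}")              # 0.80
    print(f"P(true | heard before): {p_true_given_heard:.2f}")  # about 0.91

Under those assumed numbers, familiarity bumps the odds of truth from 0.80 to about 0.91, which is why the shortcut usually pays off; the illusory truth effect is what happens when someone games the repetition side of that arithmetic.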
Okay, I can roll with that. Now, there's another type of
parallel thinking that says, you know, also,

(57:23):
it's actually more difficult to disbelieve something than it is
to believe it. And I don't know if this
is really confirmed or if this is just one theory
about how the information processing in the brain works. But
just as a quick tangent, there is a model of
thinking that says, okay: to believe a statement is true,
to hear a statement and say I believe it, is

(57:44):
just one step in the brain. To hear a statement
and reject it as false is a two-step procedure,
where first you have to hear it and believe it
to understand it, and then you have to go back
and revise what you just did and say, but it's
not true. Yeah. It's ultimately like a king sitting down
at a banquet table, right? Is the king to simply

(58:06):
eat every food item on the plate and
trust that he's not going to
be poisoned, or is he going to independently test each thing?
Have the food taster come up, transfer
this goblet of wine into the rhinoceros horn, et cetera, hold
the magic crystal over this plate of beans. And you know,
another thing that came to mind was some of our

(58:27):
discussions we've had in the past about consciousness and imagination
as a simulation engine, that we use our imagination to
mentally simulate possible outcomes so that we can best choose
how we're going to react to the world. And when
I'm presented with something that might be a lie or
some sort of untruth or a bit

(58:47):
of misinformation, I still can't help but imagine it, right?
I'm having to create a mental picture of it.
In a sense, you're kind of believing it for the moment. Yeah, yeah,
because I have to simulate it in my head. And in
cases of people who can form mental pictures, you have
to form those mental pictures. And, you know,
I imagine a lot of what shakes out

(59:09):
after has to do with an individual's particular worldview. But
I wonder if in some cases it's like a type
one error in cognition, you know, a false positive:
I'm imagining this as
a possible outcome, and then maybe I'm more inclined to
believe it just so that I can keep it from

(59:29):
harming me. Yeah, I think that's a very reasonable
way of imagining it. But so here's where we get
into the final part of our discussion today, which is
the idea of what I mentioned a minute ago: processing fluency.
So processing fluency just means how easy it is to
process incoming information. And you wouldn't believe the research on

(59:50):
how many of our decisions and mental outcomes seem to
be based at least in part on processing fluency. The
brain really likes things to be easy. It really
likes things to go smoothly, to not be
too difficult. So, to start off, based on existing research,
it definitely seems true that people have an easier time

(01:00:12):
processing statements and information they've heard before. In fact, Robert,
you probably know this from direct experience. Like, a familiar
statement, when used in the context of a sentence or
an argument, is processed quite smoothly, but a new, unfamiliar
statement in the same context often causes you to say, wait,
hold on, back up, I need to wrap my head

(01:00:33):
around this. Familiar is easy, unfamiliar is difficult. But how
would you test whether the ease of processing information was
actually affecting our judgment of the truth of a statement?
And I want to get into a couple of quick,
really interesting studies on this that were so simple and
so brilliant. So in 1999, Reber and Schwarz did a study in

(01:00:57):
Consciousness and Cognition called Effects of Perceptual Fluency on Judgments
of Truth, and they took true or false statements, kind
of like in the studies we've seen before, of the
variety Osorno is in Chile, or Greenland has roughly
fifty thousand inhabitants, and they presented those statements to people, and
the main independent variable was that they presented the statements

(01:01:17):
either against a white background, in a high contrast, easy
to read color, or in a low contrast, hard to
read color. And apparently that made all the difference in
the world. The idea is that the hard to read
one has low processing fluency, it's difficult, and the easy
to read one has high processing fluency. It's easy to process.

(01:01:39):
And they found that this made a big difference in
what people believed was true or false. Quote:
Moderately visible statements were judged as true at chance level,
whereas highly visible statements were judged as true significantly above
chance level. We conclude that perceptual fluency affects judgments of truth.
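To make "judged as true above chance level" concrete, here is a minimal sketch of the underlying statistics in Python, using an exact binomial tail. The counts are invented for illustration; they are not Reber and Schwarz's actual data.

    # Toy check of "judged true above chance": if people were purely guessing,
    # how surprising would this many "true" responses be? Counts are invented.
    from math import comb

    def p_at_least(k, n, p=0.5):
        """Probability of k or more 'true' judgments out of n under pure guessing."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    n = 32  # hypothetical number of statements per visibility condition
    print(f"high contrast, 23/{n} judged true: p = {p_at_least(23, n):.3f}")  # well above chance
    print(f"low contrast,  17/{n} judged true: p = {p_at_least(17, n):.3f}")  # about chance level

A small tail probability in the high-visibility condition and a large one in the low-visibility condition is the pattern the quoted conclusion describes.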

(01:02:00):
This is another one that makes sense from a marketing standpoint, right?
Just make your message very clear, very easily absorbed,
and people will begin to buy into it. Oh, absolutely.
And this has actually been studied in marketing and consumer
preference. Like, there is one study from Novemsky et al.,
published in two thousand seven in the Journal of Marketing
Research, that, in short, found that consumers more often

(01:02:22):
tend to choose brands that represent ease and fluency. Like, say,
if the information about a brand is easy to read,
consumers are more likely to choose that brand; that's the
one they want. So that makes me wonder why Coca-Cola
is written in cursive. It's just like, you would
want it very clear, bold letters. Well, didn't
they try to change the can at some point? God, I haven't
really looked at a can recently; maybe it's not in cursive anymore.

(01:02:46):
You know, actually, you might have two things in conflict, right?
If you've
got an old logo that people are familiar with, but
it's hard to read, the hard to read part
might be undercutting their preference for it, but the fact
that it's familiar might be boosting their preference for it.
If you try to change it to something that's easier
to read, the change might introduce more difficulty in processing

(01:03:10):
than the ease of reading would improve processing. Yeah, that
makes sense. All right, so I want to cite one
more study, a study by Christian Unkelbach in two
thousand seven from the Journal of Experimental Psychology: Learning, Memory,
and Cognition. And Unkelbach does an interesting thing in
the study where he's got a hypothesis he wants to test.
He writes, quote, I argue that experienced fluency is used

(01:03:34):
as a cue in judgments of truth according to the
cue's ecological validity, meaning, like, successfulness in the real world. Quote:
That is, the truth effect occurs because repetition leads to
more fluent processing of a statement, and people have learned
that the experience of processing fluency correlates positively with the

(01:03:55):
truth of a statement. So this is sort of what
we were talking about earlier. It's a heuristic: you know,
you're more likely to encounter true statements in the wild.
People learn this through experience, and then they use
the cue of processing fluency to be the
judge of whether something is familiar or not. And if

(01:04:15):
it's familiar, and they get that processing fluency bump, it's
easy to process, then they're more likely to believe it's
true, because that's what has worked for them in the past.
And if this is true, Unkelbach says, I bet
I could reverse it with a little bit of training,
and he does. He's got an experiment with a
training phase (he actually does three different experiments), and essentially

(01:04:37):
what he does is that he trains people in a
scenario where things that are easier to process, either because
of being easier to read or because of repetition and familiarity,
are, either way, more correlated with
being false. And when people get trained
in sessions like that, they lose the effect. So the

(01:04:58):
good takeaway there is that if he's correct, it would
probably also mean that your susceptibility to the illusory truth
effect is dependent on what kind of environment you've trained in,
and that you could potentially untrain yourself on it. But
that would be hard to do, because we all live
in this world all the time where most of the
time people are telling us true things. Right. And again,
the brain is still going to need all of these

(01:05:20):
shortcuts in order to function properly. Yeah, exactly. But you
could just be using the opposite shortcut. Like, if you
live in a world where people lie to you all
the time, Unkelbach's results here would suggest that you
would eventually adapt to this, and you would instead become
exactly the opposite: new claims you've never heard before would
seem more true to you, and repeated claims that you're
familiar with would seem like lies to you.
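As a loose sketch of that reversal logic, here is an invented toy simulation in Python. It is not Unkelbach's actual procedure; it just shows how an agent that tallies its experience would learn one cue in a mostly honest world and the opposite cue in a mostly deceptive one.

    # Toy model of learning, and reversing, the fluency-truth cue.
    # Invented simulation for illustration, not Unkelbach's experimental design.
    import random

    def learned_cue(p_fluent_if_true, p_fluent_if_false, trials=100_000, p_true=0.5):
        """Estimate P(true | fluent) from repeated experience in one environment."""
        fluent, fluent_and_true = 0, 0
        for _ in range(trials):
            is_true = random.random() < p_true
            p_fluent = p_fluent_if_true if is_true else p_fluent_if_false
            if random.random() < p_fluent:
                fluent += 1
                fluent_and_true += is_true
        return fluent_and_true / fluent

    # Normal world: familiar, easy-to-process statements are usually the true ones.
    print("normal training:   P(true | fluent) =", round(learned_cue(0.8, 0.3), 2))  # ~0.73
    # Reversed training: fluency is paired with falsehood instead.
    print("reversed training: P(true | fluent) =", round(learned_cue(0.3, 0.8), 2))  # ~0.27

After reversed training, the same statistical learning flips what the cue means, which matches the direction of the effect Unkelbach reported.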

(01:05:42):
Okay, so there's hope for us after all. Yeah, I mean, we
can't expect to live in a world like that, and
we don't want to live in a world
like that. Like, you don't want to train your brain
to live in a world where everything is assumed to
be a lie. And man, surely somebody has considered
exploring this in fiction. Yeah, it would

(01:06:02):
be a delicate affair to really put it together
and make it work on paper. But it's a world
that I don't want to live in, but I kind
of want to visit fictionally. Oh yeah, I'd go there
with you. That's a good one to come
back to. But just as a quick note before we
close out today, I think this idea of processing fluency
is a really interesting one. There's tons of research on it. Like,

(01:06:22):
there is a study I found by Sascha Topolinski
in Cognition and Emotion about how processing fluency affects how
funny we find jokes. Apparently, if a joke is
easier to process, if we've got high processing fluency on the joke,
we think it's funnier. I guess it just, like, feels
good to get it with less effort or something.

(01:06:44):
So there were multiple experiments, but basically, here, let
me give you a quick preview. I'm gonna
say a word, Robert: peanuts. Do you like that word
when you think about it? Peanuts? Peanuts. It's pretty good.
It's not the funniest, it's no cheese,
but I like it. Okay, I just said that word.
So one example of this type of study would be:
if you prime somebody with significant nouns from the punch

(01:07:07):
line of a joke fifteen minutes or even up to
just one minute before you tell them the joke, people
find the joke more hilarious. However, if you tell them
a significant noun from the punch line immediately before the joke,
they find the joke less funny, and the author thinks
this is probably
because, if you tell them right before the joke, it

(01:07:28):
sort of spoils the punch line. But knock, knock. Who's there? Cash. Cash who? No, thanks,
I prefer peanuts. Ah, see, it works. It's not even a good gag,
not even, but you already established peanuts, so it helped, right?
I tried to let a minute or so elapse there.
I don't know if it worked. Well, it's also complicated,
because we did bring up peanuts and peanut butter earlier

(01:07:50):
in the episode. I didn't even think about that. But
actually, I am not a student of stand up
comedy by any stretch of the imagination, but I've watched
enough stand up to see that common
structural tool that they use, where you have the callback
to a previous joke, and they'll often do it
right at the end, and then it's good night, everybody.

(01:08:10):
That's the high note. And it's not even necessarily
like a callback to the funniest moment
in the bit or the funniest bit in
the stand up performance, but just the fact that they've
brought your mind back to it. Yeah, it generates laughter,
and it's the moment to end the show on. Yeah.
The theory is that it's very satisfying to have

(01:08:32):
a joke where you've been primed for the
punch line already, because it's so much easier to get
the punch line quickly and have that experience of familiarity in
the aha moment. Because when you say a word
and then you say the word again later, the second
time you hear the word, you've been primed, like, you know,
it's more fluid. So yeah, I think that may very

(01:08:53):
well be going on with callbacks. Another part of the
same study was that, like the studies we've been seeing before,
jokes presented in an easy to read font were rated
as funnier than jokes presented in a really hard to
read font. That's kind of not surprising, but processing fluency
plays into all this stuff. Like, there is research
about how opinions that are repeated more often, even just

(01:09:16):
by a single person in a group, come to seem
more prevalent in the group. So you've got ten people
standing around, and you've just got Jeff over here, and
Jeff keeps saying the same opinion over and over again.
Even if you're aware it's just Jeff saying it, in
the end, if he does that, you will think that
that opinion is more prevalent in the entire group, that
more people hold it. Well, that would make sense. You

(01:09:38):
have one person in a group who, say, continually trashes
on the movie Aliens. Oh no, why would that happen?
I don't know, but let's say it happens. You know,
I could see where it could reach the point where
you're kind of like, I don't really know how I
feel about Aliens now, because I sure do hear
Jeff talking trash about it all the time. Or
you could walk away from it being like, man, I
don't understand all these people who hate Aliens, even though

(01:09:59):
it's just one person. Yeah, that seems to be
something that would go on. Processing fluency also appears to
have something to do with aesthetic pleasure. There's been a
lot of research and theory about this, that a
major component of what feels aesthetically pleasing to us is
based on what's easy to process. Another part is that
processing fluency apparently affects how credible a face looks.

(01:10:23):
So, are you going to believe somebody? Well, it
turns out if their face is easier to process, especially
because you've seen it a bunch of times before, you're
more likely to believe them. Even if they're not famous
and they're not somebody you know, not somebody
you've had experience with whose credibility you can
judge, just random faces shown to you in different
sessions of an experiment: if you've seen them before, they're

(01:10:45):
more credible. Of course, that reminds me of various experiments
over the years involving the believability of people with beards.
Are people with facial hair or beards harder to process? I
have not looked into it recently, so I
don't know if there are any recent studies that
crack this nut. But there have been
studies in the past that make

(01:11:05):
the argument that, yes, with an individual with a beard, you're
going to have a little more distrust towards them. Well,
obviously nobody should trust me. Well, no, we trust you
because we know you, Joe. Yeah, do you really? Do
you ever really know someone? Well, I'll tell you one
thing I know, and that's peanuts. You got me,
you got me there, you made me laugh and my
joke didn't make you. Okay, okay, so we gotta wrap

(01:11:29):
up there. We've gone long here, but we'll be
back in the next episode to explore more recent findings
and some of the ways that the illusory truth effect
really does matter in our political and social world.
But the main takeaways I would say today are that
the illusory truth effect is real: exposure and repetition really
do change our beliefs. The illusory truth effect is small,

(01:11:52):
meaning it doesn't automatically overwhelm other criteria in our decision
making and judgment. In fact, in many cases it appears
that whether or not a statement is actually true is
more important to our judgment than whether or not it's
repeated or made easier to read, or any of these
other processing fluency boosts. But on average, over lots of repetitions,
it's easy to see how this could have a big effect,

(01:12:14):
especially when you bring it back to propaganda purposes, on
things we believe as a society, things that shift voting
patterns in small but significant ways, and stuff like that. Yeah,
that's the key: it's not occurring within a vacuum.
It's affecting and being affected by
all these other mental processes and factors that are

(01:12:34):
affecting our decision making and worldview. Totally. But we will
get more into that in our next episode. In the meantime,
be sure to check out all the episodes of Stuff
to Blow Your Mind at Stuff to Blow your Mind
dot com. That is the mothership. That is where you
will find everything, as well as links out to our
various social media accounts. If you want to support the show,
we always urge you to leave us a positive review,

(01:12:58):
leave us some stars or whatever the rating system is.
Just rate and review us wherever possible. Big thanks as
always to our excellent audio producers Alex Williams and Tari Harrison.
If you would like to get in touch with us
to let us know your feedback on this episode or
any other, or to let us know a topic you'd
like us to cover in a future episode, you can
always email us at blow the Mind at how stuff

(01:13:19):
works dot com. For more on this and thousands of
other topics, visit how stuff works dot com.
