
July 10, 2018 73 mins

If a lie is repeated often enough, are we more likely to believe it? Sadly, the answer is yes. Psychologists call it the illusory truth effect and it influences both our daily lives and the larger movements of politics and culture. Join Robert Lamb and Joe McCormick for a two-part discussion of untruths, the human mind and just what you can do to fight the big lies at work in your world. 




Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome to Stuff to Blow Your Mind from HowStuffWorks.com. Hey, welcome to Stuff to
Blow Your Mind. My name is Robert Lamb, and I'm
Joe McCormick. Today we're gonna be talking about one
of our favorite subjects: our tendency to believe things that

(00:22):
aren't true. Uh. Now, Robert, I wonder is there a
false factoid or claim that you just always find yourself
recalling as true even though you've checked it before and
discovered it to be false in the past. Yeah, this
is an interesting question because I feel like there are
things that come up in research all the time, certainly

(00:43):
key things over the years. You know, as we
research and write about different topics, there are things where I have to correct myself,
you know, where I think I knew something and then
I'm like, oh, well, now that I actually do the research,
that's actually not a fact.
And then, you know, the same goes for false beliefs,
beliefs that creep in, the sort of Mandela
effect type of scenario. For instance, there was a time

(01:04):
when I thought Gene Wilder was dead prior to his
actual death, and then he actually did die, so now he
actually is dead. So I thought he was dead
before he was dead, exactly, and I think it
was just a combination of him not being as active anymore,
and I wasn't really keeping up with the Gene Wilder
filmography and like current events related to Gene Wilder, and

(01:24):
something, maybe I picked up on some news piece at
some point, and somehow he got clicked over to the dead category.
And then when I found out he was alive, it
was really like he came back to life.
And I had the same thing happen literally just the
other day with the stand-up comedian Larry Miller. I don't
think I remember who that is. Oh, he had
a kind of, you know, dry observational comedy. He

(01:47):
had a... I think he still acts, but he would
show up on, say, Night Court, I think. Oh wait,
am I thinking of seeing him in some Christopher Guest movies? He may
have been in those, yeah. But for some reason years ago, I
got it into my head that he had passed away. And
so occasionally I would think of Larry Miller and be like, oh yeah,
I remember Larry Miller, too bad he passed. And then

(02:07):
I actually looked him up the other day and it
turns out he has not passed away. He's still very
much alive and active, and I was just living in
this fantasy world of dead Larry Miller. You know, I
have false beliefs that recur with much more significance. Like,
I keep remembering... yeah, maybe it's just because I
was told this all the time when I was a kid,

(02:27):
that vitamin C supplements will ward off colds. That is
not experimentally proven; that is, like, not a finding
of science. And yet, if I haven't
checked in a while, it just seeps right back in, like, yes,
that is true, vitamin C will keep colds away. Well,
it's easy to fall into the trap. I do this
all the time with various vitamins and supplements

(02:50):
where I'm like, I don't know if it works, probably
doesn't work, but I'm gonna go ahead and take it just
in case, because it's vitamin C. You know, what's
the harm there? It's kind of like
believing in God just in case he exists. Believing in vitamins, yeah,
but then you end up with like a weird sort
of vitamin tentacle growing out of your neck, and you
didn't see that coming. That's fake news there, Joe. Vitamin

(03:11):
C will not cause a tentacle to grow out of
your neck. But now you've heard it, so it's true.
You know, I feel like there are things that have
popped up where I'll think, well, I've always heard X,
but I've never actually looked it up. And
then that's where the problem seeps in, you know, where
I think I know something, but I'm not sure,

(03:31):
but I don't care enough to actually investigate. There
is one possible example that comes up, and that
is, of course, the idea that George
Washington Carver invented peanut butter. He didn't, though I know he had
something to do with peanuts. He did, yeah. He
was a famous inventor, an important African

(03:51):
American inventor. And I just didn't know a lot
about him, and I had always heard the peanut butter thing,
but I didn't actually research it until I helped my son
with a class project about him earlier this year, and
then I was able to definitely, you know, check that
one off the mental list, like, okay, this
is false. He did not invent peanut butter.

(04:12):
He didn't, but he did do stuff with peanuts. Did
stuff with peanuts, but not peanut butter. Okay. You know,
a huge place where you can see false beliefs persisting
is in people's beliefs about sort of like political
facts or sociological data. A very common one is

(04:32):
people's beliefs about crime. I think it's because crime
is one of those sensational types of subjects; it
makes people think about violence, images of blood they see
on the news and stuff like that. In a poll
conducted by Pew in the fall of twenty sixteen, get this, fifty
seven percent, so a majority, of people who had voted
or planned to vote in twenty sixteen said that crime had

(04:56):
gotten worse in the United States since two thousand eight.
By every objective measure, exactly the opposite is true. That's
just not true. FBI statistics, based off of nationwide
police reports, found that violent crime and property crime
in the United States fell nineteen percent and twenty three percent, respectively,
between two thousand eight and twenty fifteen. And so you think, okay, well,

(05:20):
maybe that's just police reports, maybe fewer people
are reporting crimes to the police, right? But also, the
U.S. Department of Justice's Bureau of Justice Statistics does
direct annual surveys of more than ninety thousand households to
capture rates of crime that might not be reported
to police, and, quote, the BJS data
show that violent crime and property crime rates fell twenty

(05:42):
six percent and twenty two percent, respectively, between two thousand eight and twenty fifteen.
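For concreteness, here is a minimal sketch of the percent-change arithmetic behind figures like these; the two rates below are invented placeholders, not the actual FBI or BJS numbers:

```python
# Minimal sketch of the percent-change arithmetic behind a figure like
# "violent crime fell nineteen percent." The two rates are invented
# placeholders, not the actual FBI or BJS numbers.
rate_2008 = 458.6  # hypothetical offenses per 100,000 people in 2008
rate_2015 = 371.5  # hypothetical offenses per 100,000 people in 2015

pct_change = (rate_2015 - rate_2008) / rate_2008 * 100
print(f"change from 2008 to 2015: {pct_change:.0f}%")  # prints about -19%
```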
And so a majority of people are believing something that
by every measure we know, is not true. Crime has
gone down, and yet a majority of people believe it
has gone up. And it's not hard to see why
that might be true when you consider like the political

(06:02):
messaging of certain politicians. You could also, of
course, think about just people's negativity bias, right, the tendency
to believe things are worse than they are in the
broad sense, or mean world syndrome, looking at dangerous stuff
happening on the news and thus having an overrepresentation of
it in your mind. But I think we would be
wrong to ignore the effects of hearing specific politicians. Well,

(06:27):
for example, in twenty sixteen specifically, it was Donald
Trump talking a lot about how crime is through the roof, right.
And to your point, we can look to statistics on this.
This is not something that is just,
you know, in the ether; we have hard data. It's
not a matter of opinion. It's just like every measure
we have says that's not correct. But what about other beliefs,

(06:47):
I mean, that's certainly not in isolation. There are
lots of cases where there are widespread beliefs in things
that are just simply factually not true. Yeah. I'll run
through a few here that range in topic. For instance,
here's a nice science-related one to kick off with.
In a two thousand fifteen Pew survey, only thirty percent

(07:08):
of Americans knew that water boils at a lower temperature
at higher altitudes. Thirty nine percent said it would boil
at the same temperature in Denver and in L.A., again,
Denver being at a far higher altitude, and others had it reversed.
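As an aside, the underlying physics can be sketched with the Clausius-Clapeyron relation; this is a rough back-of-the-envelope estimate using textbook constants, not anything taken from the survey:

```python
import math

# Rough estimate of water's boiling point at altitude: find where the
# vapor pressure (via Clausius-Clapeyron) matches the ambient pressure
# (via a standard exponential atmosphere). Constants are textbook values.
def boiling_point_c(altitude_m: float) -> float:
    p0, t0 = 101.325, 373.15                   # sea-level pressure (kPa) and boiling point (K)
    p = p0 * math.exp(-altitude_m / 8434.0)    # approximate ambient pressure at altitude
    h_vap, r = 40660.0, 8.314                  # heat of vaporization (J/mol), gas constant
    return 1.0 / (1.0 / t0 - (r / h_vap) * math.log(p / p0)) - 273.15

print(round(boiling_point_c(0), 1))     # 100.0 degrees C at sea level
print(round(boiling_point_c(1609), 1))  # roughly 95 degrees C in Denver
```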
So the majority, something like two thirds of people,
were just flat wrong. Yes, and yeah. And to put

(07:29):
that in perspective with another science fact, most Americans in
this two thousand fifteen survey correctly identified the Earth's
inner layer, the core, as its hottest part. And
nearly as many knew that uranium was needed
to make nuclear energy and nuclear weapons. Well, should we
be comforted by the fact that that's what people know?

(07:51):
I don't know; they don't know much about water, but
they know about nuclear weapons. Well, I mean, on one hand,
nuclear weapons is, and was, more in the news.
And then on the other hand, the inside
of the earth is more engaging and also completely unpoliticized. Well,
water is not political; the boiling point of water is not politicized.

(08:14):
But it's also not very sexy. I guess it's
one of those things where, unless you're actively
moving from low to high altitudes, or, you know, living
part of your time in Denver and part of your
time in L.A., it's very possible
to live your entire life without really having any real
world experience with the difference. Though, I do feel

(08:35):
like if you read enough baking manuals, it comes up. Yeah,
but maybe you just don't remember which way it goes.
I guess. I'd say that, you know, one of the
ones you've got here that has come up many
times in my life, it's come up enough that I
know the right answer now, is the misconception that you
can see the Great Wall of China from space. Oh yeah,
this is one that I have to admit. I think

(08:56):
I used to adhere to, again without really appraising
it, because it just had that kind of
truthiness to it, right, and you want it to
be real, the idea that this epic
structure created, you know, long ago is visible from space.
But it's actually been disproven multiple times. It's

(09:19):
only visible from low orbit under very ideal conditions, and
it's not visible from the moon at all, because that's
another version of it, that the Great
Wall of China can be seen from the moon. Yeah,
I guess there are nuances to the word visible, what does
that mean, but in the normal sense that you would
mean it, it's not visible from space. Now, a two thousand

(09:39):
sixteen YouGov poll, this is a UK group,
they looked at how belief in Pizzagate, that thing,
how that shakes out across different voter groups. Now, that
was this vast conspiracy theory people had about how there
was a pizza restaurant in Washington, D.C. that was
like running child slavery rings, that was linked to the

(10:01):
Democratic Party. Yeah, it had to do with an idea
that Clinton campaign emails supposedly talked about human trafficking and pedophilia.
And according to this particular poll, seventeen percent of polled
Clinton voters believed that this was the case, this
was a reality, and forty percent of Trump voters did.

(10:22):
And then there's another classic they looked at to
put this in perspective, the idea that President Barack Obama
was born in Kenya. Kind of alarmingly, across both
groups of voters, both the Clinton voters and the Trump voters,
they found thirty six percent believed it, despite the fact
that that, too, has been debunked time and time again. Yeah,
I mean, it's crazy that these types of beliefs can

(10:46):
catch on so well, especially since we understand very
well the way that political ideology and tribal thinking affect
the way we form opinions. Obviously, our opinions are
deeply informed by what people we view as our in-group
believe, and so we want to be in line
with the in-group and stuff like that. But

(11:07):
you also can't really ignore the fact that these are
things that if you pay attention to certain sources you're
going to be hearing over and over and over again,
and what effect that might have. Because it's, you know,
widely accepted folk wisdom that if you repeat a lie enough,
people start to believe that it's the truth. Right, That's
one of the things. I mean, I don't know, you've

(11:28):
said it enough times that I'm already convinced. Exactly. I
mean, almost going along with this, there's a quote
that often gets sourced to the Nazi propaganda minister Joseph Goebbels.
Some versions of the quote say something like, if you
repeat a lie often enough, people will believe it, and
you will even come to believe it yourself. I couldn't
find any evidence that Goebbels actually said that. It

(11:50):
seems to be a misattribution, but it's sort of a
paraphrase of similar ideas that are, you know, within that
frame of thinking. Like, Adolf Hitler himself wrote in
Mein Kampf, quote: The most brilliant propagandist technique will yield
no success unless one fundamental principle is borne in mind
constantly and with unflagging attention. It must confine itself to

(12:13):
a few points and repeat them over and over. Here,
as so often in the world, persistence is the first
and most important requirement for success. So just the fact
that Hitler said it obviously shouldn't make us think, well,
you know, he's right. He was Hitler, though. If
Hitler was good at anything, it was getting lots of
people to believe lies. Certainly. So, I think because this

(12:34):
is such an important issue, and because widespread misconceptions are
so common, and because they can, in fact, especially in
some political circumstances, be so destructive, and because the repetition
of lies and false statements at every scale
of existence, you know, in mass media and in
our personal private lives, is so common, I think it's

(12:55):
worth looking at the actual empirical case. Is this true,
the idea that repeating statements over and over
actually changes what we believe? It's one of those things
that, you know, sounds so commonsensical you just
assume it's true. But according to the logic we're using now,
those are exactly the kinds of statements that maybe we

(13:15):
should be careful about. Yeah, and I think this is
an important topic for everybody. I don't care
who you voted for in any previous elections, or which
political party in your given system you adhere to.
I think if you're listening to this show especially, you
want to think for yourself. You want to reduce the
amount of manipulation that's going on with your own

(13:38):
view of reality, and that's what we're going
to discuss here today. We're gonna discuss the degree to
which false information can manipulate our view of reality and, ultimately,
what are some of the things we can do
to hold onto our individuality in all of
this. Exactly. So this is going to be the first

(13:59):
of a two part episode where we explore the liar's
best trick, the question of repetition and exposure in forming
our beliefs and changing our attitudes. So that's going to
be the jumping-off point for today's episode. Does exposure
and repetition, is hearing a claim and hearing it repeated,
actually have the power to change our beliefs, or is

(14:21):
that just unverified folk wisdom? Yeah. And of course
it goes well beyond politics. It also gets
into marketing. You know, we've touched on the manipulative
nature of marketing and advertisement on the show before, and
it's one of the things you always come back to.
It's always about messaging, right? Like, what is the message
of the product, what's the message of the ad campaign,

(14:42):
And how they just continue to hammer that home. Why
do brands have slogans? Yeah? Why don't they just tell
you a positive message about the brand that's different every time?
Why do they tell you the same message in the
same words in every commercial? Yeah? Why did those fabulous
horror trailers from the nineteen seventies say the name of

(15:02):
the film eighteen times? Don't go in the basement! Don't
go in the basement! No one under seventeen will be admitted! Yeah,
it's all kind of part of the same situation.
All right. Well, we're going to take a quick break,
and when we get back, we will dive into the
research and the history of psychology about repetition and exposure.
Alright, we're back. So the first question we're going

(15:26):
to be looking at today is whether anyone has actually
studied this question, this question of whether exposing people to
a claim and then repeating the claim makes them believe it,
whether anybody has studied that in a controlled scientific context. And
the answer is a resounding yes. There are, I think,
dozens of studies on this subject in various forms. Probably

(15:47):
the flagship study on this, the first big one that
everybody cites, that really got people into the subject,
that got the ball rolling on it, was from nineteen
seventy seven, and it was by Lynn Hasher, David Goldstein, and
Thomas Toppino, and it was called Frequency and the Conference
of Referential Validity in the Journal of Verbal Learning and

(16:08):
Verbal Behavior. And that was, as I said, in nineteen
seventy seven. So the authors started out in the study
by talking about how most studies of memory that take
place in the lab involve useless or meaningless information units.
So researchers, for example, might try to see how well
subjects remember a phrase like, and I just made this up:
The purple donkey was made of soft-spoken muscular elves. Like,

(16:31):
can you remember that word for word? The purple donkey
was made by muscular elves. Now you're close, but it
was made of. Yeah, it makes a lot more sense to
say made by. Whoa, but see, I've already messed
up the origin story of the purple donkey. Statements like
this have no importance in the real world, and part

(16:51):
of what they were talking about is that we're testing
for memory on things that don't have any validity to reality.
So the authors write that they're curious about what kind
of processing subjects do with information units that might have
validity in the real world. For example, factual statements like, quote,
the total population of Greenland is about fifty thousand, which

(17:14):
at the time of the study it was, I checked.
Though that seems like a lot more people than should be
in Greenland, right? I was surprised by that. Well, yeah,
I would agree, based on, albeit, a limited amount
of information that I've read and viewed about Greenland.
You know, typically in my experience, Greenland shows up in
nature documentaries, and of course you're going to see rather

(17:36):
barren locations in those films. Well, Greenlanders out there
in the audience. Let us know, if you're listening, what's
life like up there in Greenland? I'm interested now. But
anyway, back to it. So, yeah, the population of Greenland at the time was
about fifty thousand. And so statements like this both refer to
something that could be true or false in the real world,

(17:56):
and they're also things that people are probably uncertain about.
Like, do you know what the actual population of Greenland
is? I didn't know before I looked it up. And
so we know that the statement is either true or false,
but we aren't sure whether it's true or false. And
of course, to go back to a previous episode, you're
kind of anchoring my expectations by throwing out without any

(18:17):
population data in my head about Greenland, like that's suddenly
all I have to go on. Oh yeah, that's interesting. Also,
it's so like it could be three thousand or it
could be like a million, and either you're sort of
moving your guests range toward fifty. So the thing they
point out is even though most people don't know what
the population of Greenland is, we're often willing, into some

(18:38):
extent able to make guesses as to whether statements like
this are true. So where does this semantic knowledge come from?
When we feel like we have knowledge to offer a
guess about what the population of Greenland is even when
we don't really know what is that what allows us
to judge these questions? And the authors note that frequency

(19:00):
is a really powerful variable in all kinds of judgments
we make about the world, so they hypothesize that quote
frequency might also serve as the major access route that
plausible statements have into our pool of general knowledge. So
the idea is that we build our knowledge base based
on how frequently we are exposed to ideas. You hear

(19:23):
an idea a lot, and that gets reinforced in the
knowledge base. You've never heard an idea before, or you
don't hear it a lot, it doesn't get reinforced and
it doesn't exist in the knowledge base. So here's the
experimental part. Researchers came up with a list of a
hundred and forty true statements and false statements, crafted so
they all sound plausible. Yeah, they could be true. You know,
the average person would be unsure whether or not they're true,

(19:47):
and the statements were on all kinds of subjects like geography,
arts and literature, history, sports, current events science. A few
examples of true statements included things like Cairo, Egypt has
a larger population than Chicago, Illinois, and French horn players
get cash bonuses to stay in the U. S. Army. Well,

(20:08):
i'd see it. I should have joined the army after
I'll see I was a french horn player really in
high school. Yeah, I didn't know that. What's it like
playing the French horn? It's just you know, there's a
lot of spit and a lot of shoving your hand
up horns. That's it. Otherwise it's like playing a trumpet.
Now see I actually played trumpet, and it's a lot
less fun than what you're describing. Yeah, I mean, I

(20:29):
don't know. There is an elegance to the way you
hold it and you again, you have your hand in
the inside the horn. I don't know. Being a trumpet
player to me always felt like being a person who's
complaining at full volume. Yeah, yeah, there is. There's more
of an outward stance with the trumpet, right, you are
blasting outward, But in the French horn, you it's it's

(20:50):
more like you're playing music into yourself. That's quite beautiful,
more beautiful than any music I've ever played on the
French horn. Uh so we got to get back to
the study. Okay, So that's that's supposedly true. French horn
players at the time got cash bonuses to stay in
the U. S. Army. Examples of false statements where things
like the People's Republic of China was founded in nineteen

(21:11):
forty seven, when it was actually nineteen forty nine, or, the
capybara is the largest of the marsupials. That's not true.
The largest marsupial is the red kangaroo. The largest known
in the fossil record is this thing called the extinct
Diprotodon. Now, the capybara is a rodent, right? Not
only is the capybara not a marsupial, I didn't even

(21:31):
look it up. I only know that because I go
to a lot of zoos these days. But it is
a mammal, it's a rodent. It is a mammal,
and it is a rodent, not a marsupial. Well, there
we go. So it's wrong in multiple ways. So you've
got this big list they came up with of true
and false statements, and all of them should sound plausible
to the average person, but most people are not going
to be likely to know for sure whether they're true

(21:52):
unless they just happen to have some special random knowledge
or expertise. And so researchers held three sessions with participants,
each separated by two weeks, and on each of the sessions,
the participants were played back a tape of a selection
of sixty recorded statements from that list, and the subjects
were asked to judge how confident they were that the

(22:14):
statements were true. And this was on a scale of
one to seven, with like four being uncertain, five being
possibly true, six being probably true, seven being definitely true.
And in each session, some of the statements were true,
some were false. But here's where the real magic happened.
At the second session and the third session, each time,
subjects got a mix of new true and false statements

(22:38):
that they've never seen before, plus true and false statements
that they had already seen in the previous sessions. So
while most of the claims they saw were new, a
minority got repeated each time. And what the researchers found
was that whether a statement was true or false, the
more times the students saw it, the more they believed it.

(23:00):
So again, this would be the principle in action. The
more they're hearing this false fact, the more
they're coming to believe
that it is true. Yeah, even in this constrained, kind
of weird experimental context where they're aware that some of
these facts are going to be false, it's not like
they're being told this persuasively by a person trying to

(23:20):
convince them. They're just reading this from a list of
statements that are known to be either true or false.
I mean, there's no persuasive aspect to this
at all, right? Right, these are not politically charged or
really charged by worldview at all. They're just plain
neutral statements that really have very little interest to most people,

(23:41):
probably. But what happened was, whether the statement was
true or false, people believed it more if they saw
it more times. So I've got a little chart in
here of what happened with the false statements. You can
have a look, Robert. As you can see, the
new false statements, the false statements people saw the very
first time, hovered around, you know, four or four

(24:03):
point one across all three sessions. That would correspond to
people saying they're uncertain. I don't know. I don't know
whether French horn players get a cash bonus for staying
in the army and playing music into the self and sticking
the hand up the horn. But in the second session,
the repeated false statements jumped up from about four
or four point one to about four point five, and

(24:23):
then in the third session up again to about four
point seven. And we only saw what happened with two
repeat sessions; who knows what might have happened
if you had continued adding more sessions. So just seeing
a statement more than once appeared to make it more believable,
even though it wasn't true.
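To make those numbers easier to see at a glance, here is a tiny sketch; the values are approximate means read off the chart being described, not the raw nineteen seventy seven data:

```python
# Approximate mean truth ratings (on the 1-to-7 scale) as read off the
# chart discussed above; eyeballed values, not the raw 1977 data.
mean_rating = {
    "new_false":      {1: 4.1, 2: 4.1, 3: 4.1},  # false statements seen for the first time
    "repeated_false": {1: 4.1, 2: 4.5, 3: 4.7},  # the same false statements, shown again
}

for session in (1, 2, 3):
    boost = mean_rating["repeated_false"][session] - mean_rating["new_false"][session]
    print(f"session {session}: repetition boost of about {boost:+.1f} points")
```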

(24:44):
And the pattern was roughly the same for true statements,
which isn't all that surprising, since the experiment was based on,
you know, statements that people didn't know whether they were
true or false to begin with. So the authors wrote in conclusion, quote, the
present research has demonstrated that the repetition of a plausible
statement increases a person's belief in the referential validity, or truth,
of that statement. I don't know why they had to
say referential validity; they could have just said truth. That's

(25:06):
some science writing for you. Anyway, belief in the truth
of that statement. And indeed, the present experiment appears to
lend empirical support to the idea that, quote, if people
are told something often enough, they'll believe it. So, yeah,
this is the first real study to find
this. And a few other things the authors thought were
worth considering: the fact that this effect was displayed

(25:29):
in statements from a big, broad pool of different types
of subject matter suggests this is not extremely context dependent, right?
It's not just going to be political beliefs that are
subject to this. It seems to be all different kinds
of statements in all different kinds of domains. Another thing
they noted was that the effect was present for true
statements and false statements. Either way, if students saw the

(25:51):
claims more often, they believed them with greater confidence. But
another takeaway is that the effect is not huge for
false statements. Three exposures was roughly enough to get you
from I'm uncertain to it's possibly true. But as we
mentioned before, this is just two or three
sessions, right? Right. Who knows what would have

(26:11):
happened if maybe you had done this more times in
a row, or if there had been other factors affecting
whether people were likely to believe these things to begin with,
say if they had valences to the person's political identity
or something like that. Yeah, you know, as we were researching
this and discussing it, I couldn't help but think
of notable examples of false stories about, generally,

(26:37):
celebrities from the past. And I'm not going to
mention any of them specifically. Why not? Well, because, you know,
they all tend to be a bit crude. There
are several of them about, like, tearing down various sort
of, you know, pretty boy rockers or actors from the past.
The interesting thing about them
is these are generally pre-internet stories that

(26:59):
had circulated by word of mouth, or, I think in
one case, there was talk of a whole bunch
of faxes going out in Hollywood where someone
basically just wanted to take somebody down because
they didn't like them. I remember, I think like ninth
grade, starting to hear this bizarre story about Richard
Gere. Yeah, that's the main one I'm thinking of.
And I think it basically comes down to, Richard Gere

(27:21):
is a handsome, successful guy, and for a
lot of people you want to, like, really, you know,
knock him down a notch. Yeah, and
so when you encounter a bit of slander or libel
like that, or just a ridiculous story, you're gonna
be more inclined to believe it if you kind of

(27:42):
want it to be true, right? Or
you're like, yeah, screw that guy, I'm
gonna go ahead and believe this, or even if I
don't believe it, I'm gonna pass it on. But either way,
whether or not you're predisposed to believe it's true, it
looks like this initial study at least provides evidence that
you would be more disposed to believe it's true in
either case. Like, whatever your starting point is, it's

(28:03):
gonna nudge you up. If nothing else, it becomes
word association. Like, if you're not really a fan of, say,
Richard Gere's work, and you can't name your
favorite Richard Gere film off the top of your head,
that might be the primary keyword that pops up when
you hear his name. Yeah, it could be. Yeah.
So this bizarre effect that we're talking about, where

(28:26):
hearing a fact repeated, even if you've got no good
reason to believe it's true, just hearing it repeated causes
you to be more likely to believe it. This came
to be known first as the truth effect, and then
later on probably a better title was the illusory truth effect.
I think we should use the second one because otherwise
that makes it sound true. Yeah, it just makes it

(28:47):
sound like, yeah, if you repeat something, then
it is true, there is no question anymore. So,
the basic version of the illusory truth effect is, quote:
people are more likely to judge repeated statements as true
compared to new statements. Another way of putting it
is that, all other things being equal, you're more likely
to believe a claim if you've heard it before than
one you haven't heard before. And the more times you

(29:09):
hear the claim, the more likely you are to believe it.
But so far we've just talked about one study, right,
this one nineteen seventy seven study. We can call it
the Star Wars study if you want. Okay, the
Star Wars study. It's a fairly small sample, just
one study. If you want to be skeptical and rigorous,
especially because this backs up folk wisdom, which is
always something you should be careful about, we should see

(29:31):
if the effect has been replicated by other researchers, and boy,
howdy it has. That's right. This next one comes to
us from the nineteen seventy nine Journal of Experimental Psychology: Human
Learning and Memory, the work of Frederick T. Bacon. Yeah,
this was called Credibility of Repeated Statements: Memory for Trivia.
So Bacon was trying to replicate this effect.

(29:54):
He performed additional experiments to test the previous team's conclusions
and add some new ones. So in his first experiment, you've
got ninety eight undergrads, and they had two sessions in
which they were asked to rate sentences as true or false,
with three weeks between the two sessions, and Bacon found
that the repetition illusory truth effect was modulated by whether

(30:15):
the subjects consciously believed that a sentence had been repeated.
That is, if they remembered that they had seen the
sentence last time, they were more inclined to believe it.
If they believed they were seeing a sentence for the
first time, they were less likely to believe it. And
this was true regardless of the statements themselves. I can't
help but think of our modern version of this with

(30:37):
the Facebook feed, right? Because inevitably, if you're a Facebook user,
or perhaps if you're a Twitter user or on some other
social media, you're scrolling down, right, and there are
a lot of sentences coming at you. Some you just
kind of read in passing, some maybe you don't read
at all. But are you actually stopping to really think
about what a particular headline or, you know, paragraph

(30:58):
is saying, or is it just kind of scrolling in the
background of your mind? Yeah. And the results of this
one experiment here would seem to indicate, if it has validity,
that the ones you stop and pay
attention to and make a memory about are the ones
you're more likely to believe later on. But then also
in another experiment, he had a group of sixty four
undergrads and he replicated the illusory truth effect and found

(31:22):
that students believed repeated statements to be more credible even
if the students were informed that the statements were being repeated.
So you can directly tell somebody, hey, I know I
just asked you if it was true or false that
zebras could automatically detach their own tongues and fling the
tongues at attacking hyenas. I asked you that same thing
three weeks ago. It may or may not be true.

(31:44):
And even in this case, repeating the statement still makes
them judge it to be more true than statements they're
seeing for the first time. So you can warn people
that something fishy is going on and they still fall
for it. So you could straight up share
a piece of just undeniably fake news on social media
and say, hey guys, this is some

(32:06):
fake news, this has been totally debunked, you can
look it up on Snopes, etcetera, and that's still not
going to completely disarm the piece that you're sharing. Well,
I would say, yes, we will
talk more about that in the second episode, where this
kind of thing comes into conflict with real world beliefs.
And just to be clear, I made up that zebra

(32:26):
thing; that wasn't from the Bacon study. Okay,
I thought that would be clear. But that's, number one,
not true; number two, as far as I know, not
one of the examples Bacon used. Right. Well, I'm sorry
you had to drag zebras into all of this, Joe. Well,
you know, I like the idea of a weaponized tongue.
That's gonna go beyond the X-Men. Surely that exists
in reality. Well, yes, but not with zebras. No. No,

(32:48):
I guess that's amphibians and stuff. Okay. So back
to the study. So Bacon says in his abstract, quote:
It was further determined that statements that contradicted early ones
were rated as relatively true if misclassified as repetitions, but
that statements judged to be changed were rated as relatively false.

(33:09):
So even if you misremember that you saw something before,
you're more likely to believe it's true. It's kind of odd.
That makes you wonder, what's
the initial stimulus that caused you to misremember that you
had seen it before? Well, as we've discussed on the
show before, there are multiple ways that false
memories can be encoded. Oh yeah, absolutely. So

(33:32):
Bacon concludes that basically, people are predisposed to believe statements
that affirm existing knowledge and to disbelieve statements that contradict
existing knowledge. That's not all that unusual, right? But
it's specifically the repetition effect that seems to be playing
a role here. Let's take a look at another study.
How about nineteen eighty two, Marian Schwartz, Repetition and Rated Truth

(33:54):
Value of Statements from the American Journal of Psychology. So
Schwartz here conducted two experiments on what psychologists were by
this time calling the truth effect, what we're calling the
illusory truth effect. So in experiment one, you get a
group of subjects and they rate claims on a seven
point truth value scale, just like in the first study,
the Star Wars study, the seventy seven study. And

(34:17):
a different group of subjects rated the same statements on
a seven point scale of how familiar they were with
the statements before the experiment started. How familiar are you
with this? Repetition increased both ratings. So both pre-experimental
familiarity and the perceived truth value
went up when people saw statements more than once. That's

(34:39):
not surprising: again, the replication, and then also the fact
that having seen something before will tend to make
you more familiar with it. Then you've got another experiment here.
The second one replicated the illusory truth effect again, and found that
it didn't matter whether you mixed repeated statements that
people had seen before with new statements or only showed
them repeated statements. Either way, belief in repeated statements went up.

(35:03):
And this was done so that they could rule out
a possibility; they're thinking, you know, maybe it's only by
contrast with new and unfamiliar statements that repeated ones seem
more credible. That is not the case; either way you
do it, if you've seen it before, you believe it more.
And so this study is taken as evidence that the
feeling of familiarity with an idea might be an important part,
or even the most important part, of how we judge

(35:26):
something as true or plausible. But we should shift to
asking the question of why. Why would increasing familiarity with
a statement through repetition make it seem more true to
us? It makes me think about this passage from Wittgenstein in
his Philosophical Investigations, about how absurd it would be to
use repetition of a mental representation as evidence that the

(35:49):
representation is correct. He writes, quote: For example, I don't
know if I have remembered the time of departure of a
train right, and to check it, I call to mind
how a page of the timetable looked. Isn't it the
same here? No, for this process has got to produce
a memory which is actually correct. If the mental image

(36:09):
of the timetable could not itself be tested for correctness,
how could it confirm the correctness of the first memory?
As if someone were to buy several copies of the
morning paper to assure himself that what it said was true.
And that's kind of what we're doing. He's
talking about mental images, but the general point is a
good one. We're essentially buying several copies of the same

(36:32):
newspaper to increase our belief that what the newspaper
says is actually accurate. Now, one possible interpretation that
comes to mind is just, like, the idea of, say,
picking out stepping stones to cross a creek. Right,
you step to one stone and it doesn't slip out

(36:52):
from underneath you, and so you use that to
make your way across the other stones and hopefully make
it across the entire creek without getting your feet wet,
or falling in and being swept downstream to
the waterfall. So to what extent are we just
trusting anything that hasn't resulted in catastrophe thus far? Well,

(37:12):
I would say that it would make more sense for
that to be true of sort of embodied, physical, experimental
knowledge about the world than for it
to apply to semantic knowledge of
things people tell us. Or maybe our brains just aren't
good at differentiating when it comes to semantic knowledge that's imparted through words.
You know, maybe somebody saying all those stones will hold

(37:35):
you up is encoded by the brain in sort of
the same way as testing out one stone at a time. Uh, yeah,
I don't know. So this is what we should explore
for the rest of the episode, I think: why should
repeatedly exposing ourselves to the same information increase our confidence
in it if we didn't have good reasons to believe
it the first time? It's clear that this is what's happening,

(37:56):
but why does it happen this way? All right, we'll
take one more break, and when we come back,
we'll jump into this. Alright, we're back. So
we're asking this question of why repeatedly exposing ourselves to
the same information would increase our confidence if we didn't
have good reasons to believe the information the first time.

(38:17):
It's clear from several experiments that this is what happens
in our brains: if a statement is repeated, we
believe it more. But why do our brains work that
way? It doesn't necessarily make sense. Yeah. And one possible interpretation
that came to mind is, of course, we've touched
on this before, that we're all social animals. Yeah.
So I've wondered if this is a byproduct

(38:39):
of the drive to fit in with a given group
or tribe, that there's ultimately a survival advantage in getting
along with the group, and so does that bleed over
into highly repeated or highly circulated lies or untruths? So
basically, like, if there is a lie going around in
the group, you'll get along with the group better if
you just accept the lie. Yeah. And, you know,

(39:00):
certainly after looking at more of the research, I'm not
arguing that that is the core mechanism involved here.
That is worth exploring. But I
do wonder to what extent it's playing
a role, because we all have our
groups that we are involved in, our friends,
our family, our work groups, our social media groups,
the sort of echo chambers that we find online.

(39:21):
And does it make you more susceptible to the
lie just because there is this ingrained need to
fit in with that group, to share the same
values, and, to put it all in the prehistoric framework,
to continue to have access to the fire and
the feast? Yeah, I think that's

(39:41):
a possibility worth exploring. Let's take a look at it. Okay. Well,
I started looking into this a little bit, and I
ran across a paper titled The Evolution of Misbelief,
from two thousand nine. This was published in Behavioral
and Brain Sciences, and it was by Ryan T. McKay
and Daniel Dennett. Oh, Dennett's in it, all right.

(40:02):
So they approached the following, I guess you could call
it a paradox, in the paper: given that we evolved to
thrive in a fact-based world... What other kind of
world could there be? Exactly. Yeah, I mean, we're dealing
with actual reality here. But given that
we've evolved to thrive in this world, shouldn't true beliefs
be adaptive and misbeliefs be maladaptive? It's clear that in

(40:26):
many cases, probably most cases, that is the way things are. Right.
Believing that you are able to fly off the edge
of a cliff is not good for you. Believing that
polar bears want to cuddle with you is not advantageous.
Holding false beliefs like this doesn't work out well for people. Yeah,
those are reckless and dangerous misbeliefs where clearly,

(40:48):
if you reach the point where you're believing that,
you're going to go extinct. So it's obvious that there
is going to be at least some kind of major
selection pressure for shaping brains that believe
mostly true things, unless there are cases where believing something
that's false outweighs the drawbacks, essentially. So

(41:09):
here's what they wrote, quote: On this assumption,
our beliefs about the world are essentially tools that enable
us to act effectively in the world. Moreover, to be reliable,
such tools must be produced in us, it is assumed,
by systems designed by evolution to be truth-aiming, and
hence, barring miracles, these systems must be designed to generate

(41:33):
grounded beliefs. A system for generating ungrounded but mostly true
beliefs would be an oracle, as impossible as a perpetual
motion machine. I like that. Yeah, So there's got to
be like a grounding procedure through which we can discover
true beliefs if we're going to have them. Otherwise we're
just talking about magic. But we have to account for
these varying levels of misbelief and self deception in the

(41:55):
human experience. They write, if evolution has designed us to
appraise the world accurately and to form true beliefs, how
are we to account for the routine exceptions to this
rule, instances of misbelief? Most of us at times believe
propositions that end up being disproved. Many of us produce
beliefs that others consider obviously false to begin with, and

(42:17):
some of us form beliefs that are not just manifestly
but bizarrely false. How can this be? Are all these
misbeliefs just accidents, incidences of pathology or breakdown, or at
best undesirable but tolerable byproducts? Might some of them, contra
the default presumption, be adaptive in and of themselves? I

(42:38):
like this distinction they're making. I think this is actually useful.
So they're breaking misbeliefs down into two basic
categories. Right. The first: those resulting
from a breakdown in the normal functioning of the belief
formation system. These would be delusions, malfunctions, so things like
face blindness or Cotard syndrome. Okay, this is when

(43:01):
the brain is creating incorrect beliefs because it's not working right,
it's not doing what it's supposed to be doing. But
then the second category are those that are arising in
the normal course of that system's operations, so beliefs based
on incomplete or inaccurate information. This would
be a case of manufacture. And we'll get into examples

(43:22):
of this in a second. There could be tons of examples.
One that comes to my mind that would be an
example of this would be optical illusions. When
you witness an optical illusion, you have a false belief
that has been generated by your brain. But it's not
because your brain is doing anything wrong; it's just because
it's being exploited by a situation that's not part
of what it normally needs to do. Right. Yeah,

(43:44):
they point out that it's easy to think of
these in light of an artifact. Is it failing
due to a limitation in the design, in a way
that is culpable or tolerable? Examples here being, say, a
clock that doesn't keep good time versus a
toaster oven that doesn't keep time at all. You
can't expect the toaster oven to keep time unless it's
got a timer, right. Well, yes, that's true. Yes,

(44:05):
I would have said a purple donkey built by muscular
elves that doesn't keep time, because you wouldn't even expect it to. Yes. Now,
it gets more complicated when you go into
the biological realm, because what counts as immune function, dysfunction,
a pathogen, infection, what have you? Ultimately, the
immune system errs by defending the body against, say, a

(44:26):
transplant organ, and may endanger its survival, because the body
is going to attempt to reject that
heart transplant, even though the heart transplant could save
the patient, will save the patient. So it seems like,
in order to understand this, you almost have to understand
the context, right. Right, and they invoke the
work of Ruth Garrett Millikan, in saying that we

(44:49):
can't look to an organ's current properties or disposition, we
have to look to its history. That makes sense to me.
Organ transplants, of course, are not part of our evolutionary history.
So this is just the body functioning normally in rejecting
the invader heart. Right. The body is not malfunctioning; it is
doing what it's supposed to do. We're just throwing a
situation at it that it's not prepared to deal with. Yeah,

(45:10):
so that brings us to the more human examples, you know,
lies and so forth. Oh, that's interesting. So
a lie could be like a thing that our bodies
were not really prepared to deal with very well, which
is weird to think of because of how common lies are. Yeah.
They write: However adaptive it may be for us to

(45:30):
believe truly, it may be adaptive for other parties if
we believe falsely. Now, of course, just to
interject here, I think this is
something that we ultimately see holds true with other
animals, with the role of deception,
certainly, in hunting, in defense, and even in acquiring mates.

(45:50):
They continue: An evolutionary arms race of deceptive ploys and
counterploys may thus ensue. In some cases, the other parties
in question may not even be animate agents, but cultural
traits or systems. Although such cases are interesting in their
own right, the adaptive misbeliefs we pursue in this article
are beneficial to their consumers. Misbeliefs that evolved to the

(46:13):
detriment of their believers are not our quarry. So they
stress the difference between beliefs and what they refer to
as aliefs. So, for instance, if
I'm freaked out by tall buildings, I might not believe
that I'm going to fall off, but I might
alieve that I'm going to fall off. Alieve, as
in, like, a-lieve? Yes. Yeah. And in this

(46:36):
case it seems to be something that is a
tolerated side effect of an imperfect system. But it's not
McKay and Dennett who end up bringing up the illusory
truth effect; it's the psychologist Pascal Boyer, in commentary on
the paper. This particular paper from McKay and
Dennett, by the way, is available online. I'll try to
include a link to it on the landing page for

(46:57):
this episode. But in his commentary, Boyer writes:
Dramatic memory distortions seem to influence belief fixation. For instance,
in the illusory truth effect, statements read several times are
more likely rated as true than statements read only once.
People who repeatedly imagine performing a particular action may end
up believing they actually performed it. Oh, yeah, this is

(47:19):
something I've read before. Yeah, if you
just, like, have people walk through a task in their
mind and then ask them later if they remember doing it,
a lot of times they remember physically acting it out. Yeah,
I've certainly had this occur with me. Like, there'll be
something I need to do and I'm thinking about doing it,
and then I can't remember if I actually carried it out.

(47:39):
And this is called imagination inflation. He writes:
Misinformation paradigms show that most people are vulnerable to memory revision
when plausible information is implied by experimenters. In social contagion protocols,
people tend to believe they actually saw what is in
fact suggested by the confederate with whom they watched a video.

(48:00):
So he's just listing lots of the ways that
we end up with false beliefs. There's a plethora
of examples of mechanisms for putting false beliefs in our brains. Yeah.
I know there's a lot of territory covered in this
paper and the attached responses, but I keep coming
back to the sort of key reason that I sought

(48:20):
it out, like, when is self-deception helpful?
Is it necessary for the deception of others? It doesn't
quite seem to be. Like, you don't have to believe
the lie yourself to tell someone else the lie, regardless
of what telling the lie repeatedly might do to you. Well, so Boyer
is skeptical of the idea, right. So is he basically

(48:41):
saying, like, you don't want to overstate the adaptiveness
of believing lies? Yeah, he drives home that memory
need only be as good as the advantage in decision
making it affords. Okay, so he's essentially going for the
byproduct thing for most beliefs. He's saying, like, look,
you know, memory needs to do certain things, and in

(49:03):
the course of doing those things, it may generate some
false beliefs. We don't have to assume that those false
beliefs themselves are beneficial. Right. Yeah. And to come
back to McKay and Dennett, they point out that natural
selection doesn't seem to care about truth; it only cares
about reproductive success. So there are various cases where
a particular false belief or misbelief is seemingly adaptive. You

(49:24):
believe in a nonexistent fire god, okay, but say
that its laws inhibit the kind of overt selfish behavior that gets you in
trouble and doesn't work out for you in the long run.
So in that case, you have an adaptive misbelief. Now,
if the fire god were ever to actually appear, then this
would be an adaptive belief. But then there are arguably
a whole host of other false ideas that seem adaptive:

(49:46):
positive self-deceptions about ability, the placebo effect, for instance.
They bring up the self-theories of intelligence, the entity and
incremental views of intelligence, this being like, am
I born with a certain intellect, or do
I develop it over time, and how those different core
beliefs can affect your effectiveness in life. Like, is it, oh,

(50:09):
I've got to work really hard in order to
stay on top of this, or is it a situation
where, oh, I'm brilliant, I can accomplish anything? And
of course, I think you can argue for pitfalls
on both sides. And of course there's always the optimal
margin of illusion in play, which comes to us from
Roy F. Baumeister. And you know, ultimately, crazy

(50:29):
overconfidence, as we discussed, is going
to lead to extinction, right? You don't want to cuddle
the polar bear. Right, cuddling the polar bear, thinking you
can fly, these are going to lead to you falling
off the side of a mountain or winding up in
a polar bear's tummy. Yeah. Now, I could certainly understand
the idea of socially adaptive misbeliefs. I think that

(50:49):
those things definitely do exist, and in some cases there
might be some overlap with the types of things that
get repeated so often. Like, reasons for believing untrue things
can also compound each other. I mean, I'm about
to explain why I think false beliefs gained through exposure
and repetition are not adaptive in themselves. But you can

(51:10):
have more than one reason for believing something that's untrue.
Think about objectively untrue statements that get repeated, as we
were talking about earlier, in a political context. The evidence
shows that we believe them partially because of how often
they're repeated, but there's also social cognition and also identity-protective
cognition. In other words, we tend to believe things

(51:30):
that members of our political tribe and social in-groups say,
and for social cohesion reasons that is adaptive for us.
We also believe things that validate our sense of personal identity.
But I think it's pretty clear that these types
of effects can work in a nasty, perverse tag team format,
boosting and complementing one another. But even if we

(51:53):
put aside these complementary effects, put aside social and
identity-protective cognition, and just focus on
the explanation for the illusory truth effect and repetition, there's
a really interesting thing that comes out. And this is
based on the idea of processing fluency, which is
a concept that is way more interesting than

(52:15):
the name would let you believe. So the dominant explanation
for the illusory truth effect in the psychology literature, which
we're about to get into, fits into this byproduct
category that we were just talking about. Based on all
I've read, it seems the informed majority opinion of psychologists
is that the illusion of truth that we get from

(52:36):
exposure and repetition is an unfortunate byproduct of a generally useful
cognitive heuristic. Now, a heuristic, as we've talked about before,
is a mental shortcut. It's a fast and cheap trick
that the brain uses to arrive at a judgment or
produce some kind of result without using too much effort.
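As a purely illustrative toy model of such a heuristic, one could imagine judged truth rising with how fluently a statement is processed, which in turn rises with prior exposures; the function, constants, and logarithmic shape here are invented for the sketch, not taken from the literature:

```python
import math

# Toy model of a fluency heuristic, invented for illustration: each prior
# exposure makes a statement easier to process, and that ease is (mis)read
# as evidence of truth. Constants and the log shape are arbitrary choices.
def judged_truth(prior_exposures: int, base_rating: float = 4.0) -> float:
    fluency_boost = 0.45 * math.log1p(prior_exposures)  # diminishing returns
    return min(7.0, base_rating + fluency_boost)        # clamp to the 1-to-7 scale

for n in range(4):
    print(n, round(judged_truth(n), 2))  # 0: 4.0, 1: 4.31, 2: 4.49, 3: 4.62
```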
And it's it's worth driving home that our brains need

(52:57):
Yeah, there's only so much power to go around there. So it kind of has to hold everything together with a bunch of cheap tricks. Yeah, so it works something like this. Let's go with it. Assume that on balance,
true statements get uttered more often than lies. As cynical

(53:20):
as we like to be, that's probably true, right? True
statements are generally more useful to people. Also, there's a
sort of convergence effect where there's only one way for
a true statement to be true, but there are lots
of different ways to say a lie about the subject
of that statement. So like, true statements on the subject
are going to be more consistent usually than lies about

(53:42):
the subject because a lie about the subject could be anything. Well, and also, lies in large part have to be believable. Like, think about the various true statements and false statements that might be uttered during the course of a given day at work. Yeah, somebody asks, hey, where's the bathroom? You know, it's a new building, say.

(54:02):
Then they're probably gonna say, oh, it's over there, and they're probably going to tell you the truth.
It generally does not serve people well to lie about the location of the bathroom, right? Because you're gonna find out, and then you're gonna say, hey, why did you tell me the bathroom is over there and not over there? Are you insane? But then some of the false statements you're liable to hear might be, hey,

(54:24):
I don't know, let's see, have you started on that report yet? It's due on Friday. And they'll say, oh, yeah,
I've got it taken care of, I'll get it to you on Friday. You know, there are a lot of statements like that that ultimately you can't really check in on, like you're just gonna have to take their word for it, right, and that kind of lie. Yeah, you'll never find out, you know, yeah, exactly. Or I can't come into work today because I'm sick. Well, all right, you know,

(54:46):
we're not going to ask for a doctor's note. You
might be lying, you might not, but it's just kind
of a gimme in that situation. Yeah, that's another reason
that we're more likely to be exposed to true statements generally,
or at least that we're more likely to detect true
statements generally, because false statements are harder to verify, usually
by design of the person making them. So you're liable

(55:07):
to find yourself in an environment that's mostly built out
of true statements and believable lies. Right. So on this assumption, you know, you're in a hurry, and your brain is not designed to consume infinite energy. It wants to try to be efficient. You don't have time to evaluate all claims rigorously. I mean, no matter how

(55:28):
skeptical you want to be, and we can confirm this, eventually you are just not going to have time to look
into everything you believe. You're just gonna have to take
somebody's word for it. It's not practical to try to
live by verifying every single belief. Oh yeah, I mean, you've just got to have something firm underneath your feet in order to proceed. Oh yeah, you've got

(55:49):
a bedrock. But then, I mean, you take somebody's word on where the bathroom is, like, you're not gonna try to fact-check them,
you know, well, I guess you will by trying to
go there. But other things like that, mundane things people tell you throughout the day, you're just gonna have to believe them. It doesn't make any sense to try to verify all of it because you

(56:10):
don't have time. So therefore, an easy shortcut for assuming that a statement is more likely true is: have I heard this statement before? Statements that get uttered more often are more likely to be from that class of true statements. Okay, I can roll with that.
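(To see why that shortcut can be statistically reasonable, here is a toy simulation. It is our own sketch, not from any study discussed here, and every number in it, the share of true statements and how much more often they get repeated, is an invented assumption:)

```python
import random

random.seed(0)

# A toy world. All numbers are invented assumptions: 1,000 distinct
# statements, 60% of them true, and any given true statement gets
# uttered three times as often as any given lie.
statements = [(f"s{i}", i < 600) for i in range(1000)]
weights = [3 if is_true else 1 for _, is_true in statements]

def utter():
    """One overheard utterance, drawn from the weighted pool."""
    return random.choices(statements, weights=weights, k=1)[0]

# Phase 1: just live life, overhearing 2,000 utterances.
heard = {text for text, _ in (utter() for _ in range(2000))}

# Phase 2: score the "have I heard this before?" heuristic
# on 2,000 fresh utterances.
fam_true = fam = new_true = new = 0
for _ in range(2000):
    text, is_true = utter()
    if text in heard:
        fam += 1
        fam_true += is_true
    else:
        new += 1
        new_true += is_true

print(f"P(true | heard before) = {fam_true / fam:.2f}")         # roughly 0.88
print(f"P(true | never heard)  = {new_true / max(new, 1):.2f}")  # roughly 0.4
```

In a stream built that way, "heard it before" really does predict truth, which is also exactly what makes the shortcut exploitable by anyone willing to repeat a lie.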
Now, there's another type of parallel thinking that says, also,

(56:33):
it's actually more difficult to disbelieve something than it is to believe it. Because, and I don't know if this is really confirmed or if this is just one theory about how information processing in the brain works, but
just as a quick tangent, there is a model of
thinking that says, Okay, to believe a statement is true,
to hear a statement and say I believe it is

(56:54):
just one step in the brain. To hear a statement
and reject it as false is a two step procedure
where first you have to hear it and believe it
to understand it, and then you have to go back
and revise what you just did and say, but it's
not true. Yeah. It's ultimately like a king sitting down at a banquet table. Right, is the king to simply

(57:15):
eat every food item on the plate and trust that he's not going to be poisoned, or is he going to independently test each thing? Have the food taster come up, transfer this goblet of wine into the rhinoceros horn, etcetera. Hold the magic crystal over this plate of beans. And you know,
another thing that came to mind was some of our

(57:37):
discussions we've had in the past about consciousness and imagination
as a simulation engine, that we use our imagination to
mentally simulate possible outcomes so that we can best choose
how we're going to react to the world. And when
I'm presented with something that might be a lie or
some sort of untruth or a bit

(57:57):
of misinformation, I still can't help but imagine it, right? I'm having to create a mental picture of it. In a sense, you're kind of believing it for the moment. Yeah, yeah, because I have to simulate it in my head. And in cases of people who can form mental pictures, you have to form those mental pictures. And I imagine a lot of how this shakes out

(58:19):
after has to do with an individual's particular worldview. But
I wonder if in some cases it's like a type
one error in cognition, you know, a false positive. That I'm imagining this is a possible outcome, and then maybe I'm more inclined to believe it just so that I can keep it from harming me. Yeah.
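(A quick back-of-the-envelope version of that trade-off, with costs invented purely for illustration:)

```python
# Toy error-management arithmetic (all costs are invented for
# illustration). Suppose someone tells you "that berry is poison."
p_true = 0.3            # assumed chance the warning is accurate
cost_false_alarm = 1    # believing a false warning: skip a snack
cost_miss = 100         # dismissing a true warning: get poisoned

# Expected cost of each policy toward the warning:
cost_if_believe = (1 - p_true) * cost_false_alarm
cost_if_dismiss = p_true * cost_miss

print(f"Believe it: expected cost {cost_if_believe:.1f}")   # 0.7
print(f"Dismiss it: expected cost {cost_if_dismiss:.1f}")   # 30.0
# Even at only a 30% chance of being true, believing is far cheaper
# on average. A bias toward type one errors (false positives) can
# pay off when misses are catastrophic.
```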

(58:40):
I think that's a very reasonable way of imagining it.
But so here's where we get into the final part
of our discussion today, which is the idea of what
I mentioned a minute ago, processing fluency. So processing fluency
just means how easy it is to process incoming information.
And you wouldn't believe the research on how many of

(59:01):
our decisions and mental outcomes seem to be based at
least in part on processing fluency. The brain really likes things to be easy. It really likes things to go smoothly, to not be too difficult. So,
to start off, based on existing research, it definitely seems
true that people have an easier time processing statements and

(59:23):
information they've heard before. In fact, Robert, you probably know
this from direct experience. Like a familiar statement, when used
in the context of a sentence or an argument, is
processed quite smoothly, but a new, unfamiliar statement in the
same context often causes you to say, wait, hold on,
back up, I need to wrap my head around this.

(59:44):
Familiar is easy. Unfamiliar is difficult. But how would you test whether the ease of processing information were actually affecting our judgment of the truth of a statement? And I want to get into a couple of quick, really interesting studies on this that were so simple and so brilliant.
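(One simple way, sketched below with invented counts: have people judge statements as true or false, then check whether the rate of "true" judgments in each condition beats the fifty percent you would expect from pure guessing:)

```python
from math import comb

def p_above_chance(true_judgments, total, p=0.5):
    """One-sided binomial test: the probability of at least this many
    'true' judgments if participants were just guessing."""
    return sum(comb(total, k) * p**k * (1 - p)**(total - k)
               for k in range(true_judgments, total + 1))

# Hypothetical results, 100 judgments per condition (made-up numbers):
easy_to_read = 62   # fluent statements judged "true"
hard_to_read = 51   # disfluent statements judged "true"

print(f"easy: p = {p_above_chance(easy_to_read, 100):.3f}")  # ~0.01
print(f"hard: p = {p_above_chance(hard_to_read, 100):.3f}")  # ~0.46
# The fluent condition is judged "true" well above chance; the
# disfluent condition hovers right at chance level.
```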
So Reber and Schwarz did a study in Consciousness and

(01:00:07):
Cognition called Effects of Perceptual Fluency on Judgments of Truth.
They took true or false statements, kind of like in
the studies we've seen before, of the variety Osorno is in Chile, or Greenland has roughly fifty thousand inhabitants, and
they presented those statements to people, and the main independent
variable was that they presented the statements either against a

(01:00:28):
white background in a high contrast, easy to read color,
or in a low contrast, hard to read color, and
apparently that made all the difference in the world. The
idea is that the hard to read one has low
processing fluency, it's difficult, and the easy to read one
has high processing fluency, it's easy to process. And they

(01:00:49):
found that this made a big difference in what people
believed was true or false. Quote: Moderately visible statements were judged as true at chance level, whereas highly visible statements were judged as true significantly above chance level. We conclude that perceptual fluency affects judgments of truth. This is

(01:01:10):
another one that makes sense from a marketing standpoint, right,
just make your message very clear, very easily absorbed,
and people will begin to buy into it. Oh, absolutely,
And this has actually been studied in marketing and consumer preference.
Like there is one study from Novemsky et al. published
in two thousand seven in the Journal of Marketing Research
that, in short, found that consumers more often tend

(01:01:32):
to choose brands that represent ease and fluency. Like say,
if the information about a brand is easy to read,
consumers are more likely to choose that brand; that's the
one they want. So that makes me wonder why Coca
Cola is written in cursive. It seems like you would
want it in just very clear, bold letters. Well, didn't they try to change the can at some point? I haven't really looked at a can recently. Maybe it's not in cursive anymore,

(01:01:56):
you know. Actually, you might have two things in conflict, right? So you could have, if you've got an old logo that people are familiar with but it's hard to read, the hard-to-read part might be undercutting their preference for it, but the fact that it's familiar might be boosting their preference for it.
If you try to change it to something that's easier
to read, the change might introduce more difficulty in processing

(01:02:20):
than the ease of reading would improve processing. Yeah, that
makes sense, all right, So I want to cite one
more study, a study by Christian Unkelbach in two thousand seven from the Journal of Experimental Psychology: Learning, Memory, and Cognition. And Unkelbach does an interesting thing in
the study where he's got a hypothesis he wants to test.
He writes, quote, I argue that experienced fluency is used

(01:02:44):
as a cue in judgments of truth according to the cue's ecological validity, meaning, like, successfulness in the real world. Quote: that is, the truth effect occurs because repetition leads
to more fluent processing of a statement, and people have
learned that the experience of processing fluency correlates positively with

(01:03:04):
the truth of a statement. So this is sort of
what we were talking about earlier. It's a heuristic: you know, you're more likely to encounter true statements in the wild. People learn this through experience, and then they use the cue of processing fluency to be the judge of whether something is familiar or not.
And if it's familiar and they get that processing fluency

(01:03:27):
bump, it's easy to process, then they're more likely to believe it's true, because that's what has worked for them in the past. And if this is true, Unkelbach says, I bet I could reverse it with a little bit of training. And he does. He's got an experiment with a training phase; he actually does three different experiments, and essentially what he does is he trains people

(01:03:49):
in a scenario where things that are easier to process, either because of being easier to read or because of repetition and familiarity, are correlated with the statement being false. And when people get trained in sessions like that, they lose the effect.
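(Here is a toy version of that reversal, a one-weight learner of our own devising rather than Unkelbach's actual procedure, just to show how the same learning rule can flip its trust in the fluency cue:)

```python
import random

random.seed(1)

def train(fluency_means_true, trials=500, lr=0.1):
    """Train a one-weight learner on the fluency cue. This is our
    toy stand-in, not Unkelbach's actual training procedure."""
    w = 0.0
    for _ in range(trials):
        fluent = random.random() < 0.5           # cue: is it fluent?
        # The environment decides what fluency signals this session:
        is_true = fluent if fluency_means_true else not fluent
        x = 1.0 if fluent else -1.0
        prediction = 1.0 if w * x >= 0 else -1.0
        target = 1.0 if is_true else -1.0
        w += lr * (target - prediction) * x      # perceptron-style update
    return w

print(f"normal world:   w = {train(True):+.2f}")   # positive: fluent, so true
print(f"reversed world: w = {train(False):+.2f}")  # negative: fluent, so false
# The same learning rule ends up trusting or distrusting fluency
# depending on which environment it trained in, which is the shape
# of Unkelbach's result.
```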
So the good takeaway there is that if he's correct, it would probably also mean that your susceptibility

(01:04:12):
to the illusory truth effect is dependent on what kind
of environment you've trained in, and that you could potentially
untrain yourself on it. But that would be hard to
do because we all live in this world all the
time where most of the time people are telling us
true things. Right, and again, the brain is still
going to need all of these shortcuts in order to
function properly. Yeah, exactly. But you could just be using

(01:04:34):
the opposite shortcut. Like, if you live in a world where people lie to you all the time, Unkelbach's
results here would suggest that you would eventually adapt to
this and you would instead become exactly the opposite. New
claims you've never heard before would seem more true to you,
and repeated claims that you're familiar with would seem like
lies to you. Okay, so there's hope for us after all. Yeah,
I mean, but we can't expect to live in a

(01:04:55):
world like that, and we don't want to live in a world like that. Like, you don't want to train your brain to live in a world where everything is assumed to be a lie. Oh, and surely somebody has considered exploring this in fiction. It would be a delicate affair to really put it together and make it work on paper.
But it's a world that I don't want to live in,

(01:05:17):
but I kind of want to visit fictionally. Oh yeah, I'd go there with you. That's a good one to come back to. But just as a quick note before we close out today, I think this idea of processing fluency is a really interesting one. There's tons of research on it. Like, there is a study I found by Sascha Topolinski in Cognition and Emotion

(01:05:38):
about how processing fluency affects how funny we find jokes. Apparently, if a joke is easier to process, if we've got high processing fluency on the joke, we think it's funnier. I guess it just, like, feels good to get it with less effort or something. So there were multiple experiments, but basically, here, let me give

(01:05:59):
you a quick preview. Gonna say a word, Robert: peanuts. Do you like that word when you think about it? Peanuts? Peanuts? It's pretty good. It's not the funniest word, it's no cheese. But I like it. Okay, I just said that word. So one example of this type
of study would be if you prime somebody with significant
nouns from the punch line of a joke fifteen minutes

(01:06:19):
or even up to just one minute before you tell
them the joke, people find the joke more hilarious. However,
if you tell them a significant noun from the punch
line immediately before the joke, they find the joke less funny,
and the author thinks this is probably because, if you tell them right before the joke, it sort of spoils the punch line.

(01:06:40):
But knock, knock. Who's there? Cash. Cash who? No thanks, I prefer peanuts. Ah, see, it works. But you already established peanuts, so it helped, right. I tried to let a minute or so elapse there. I don't know if it worked. Well, it's also complicated because we did bring up peanuts and peanut butter earlier in the episode. I didn't even think about that, but

(01:07:02):
this, actually. I am not a student of stand-up comedy by any stretch of the imagination, but I watch enough stand-up to see that common structural tool that they use where you have the callback to a previous joke, and they'll often do it right at the end, and then it's good night, everybody. That's the high note. And it's not even necessarily

(01:07:23):
like a callback to the funniest moment in the bit or the funniest bit in the stand-up performance, but just the fact that they've brought your mind back to it, it generates laughter, and it's the moment to end the show on. Yeah.
The theory is that it's very satisfying to have a joke where you've been primed for the

(01:07:45):
punch line already, because it's so much easier to get the punch line quickly and have that experience of familiarity in the aha moment, because when you say a word
and then you say the word again later, the second
time you hear the word, you've been primed, like, you know,
it's more fluid. So yeah, I think that may very
well be going on with callbacks. Another part of the

(01:08:06):
same study was that, like the studies we've been seeing before,
jokes presented in an easy to read font were rated
as funnier than jokes presented in a really hard to
read font. That's kind of not surprising, but processing fluency
plays into all this stuff. Like, there is research
about how opinions that are repeated more often, even just

(01:08:26):
by a single person in a group, come to seem
more prevalent in a group. So you've got ten people
standing around, and then you've just got Jeff over here,
and Jeff keeps saying the same opinion over and over again,
even if you're aware it's just Jeff saying it. In the end, if he does that, you will think that opinion is more prevalent in the entire group, that more people hold it. Well, that would make sense. You

(01:08:48):
have one person in a group who, say, continually trashes on the movie Aliens. Oh no, why would that happen? I don't know, but let's say it happens. You know, I could see where it could reach the point where you're kind of like, I don't really know how I feel about Aliens now, because I sure do hear Jeff talking trash about it all the time. Or you could walk away from it being like, man, I don't understand all these people who hate Aliens, even

(01:09:09):
though it's just one person. Yeah, that seems to
be something that would go on. Processing fluency also appears
to have something to do with aesthetic pleasure. There's been
a lot of research and theory about this, that a major component of what feels aesthetically pleasing to us is based on what's easy to process. Another part is that processing fluency apparently affects how credible a face looks.

(01:09:33):
So, are you going to believe somebody? Well, it turns out if their face is easier to process, especially because you've seen it a bunch of times before, you're more likely to believe it. Even if they're not famous and they're not somebody you know, not somebody you've had experience with whose credibility you can judge. Just random faces shown to you in different sessions of an experiment: if you've seen them before,

(01:09:55):
they're more credible. Of course, that reminds me of various
experiments over the years involving the believability of people with beards. Are people with facial hair or beards harder to process? I have not looked into it recently, so I don't know if there are any more recent studies that crack this nut. But there have been studies in the past where they

(01:10:15):
make the argument that, yes, with an individual with a beard, you're going to have a little more distrust towards them. Well, obviously nobody should trust me. Well, no,
we trust you because we know you, Joe. Yeah, do you really? Do you ever really know someone? Well, I'll tell you one thing I know, and that's peanuts. Stuff, you got me. You got me there, you made me laugh, and my joke didn't make you laugh. Okay, okay,

(01:10:38):
So we gotta wrap up there. We've gone long here,
but we'll be back in the next episode to explore more recent findings and some of the ways that the illusory truth effect really does matter in our political and social world. But the main takeaways I would say today are that the illusory truth effect is real. Exposure and repetition really does change our beliefs.

(01:10:59):
The illusory truth effect is small, meaning it doesn't automatically
overwhelm other criteria in our decision making and judgment. In fact,
in many cases it appears that whether or not a
statement is actually true is more important to our judgment
than whether or not it's repeated or made easier to read,
or any of these other processing fluency boosts. But on average,

(01:11:21):
over lots of repetitions, it's easy to see how this could have a big effect, especially when you bring it back to propaganda purposes, on things we believe as a society, things that shift voting patterns in small but significant ways, and stuff like that.
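(As a last toy illustration, with a per-exposure nudge invented purely for the arithmetic, here is how small repeated boosts compound:)

```python
# Toy model of a small per-exposure nudge compounding over time.
# The 2% shift per repetition is an invented number, purely
# illustrative of "small effect, many exposures."
belief = 0.20   # initial credence in some false claim
nudge = 0.02    # tiny boost in perceived truth per exposure

for exposure in range(1, 31):
    belief += nudge * (1 - belief)   # diminishing-returns update
    if exposure in (1, 10, 20, 30):
        print(f"after {exposure:2d} exposures: credence = {belief:.2f}")

# after  1 exposures: credence = 0.22
# after 10 exposures: credence = 0.35
# after 20 exposures: credence = 0.47
# after 30 exposures: credence = 0.56
```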
Yeah, that's the key: it's not occurring within a vacuum. It's affecting and being affected by all these other mental

(01:11:42):
processes and factors that are affecting our decision making and worldview. Totally. But we will get more into
that in our next episode. In the meantime, be sure
to check out all the episodes of Stuff to Blow
Your Mind at stuff to Blow your Mind dot com.
That is the mothership. That is where you will find everything,
as well as links out to our various social media accounts.

(01:12:03):
If you want to support the show, we always urge
you to leave us a positive review, leave us some
stars or whatever the rating system is. Just rate and
review us wherever possible. Big thanks as always to our
excellent audio producers Alex Williams and Tari Harrison. If you
would like to get in touch with us to let
us know your feedback on this episode or any other,
or to let us know a topic you'd like us

(01:12:24):
to cover in a future episode, you can always email
us at blow the Mind at how stuff works dot
com. For more on this and thousands of other topics, visit how stuff works dot com.

