
August 31, 2019 47 mins

If a lie is repeated often enough, are we more likely to believe it? Sadly, the answer is yes. Psychologists call it the illusory truth effect, and it influences both our daily lives and the larger movements of politics and culture. Join Robert and Joe for a two-part discussion of untruths, the human mind and just what you can do to fight the big lies at work in your world. (Originally published July 12, 2018)


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Hey you, welcome to Stuff to Blow Your Mind. My name is Robert Lamb. And I'm Joe McCormick, and it's Saturday, time for a vault episode. This episode originally published July 12, 2018, and this is part two of our exploration of the illusory truth effect. That's right. In this one, we will land the plane for you and hopefully give you some tools that you might be able to employ to

(00:28):
fight the power of the illusory truth effect. Or at
least, that's the intention. All right, let's dive right in. Welcome to Stuff to Blow Your Mind from HowStuffWorks.com. Hey you, welcome to Stuff to Blow Your Mind.

(00:48):
My name is Robert Lamb and I'm Joe McCormick, and
we're back with part two of our exploration of the illusory truth effect, probably the liar's best trick. If you haven't heard our last episode, you should probably go back and listen to that first. But if you haven't yet, or if you have, let's just do a quick recap of what
we talked about last time. We discussed all of the

(01:09):
research on this thing that's sort of been part of folk wisdom: that if you say something, and you repeat it, and repeat it, and repeat it, people become over time more likely to believe that thing, and that is thoroughly validated by experimental research. Right. And we also talked a little bit about why it even makes sense that we would come to believe things that were

(01:30):
not true about the world that we live in just
because they were repeated. Yeah. And so the basis that we ultimately ended up on last time, the one that seems to be favored by most of the psychologists who study this, is based in the idea of processing fluency. One researcher we talked about last time came to believe that it was because of conditioning based on

(01:51):
real-world effects. But for whatever reason, we tend to associate things that are easy to process, things with high processing fluency, with truth. So if something's easy to read, we think it's more true. Or if something is an idea we've seen or heard or encountered before, it's easier to process because of familiarity, so we believe that it is

(02:13):
more likely to be true than if we're encountering it for the first time. But of course, in all of this, extreme implausibility is going to be a boundary condition that's going to kick in. So this is like the "Ted Cruz is the Zodiac Killer" level of implausibility. Which is false just because the ages don't match up, right? Well, that, and it's just kind of like, all right, I'm not

(02:33):
believing that; that sounds ridiculous. But some people do believe that. So your boundary condition may not be where somebody else's boundary condition is. Well, the boundary conditions will vary from individual to individual. So, yeah. So the question that we should address to start off in this one is: in the last episode, we discussed how this effect has been thoroughly validated in the lab. But here's a question.

(02:56):
Does it work in the real world, and is it really all that powerful? Like, a lot of researchers seem to assume that, surely, if you already know something about a subject, repetition of a contradictory false statement wouldn't actually undermine your real knowledge, would it? Surely, they would tend to assume, this illusory truth effect only works

(03:18):
for statements that we're uncertain about to begin with, and statements that seem highly plausible. Like, if you didn't know anything about either Ted Cruz or the Zodiac Killer, really, then you would just sort of say, all right, maybe that's possible, whereas an individual who has read multiple books on the Zodiac Killer would say, no, that doesn't match up. That is just ridiculous. Yeah. So

(03:39):
that's the assumption. But unfortunately, some more recent research has really turned that assumption on its head. So I want to talk about an important recent study on the illusory truth effect that, well, it's a bearer of bad news. The study is from the Journal of Experimental Psychology: General, by Fazio, Brashier, Payne, and Marsh, and it's

(04:03):
called "Knowledge Does Not Protect Against Illusory Truth." So they pointed out that the illusory truth effect that we talked about last time, based on processing fluency, is widely accepted, well established. But it had been previously thought that this
effect was constrained by a few things. Now, one constraint
shown to actually exist in the literature is recollection of

(04:25):
the quality of the source of the information. So previous
studies have shown that if you specifically remember where a
statement came from, and you consider the source of the
statement a dishonest or untrustworthy source, that can produce kind
of a reverse truth effect, where repetition of a statement
known to come from a liar or an untrustworthy source

(04:47):
causes us to disbelieve it. So this sounds like it should be good news, right? Right, yeah. It ultimately comes down to the question: did I hear that on the radio, or did I see it on a T-shirt? Yeah, or was this the cover of the National Enquirer? Like, you remember that's where it came from, and, you know, that's an untrustworthy source. So it actually has the reverse effect.

(05:08):
You hear that repeated and it makes you go no, no, no,
that's not true at all. But this isn't as much
of a protection as we think, because honestly, how well
do you remember the exact source of every bit of
semantic knowledge in your head? Well, no, Bat Boy did not come from the New York Times. But there are lots of other things that are in your head that
lots of other things that are in your head that

(05:30):
did come from the cover of the National Enquirer, and
you don't remember that that's where it came from. I guarantee it; you've stood in line at the grocery store. Well, if it's a story about any particular aged celebrity's brave last days or sad last days, it probably came from the Enquirer. But yes, there are probably

(05:51):
some stories in there that I would definitely not be able to pin down to the Enquirer versus other sources. Robert, I see right through your bravado. Some Enquirer stories have gotten through to you. Yeah. Other studies have backed this up. After just a period of a few weeks, what may have once been stored in the brain as a false claim by an untrustworthy source could potentially, over time,

(06:12):
become just a familiar statement I remember, which, of course, once it's familiar, translates into "more likely to be a true fact." There was at least one study that looked into this, by Begg, Anas, and Farinacci in 1992, called "Dissociation of Processes in Belief: Source Recollection, Statement Familiarity,

(06:32):
and the Illusion of Truth." And basically they found that when the source of a claim is not super memorable as unreliable, familiarity can be more important than truth or reliability. Okay, so it's not necessarily, like, a magazine that has a negative reputation in your mind, but it's not something that's completely reputable either. It just kind of falls

(06:54):
in between. Or, even if it has a negative reputation
and it's just not all that memorable, you can lose
track of where it came from and it will suffer
from the illusory truth effect. This can happen even when
you should have remembered that it came from an untrustworthy source.
There are exceptions when the source is really memorable, but
a lot of times it doesn't protect you. Now, the

(07:16):
second assumption about constraints on the illusory truth effect is about knowledge. Right. We've all got knowledge already in our heads, and the idea is that pre-existing knowledge will protect against the effect. And this is what came under scrutiny in this particular study by Fazio and her co-authors. So,

(07:36):
despite being an assumption repeated again and again in the illusory truth literature, very few of the studies actually bothered to test whether knowledge protects people. It was just sort of asserted to be true as if it were obvious. And the few that did bother to test it in any way generally did so by testing how the effect presented in people who claimed subject-area expertise. So these

(08:01):
studies yielded contradictory results. But here's a couple of examples. Srull in 1983 found that if you rate yourself as an expert on cars... Robert, would you rate yourself as an expert on cars? No, but some people would. Some people around the office would. Yeah. Car experts, Srull found, suffered smaller illusory truth effects than non-experts on

(08:21):
car trivia. So that would suggest, okay, knowledge gives you a little bit of an edge. You're not as susceptible as amateurs. And then Parks and Toth in 2006 had people rate claims about known versus unknown consumer brands, and the illusory truth effect was bigger for statements about brands that people were unfamiliar with.

(08:42):
That makes sense. So, like, if you didn't already know anything about this brand, you were more susceptible to the illusory truth effect on statements about the brand. Yeah, that makes perfect sense. On the other hand, Arkes, Hackett, and Boehm in 1989 found the opposite: that the higher a person rated their expertise in a subject, the more susceptible they were to the illusory truth effect in that subject area.

(09:06):
Makes you wonder if there's, like, some kind of insecurity or identity-protective thing going on there. Yeah, like, I don't want to be wrong, so I'm just gonna nod my head in that situation. I don't want to look bad; I've already staked my reputation on being a car expert. Also, Boehm found that psychology majors showed a larger illusory truth effect

(09:28):
on psychology statements than non-majors. But there are some issues with these studies. So Fazio and her co-authors
point out that these types of tests don't actually manipulate
direct knowledge of whether the statements are true or false,
just sort of the perception of related knowledge. So they
wanted to test this directly. They created a big list
of statements like we've seen in these other tests, where

(09:51):
you'll have true statements and false statements, and they based this off existing lists of facts that have been shown in previous studies to be either generally known or generally unknown. And this created four categories of statements: you've got known truths, unknown truths, known falsehoods, and unknown falsehoods. Here's some examples. You've got a known truth, quote: the Cyclops is the

(10:13):
legendary one-eyed giant of Greek mythology. Robert? Checks out. Checks out. Okay, how about: the Pacific Ocean is the largest ocean in the world. Checks out. Then you go into known falsehoods: the Minotaur is the legendary one-eyed giant of Greek mythology. Absolutely not. The Atlantic Ocean is the largest ocean in the world. And most people are expected to know that these are not true statements. Then

(10:36):
you've got unknown stuff. Here's an example of an unknown truth: Billy the Kid's real last name. What was it? It's Bonney. Unknown falsehood: Billy the Kid's real last name is Garrett. Yeah, that would have been a toss-up for me, because I did not know Billy the Kid's last name. I thought maybe it was "Kid," you know, as

(10:57):
in Kid Rock. As in, Kid Rock's first name is Billy and, like, his middle name is "the." So there you go. So, experiment one, using this set of statements: forty students. In the first phase, subjects were shown a subset of statements from the list, of all four types, and they were just asked to judge how interesting the statements were. You know, that sounds like a really fun task, right?

(11:19):
Billy the Kid's last name is Bonney. How interesting was that? I guess more interesting than some names? Yeah, maybe, I guess. I don't know, I didn't find that one that interesting. I don't know, I guess it sounds maybe a little odd for what, based on the photos, needs to be kind of an ugly-looking, you know, Western outlaw. It makes me think of like a Robert Burns

(11:41):
kind of poem thing, you know, a bonnie glen. Whereas Garrett, you know, has kind of a guttural sound to it. Yeah. Garrett, right. Okay. So then there's the second phase.
This happened immediately after the first phase. Students were given
another subset of statements from the list, again all four
types of statements, and they were warned that some statements
were true and some were false, and they were also
warned that they would see some repeats from the list

(12:03):
that they had just reviewed for how interesting they were,
and then they rated the claims on a scale of
one to six about how true they were. There was also, at the end, an open-ended knowledge check test. It had open-ended questions like "What is the world's largest ocean?" and "What is the one-eyed monster of Greek myth?" to strengthen the experimenters' picture of the individual knowledge of each participant. So then you

(12:26):
got the results. First of all, the original findings of
the illusory truth effect were replicated. Repeated statements got higher
truth ratings than new statements that the students had never
seen before. But also, quite surprisingly, knowledge did not seem
to prevent the illusory truth effect. Statements about both previously
known and previously unknown facts were rated more true if

(12:50):
they were repeated than if they were new. In other words,
repetition increased perceived truthfulness, even for contradictions of facts that
you know. So I want to quote from the authors, quote: reading a statement like "A sari is the name of the short pleated skirt worn by Scots" increased participants' later

(13:11):
belief that that statement was true, even if they could correctly answer the question "What is the name of the short pleated skirt worn by Scots?" Isn't that bizarre? So, like, you ask somebody, what is the short pleated skirt worn by Scots, and they answer "kilt." But if you show them the phrase "A sari is the name of the short pleated skirt worn by Scots" and then

(13:34):
show them the phrase again later, they will take the repeated phrase as evidence that that statement is more true than if they saw the statement for the first time. Again, it comes back to the shortcuts that our brains make. How weird is that? That's bizarre. I mean, again, it's kind of a reminder that human culture and human language just complicate everything. Yeah, it's crazy.

(13:58):
So, again, they found that the repetition effect also emerged for truths. So it wasn't just false statements, it was true statements too. Whether it's true or false, if you repeat it, people believe it more. So the takeaway from this first experiment is: whether a statement is true or false, and whether you already know better or not, if somebody repeats the statement to you, on average, you're

(14:19):
more likely to believe it. And then the second part
of their study was kind of interesting. So they're discussing their own finding, and they say, quote: the data suggests a counterintuitive relationship between fluency (remember, that's processing fluency, how easy it is to process information) between fluency and knowledge. Prior work assumes that people only rely on

(14:40):
fluency if knowledge retrieval is unsuccessful, i.e., if participants
lack relevant knowledge or fail to search memory at all.
Experiment one demonstrated that the reverse may be true. Perhaps
people retrieve their knowledge only if fluency is absent. So
to test this out, they did a second experiment, and
they repeated a modified version of the experiment to test

(15:02):
it. They believe that the results indicate that people sometimes use a fluency-conditional model, which means they would rely on fluency even if knowledge is available to them. You start with fluency, and if fluency fails, you fall back on what you actually know. We shouldn't over-interpret it, but in a limited way, there may be processes in

(15:24):
the brain that say, I'm going to go for what
feels easy before I even check my memory to see
what I know. Which kind of lines up with the mind's tendency to want to offload memory to people and gadgets. Like, do I have to remember that anymore if the machine is going to do it or my spouse is going to do it? And the brain says, no, I think we'll just completely prune that section.

(15:47):
Here's a question: how often have you used a calculator to do math that you could easily do yourself? You know what I mean? Like, not problems that would be really hard, but something that, if you just took ten seconds, you could probably solve in your head. Yeah.
I do that in Dungeons and Dragons sometimes when we
get into hit points and whatnot. You know, I could

(16:08):
certainly do it easily. I could either do it in my mind or just do it, you know, with pen and pencil real quick. But I'll go ahead and type it into my calculator just to, yeah, get it done. I've done the same thing too. It's weird. It's a little disturbing. Or why, with search engines, you know, just throwing in a mathematical equation, something really simple, such

(16:29):
as just determining how old a particular actor is or
how old they would have been during a certain movie.
I feel like I do that all the time. Like, you're saying you do that even though you could easily know the answer if you checked your own memory? Mm. I feel like I do that less with search engines. Like, I definitely do the calculator thing. Yeah, not so

(16:51):
much that I would remember, say, how old Robert De Niro was during Godfather II, but I would suddenly wonder how old he was, and so do the simple mathematical operation of, you know, subtracting one year from the other. Let's plant a lie in everybody's mind right now: Robert De Niro was four hundred and twenty-three years old when he did Godfather II.

(17:12):
And now you'll remember that. That's implausible; that's the implausibility barrier in action. Oh yeah, maybe I should have done something else. Yeah, we'll come back to that. But anyway, so the conclusion of this experiment by Fazio and co-authors is that, quote, participants demonstrated "knowledge neglect," or the failure to rely on stored knowledge in the face of

(17:33):
fluent processing experiences, so they'd rather go for what was
easy to process than what was the correct answer based
on their own knowledge. At the same time, it's really
important to note that this doesn't happen every time, it
doesn't happen with every person, it doesn't happen with every question,
and it doesn't necessarily happen with huge effects, so the
effect is relatively small. This was actually pointed out pretty

(17:57):
well in a 2016 BBC article by Tom Stafford. He pointed out that, while repeated exposure to statements increases their believability, the biggest influence on whether a statement was rated true or not was whether it was actually true.
So the illusory truth effect is valid, and it
does change the averages of the answers, but it's not

(18:18):
like the only thing that matters, and it doesn't overpower
our real knowledge about the truth. It's just weird that
it does have some effect in the face of actual
knowledge we have, when actual knowledge should mean it has no effect. Does that make sense? Yeah. Again, I just come back, you know, to the fact that the mind is going to offload whatever information

(18:40):
it can, or whatever processing it can. Yeah, those lazy brains of ours. Okay, well, we should take a quick break, and then when we come back, we will discuss more recent research on the illusory truth effect and some related concepts, and what it means for our lives. Alright, we're back. So, we've discussed the subject of false memories before, and the many ways in which false

(19:01):
memories can form. Psychologist Daniel Schacter identified seven, in fact, in his work The Seven Sins of Memory: transience, absent-mindedness, blocking, misattribution, suggestibility, bias, and persistence. And I like to think of it this way: memory is not something that is carved in stone, but rather

(19:22):
something that is sculpted from clay, and the clay of memory remains malleable every time we retrieve it from the drawer and handle it. As psychologist Pascal Boyer, whom we referenced in our last episode, pointed out, examples of this range from word-list recall intrusions in experiments to therapy-induced imaginings of past lives and/or ritual abuse, which we've

(19:45):
discussed on the show before in past episodes. So, memory retrieval is a very delicate stage. There's actually a line from the television series The Expanse that I think captures this perfectly. The character Miller, played by Thomas Jane, sums this up rather perfectly. He says,

(20:05):
you know, every time you remember something, your mind changes
it a little, until your best and worst memories are
your biggest illusions. So, in the 2011 paper "Remembering Makes Evidence Compelling: Retrieval from Memory Can Give Rise to the Illusion of Truth," from Jason D. Ozubko and Jonathan Fugelsang, the authors conclude that, quote, memory retrieval

(20:27):
is a powerful method for increasing the perceived validity of
statements and subsequent illusion of truth, and that the illusion
of truth is a robust effect that can be observed
even without directly polling the factual statements in question. Whoa. So this is sort of the same effect, but not with statements coming in from the outside. Right. So they conducted

(20:48):
a 257-person study, all individuals from the University of Waterloo. So, you know, a relatively small study, and they admit that they, quote, "may have made it particularly difficult to observe any differences between our control condition and our experimental conditions." So, as always, you know, more studies are required. But here's how

(21:08):
it shakes out. Quote: if this account is correct, the current work demonstrates that information retrieved from memory can not only be viewed as relatively more important than more difficult to
retrieve information, but can also be viewed as more important
than information that is explicitly provided. In particular, information that
is retrieved from memory may actually be more fluently processed

(21:30):
in general than information that is directly perceived. So the idea here is that the repetition entailed in memory retrieval need not be from an external source. It can be internal, in the form of memory retrieval, which is, quote, "naturally more familiar and fluent than information that is perceived." Wow,
that is profound, actually. Like the idea that

(21:53):
the haze of your memories is greater evidence sometimes, to your own mind, than what's in front of your eyes right now. Yeah, and it means that, for the lie or the untruth to resonate, it only needs to be memorable, like something that you'll continually retrieve. Oh yeah,

(22:15):
and that serves as a form of repetition. Oh, and this is so true of so many of these lies that get repeated so often in public conversations. It's the really memorable, weird, outlandish ones that stick around. Think about, in the last episode, we talked about the belief that's still so common that Barack Obama

(22:35):
was born in Kenya. Yes, there's no evidence of it, and it's such a weird thing to suggest that it sticks in people's brains, right? Yeah. And then you keep coming back to it. You keep rethinking it. I guess we just made you think of it again. Yeah, that's the horrible thing about this. We'll have to have a discussion about that at the end of the episode.
Another way of looking at it is this. So, if

(22:56):
you're a regular listener to this podcast, if I were
to remind you in every episode that Joe drinks a
full cup of coffee every morning before he gets out
of bed, that's not true. That's a lie that I
just made up. But if I repeated it in every episode,
and even if Joe said it's a lie, you're hearing
it enough, right, that the repetition is going to

(23:16):
potentially influence you. And it's also a perfectly reasonable lie, right? There's no, like, if you said, oh, that's actually what I do, nobody would think you weird or anything. Right, it'd be kind of weird that I drank it without getting out of bed. Well, I assume somebody brings it to you. I mean, I didn't say that you had the coffee machine set up on the nightstand. A coffee robot that pours coffee on my face

(23:37):
every morning. But what if, instead of saying this lie every episode, what if just once I told everybody that Joe McCormick, before he gets out of bed in the morning, shoots back three six-hour energy drinks, one after the other. No! Why did you do that to me, Robert? Like, but that's potentially more

(23:58):
memorable because it's a little strange, it's maybe a little
more funny, and therefore it's exactly the kind of untruth
that might pop up again. Like, you're thinking of Joe, you're hearing Joe talk, and you're like, oh yeah, Joe's shooting back six-hour energy drinks first thing in the morning. I don't do that either! Come on. But yeah,
I totally see your point, and I think you're absolutely correct.

(24:19):
So what they're saying here is essentially that there is
an illusion of truth effect, not just for statements you
hear from the outside, but from your own memories. Every
time you go back and check in with the memory,
you're reinforcing it and making it seem more true, even
if you didn't necessarily believe it to be true in
the first place. Yeah, and you know, they don't really
get into this, but it also makes me think of

(24:39):
like just negative things people might have said to you in the past. You know, some criticism that is not accurate, but it steams you, and then you end up reflecting on it, perhaps even traumatically, and then it makes you more susceptible to its power. Well, yeah, I mean, as always, you have that fear that all criticisms

(25:00):
of you are accurate. Now I'd like to turn to another paper here, this one with the title "Making Up History: False Memories of Fake News Stories," and this is from Europe's Journal of Psychology, from 2012. And it's worth noting that, as a 2012 paper, this predates the more recent usage and politicization of the term "fake news." So

(25:24):
in this they wanted to see if false news stories
that were familiar would result in the creation of false
memories of having heard the story outside of the experiment.
So they had a small study here: forty-four undergraduate psychology students, participating in exchange for course credit.
They exposed the participants to false news stories that they
portrayed as true, and then five weeks later, the participants

(25:46):
were found to be more likely to rate the false
news pieces as true than test subjects who were only just then exposed to the stories. The authors write: these results suggest that repeating false claims will not only increase their believability, but also result in source monitoring errors. So
again we get back into this situation where

(26:06):
you have this headline or this news story popping around
in your head, but you ask yourself, where did I
hear that? Was it a talk show, a radio talk show? Was it the BBC? Was it a verified news source in my Facebook feed? Or just some dubious bit of news that's kind of passing through? Oh, and by the way, the author, not authors, on that particular paper is

(26:28):
Danielle C. Polage. Yeah, this really makes me think about
how, I don't know, I wonder how the Internet has changed the way we think about sources of information. Like, has the Internet, and say, social media feeds, made us more scrupulous about the sources of information or

(26:48):
less scrupulous? I don't know. Or maybe it's had a, you know, divergent effect on different people. Well, I think you do have sort of two different timelines going on there, because I feel like, on one hand, you have the industry responding. You have, like, Facebook, for instance, responding to criticisms and an overall need for better sourcing and attribution of publication sources.

(27:12):
And then also, I think every individual is probably going
through this, the situation where perhaps they're more trusting, and then they realize, oh, I really need to be better about seeing where I'm getting my information, and then they have to self-correct. Now there's another paper that gets into some of this here, and this is a forthcoming paper from the Journal of Experimental Psychology: General. Now,

(27:33):
we should just note, as a caveat for the listener, this is a forthcoming paper, so take with a grain of salt that it has not yet fully passed all of the pre-publication review procedures. But it's been put out there and people have been talking about it. Yeah, it's titled "Prior Exposure Increases Perceived Accuracy of Fake News," and key here in all of this is, quote, "fluency via prior exposure." They say that even a

(27:56):
single exposure increases subsequent perceptions of accuracy. Quote: moreover,
this illusory truth effect for fake news headlines occurs despite
a low level of overall believability and even when the
stories are labeled as contested by fact checkers or are
inconsistent with the reader's political ideology. Also, key here

(28:18):
is the extreme implausibility thing that we've been discussing, you know, this boundary condition on the illusory truth effect. Only a small degree of potential plausibility is sufficient for repetition to increase perceived accuracy. How small? Well, I imagine that's going to vary from individual to individual. Right, we come back to this. We mentioned earlier that my

(28:39):
boundary condition is not gonna be the same as yours. Yeah, that's a weird thing to wonder about. So, like, you might say that for one person, if you showed them a headline about Bat Boy, that wouldn't even register as possibly true to begin with, so they're never gonna believe it's more likely to be true later.

(29:00):
But somebody else might. And a lot of those other types of headlines, like weird, you know, kind of nasty rumors about celebrities or politicians, a lot of those that are slightly more plausible than, say, Bat Boy, are probably gonna stick in a lot of people's minds.
I think about the way that news feed algorithms keep
popular stories in front of your eyes on social media.

(29:22):
If you keep coming back and scrolling, the most popular
fake news stories do tend to show up again and
again and again. Yeah, and then hopefully people are shooting
it down again. But even then, it's going to have a limited effect, based on this particular study here. Yeah, so it's worth remembering that these effects are small, but small effects can add up. Quick example: one of these

(29:44):
fake headlines that they looked at here, it was
this ridiculous story and it's totally untrue. Originally five percent
believed it was true. The second time people saw it,
ten percent believed it was true. So that might sound small,
but aggregated over whole populations with lots of manipulative false
stories and lies, this kind of thing could have huge effects.

(30:05):
It could swing an election in a country, It could
tip public opinion on an issue from a minority opinion
to a majority opinion. It could have real effects in
the world. Yeah, you're gonna have more than one of these going on at a given time. Some of them are gonna catch on, some of them are not. But add them all together and they could have an effect.
So I think maybe we should transition to talk about

(30:26):
what we should do, both as receivers of information trying
to figure out what's true, and as purveyors of information
who, you know, have public conversations. What should we do in order to try to avoid creating widespread misbeliefs, knowing what we know now? Well, let's receive an

(30:47):
advertisement and then come right back with an answer to
that question. Alright, we're back. So one of the first questions I think we should ask is: what can you do about this? So, say you've listened to these past couple episodes and you're like, wow, okay, I accept that I'm susceptible to the illusory
truth effect. I know that being exposed to an untrue

(31:10):
statement or hearing an untrue statement repeated, is going to
probably make me more likely to believe it. How can
I protect myself against it? Especially given that we've seen
all these studies showing that various things apparently don't protect
you or don't necessarily protect you. Knowing otherwise isn't even
necessarily going to protect you. And I've felt that before, Robert.

(31:31):
I don't know about you, but like, there are cases
where I'm confident that I actually know what's true. I've
done the research, I know what reality is, and yet
seeing a lie that exists in contradiction to what
I know, over and over and over again actually does
work on me. I can feel it working on me.

(31:52):
I can feel doubts setting in. When I see a
lie repeated with great frequency, I start to wonder, like,
is it true? I mean, I've checked it out before and there's nothing to it. But maybe I missed something, maybe there's some new information I'm not privy to. Yeah,
So I really do feel it working on me, even
though you know I'm somewhat aware of this, and so

(32:14):
it can be difficult. It can be hard to know
what to do to protect yourself. But here's one thing
I want to offer as a general rule: a huge red flag for judging a statement's truth or falsehood is "I feel like I've heard that somewhere before." And I do this. You know, I fall prey to this. I do it all the time. Actually,

(32:35):
in a conversation, I think something's true because I have
exactly that feeling. I feel like I've heard this somewhere before.
I would say, if it feels familiar, but you can't
recall why it's true, and you can't recall the source
of where you heard it, you are in the danger zone.
That is the red zone for

(32:55):
repeating and reinforcing a false belief. So I think maybe
we should try a little experiment. Let's do it. Let's
repeat something a bunch of times and see if it
sets in. So here's the phrase: if it feels familiar, check the facts. If it feels familiar, check the facts. If it feels familiar, check the facts. If it feels familiar,

(33:17):
check the facts. If it feels familiar, check the facts. Death to Videodrome! Long live the New Flesh! All right, well, we've done it, Joe. I think we've won. No, we haven't won yet. There's actually some more stuff we've got to talk about. So, one of the other studies we looked at was a study in Political Communication, from 2016, by Emily Thorson,

(33:37):
called "Belief Echoes: The Persistent Effects of Corrected Misinformation." And this was a study where they did three experiments. Thorson writes that they showed that the effects of exposure to negative political information persist even after people are informed that the information was
not true. So this goes along with some of the
fake news stuff we were just talking about. And Thorson

(33:59):
called these beliefs that persist after being discredited, quote, "belief echoes." So she writes, quote: belief echoes occur even when the misinformation is corrected immediately, the gold standard of journalistic fact-checking. The existence of belief echoes raises ethical concerns about journalists' and fact-checking organizations' efforts to publicly correct false claims.

(34:22):
So dang. So even correcting a lie tends to increase
people's belief in the lie. What can you do then?
I know. I mean, this is on top of the reality that, in some cases, corrections are not going to resonate as much as the original lie or the original bit of unfactual information. Well, yeah,

(34:47):
very often a lie is interesting and the correction is not interesting. Yeah. Yeah, the correction's on page two, but the original, that's the headline on page one. Yeah. So there was an article in the Columbia Journalism Review by the Dartmouth political scientist Brendan Nyhan. It was called "Building a Better Correction." Now, this is not necessarily responding to
the exact same research we've been talking about, but it

(35:08):
addresses the fact that journalistic fact checking, corrections and so
forth can be insufficiently effective at correcting false beliefs, and
it does end up coming up with a few recommendations
based on Nyhan's research and other people's research in recent years.
Number one is, of course, identify sources that speak against

(35:29):
their ideological interests. So apparently people are more likely to
accept a correction on a false belief for a widely
repeated lie if that correction comes from somebody for whom it's against their political interests to discredit it. Does that make sense? So, in the political sphere, if it
is a misconception that's widely held on the right, you

(35:50):
need to get somebody from the right to discredit it.
If it's widely held on the left, you need to
get somebody from the left to discredit it. Right. So, like, if the correction is "pandas are not the most awesome animal on the planet," it's going to carry more weight if Panda Weekly runs that correction as opposed to, you know, Grizzly Bears Monthly. Exactly correct. So the second

(36:12):
point coming from the research is: don't just assert that a false claim is false; give an alternative causal account. So you give a different explanation. To read a quote from the article, quote: in the fictitious scenario used in one study, for example, respondents who were told of the presence of
volatile materials at the scene of a suspicious fire continued

(36:35):
to blame the materials even after being told the initial
report was mistaken. So you tell them there's volatile materials there,
there was a fire. What caused the fire? Oh, those volatile materials weren't actually there. People still say, oh, it was
caused by the volatile materials. So the only way to
persuade people against that seemed to be to give them

(36:56):
another explanation of what caused the fire. So you don't say, no,
those materials weren't actually there. You say they weren't there
and the fire was caused by arson. If that's true. Obviously,
like you wouldn't want to make up fake alternative accounts,
but like, the way you correct a misperception with the truth is you give them the alternative causal account

(37:16):
that is true. And then finally, this is a big one,
don't state the correction as the negation of the lie. Instead, state the true fact that stands in contradiction of the lie. Yeah, if you're having to say "I am not a crook," you're kind of saying "I am a crook." Instead you say, "I am a good person." Yeah, yeah, if that's true. I mean, good people don't usually

(37:36):
say I'm a good person. Yeah. So, but an example
would be from the thing we used at the beginning
of the last episode about this widespread belief that crime
has gone up in the United States since 2008.
That's not true at all. Crime has gone down. So
you shouldn't say it's not true that crime has gone
up because a lot of times people are just gonna

(37:56):
remember "crime has gone up." Instead, and we've been violating this all this time here, what you should say is: crime has gone down since 2008. State the true fact, don't negate the lie. Okay,
and we have something we can chant to make this
really take hold in everybody's mind. I don't know. I
don't want to make you uncomfortable. You want to chant? Let's chant. Okay. So here's the way I'd put it.

(38:19):
You won't kill a lie by repeating it. Instead, say what's true. You won't kill a lie by repeating it. Instead, say what's true. You won't kill a lie by repeating it. Instead, say what's true. Death to Videodrome! No. You won't kill a lie by repeating it. Instead, say what's true. I feel like if we could have made it rhyme, it would have helped. Oh, maybe it's too late. It does

(38:41):
feel kind of creepy to chant, and that gets into a thing that I did want to talk about at the end here. It's frustrating, because I wonder if there is sometimes a sort of perverse system that widely spreads bad beliefs,
essentially because people who are willing to lie and spread
malicious misinformation are also more willing to blatantly use proven

(39:02):
manipulation techniques like repetition and chanting and illusory truth, while I feel like, more often, people who want to
spread the truth and want to spread true messages are
more hesitant to use blatantly manipulative types of rhetoric and communication.
I mean, I don't want to say, like, I'm so good, but, like, I don't want to give people misinformation, and I'm also trying to help them with that stuff. I

(39:24):
was just saying, like, I felt very uncomfortable chanting a phrase over and over again, even though I knew it would be effective, right. I mean, generally speaking, individuals who are very serious about journalism are going to want to adhere to the standards of their industry and maybe not, you know, fall back on tribal chants

(39:45):
about something, because those chants feel so obviously manipulative, and they feel that way because they work. I mean, this is kind of a whole other area of discussion, a whole book even. But you know, I can't help but think in terms of the clickbait and
the ease of publication and distribution. I mean, naturally, this
isn't something that's going to apply to individuals who, via

(40:06):
celebrity and/or political power, already reach a wide audience. But you know, any wild conspiracy theory or accusation can penetrate a lot deeper, seemingly, these days than in pre-internet days. We talked earlier about some of the celebrity urban myths from decades past, and about how, to really get going, you had to have

(40:26):
just the right celebrity urban legend, and it had to spread by word of mouth, or maybe by, you know, a concentrated effort to send faxes across Hollywood, potentially. I don't even know if that's true in the Richard Gere case. But it would be a repeated false story, exactly. Yeah, that's one of those situations where

(40:47):
I think, and correct me if I'm wrong out there, but I don't think anyone's ever really been able to get to the bottom of, like, where the urban legend even really emerged from. But nowadays, like, the barrier to publication is a lot lower, and we're currently in a time where we seem to be correcting and figuring out, well, how do we manage this

(41:10):
plethora of publications of varying, you know, ethical solidity. Yeah, but that's just one part
of the issue, obviously. Well, it's a really difficult time. Yeah, our media landscape is difficult. I don't know what to do, like, what the best way to

(41:30):
address the wide spread of misinformation through social media and
the internet is. I mean, you can't, like, you know,
you don't want to become a censor and lock it
down and say, well, I will decide what's true and false.
I'll shut you down. You'd want there to be an
organic way where people would, I don't know, have
the tools to tell between truth and falsehood themselves. Yeah,

(41:51):
you know. And then one of the issues too for
us is that we sometimes discuss theories and hypotheses that are not true or are disproven over time. This is exactly something I wanted to talk about at the end of the episode today. It's a very frustrating takeaway from this conversation we've had, that there could

(42:11):
be negative effects from discussing what's wrong with bad ideas
and false claims, because that's something we love to do on this show. For example, we just did an episode about the ancient aliens hypothesis, something that, and I don't want to speak for both of us, but I think neither of us thinks there's any good evidence to believe it's true. I do not believe there is. So we put no stock whatsoever in this hypothesis.

(42:34):
It's the belief that ancient aliens came to the earth.
All of the evidence is either really bad over-interpretation or outright fraud, and yet it's fascinating to understand this widely held, unfounded belief, to understand where it came from, why people believe it, to talk about the real facts and the real knowledge that undermine the existing claims in this belief structure, and to think about what good evidence

(42:57):
there could be for past alien contact, if it did exist. Yeah, it's kind of like trying
to imagine how a dragon would work based on real
world biology. Yeah, you know, like you don't want to
advocate that dragons are real, but it is fun to
take it apart and say, well, if they were real,
this is how it would work, and your discussion of
that should be based on real biology, and so all

(43:17):
this stuff. This is all stuff that I really enjoy
and I think is very valuable. But it makes me
wonder if even by having that kind of discussion, some
people are more likely to, you know, months or years down the road, remember as true the claims that we examined in order to criticize and understand where they come
from in the episode. I don't know if there's any

(43:40):
way around that. Like, I don't think it's reasonable to
say we should live in a world where nobody ever
examines or talks about widely held untrue beliefs. That just doesn't seem reasonable. I think we learn almost as much about the world and about ourselves from critically studying the false beliefs we hold as we do from, say,
reading a list of objectively true statements about the world.

(44:02):
It's not like studying false beliefs is uninformative. It's very informative. Yeah.
And in some cases, it's about not repeating history, right, not being doomed to repeat history. When we've talked about eugenics, for instance, on the show, you know, there are some horrible ideas wrapped up in eugenics, but it is worth remembering. It's

(44:24):
worth knowing how we got there. Yeah. We had that discussion with Carl Zimmer a while back that touched on that, and that's an important part of the history of the study of inheritance. If you just ignore it and say we will never talk about that anymore, you do a disservice to, you know, the memory of all the evil that was done in its name. And yeah, like you're saying, you open yourself to not being aware of the really bad paths

(44:46):
people can go down. Yeah. Now, of course, obviously ancient aliens is less high-stakes than that, but still, I think some of the same principles apply. And then again, at the same time, I don't want to deny this research. I acknowledge it seems very true that bringing up a statement,
even to discredit the statement or even to criticize the statement,

(45:07):
can have the negative side effect of many people increasing
their belief in that statement later on, just because it
sticks somewhere in the back of their mind. They don't
remember the original context in which it came up, which
was a context of criticism or a context of debunking, and so people just kind of think, oh, maybe there
is something to that. I've heard that somewhere before. It

(45:29):
feels kind of familiar. Yeah, well, I guess one argument one could make, then, would be: hey, if you're going to cover ancient aliens, then you also have to make sure that you cover, in an ancient-aliens-free way, how life actually emerged on Earth. Which, we've certainly discussed evolution on the show before.

(45:49):
So I think we're mostly there. Well, I'm not worried that we have a deficiency of saying true things, but I wonder what we can do about the fact that these types of discussions of bad ideas, which
are really important and interesting to have, can also have
these negative side effects. I don't think I know quite
what the answer is yet. Obviously it will depend a

(46:13):
lot on the context of the idea. Oh yes, certainly,
and then this would actually be a great topic to hear back from listeners on. Really, yeah, help me out of this dilemma. I feel stuck. I don't think I can live in a world where false beliefs and bad ideas can never be spoken of. That would sort of rob intellectual life of so much
of its richness. You would prevent us from gaining all

(46:35):
these insights about our culture and our minds. At the
same time, I don't want to spread bad beliefs. I
don't know what to do about that. Well, it remains an open question for now, then. And in the meantime, if you want to check out other episodes of Stuff to Blow Your Mind, head on over to StuffToBlowYourMind.com. That's the mothership. That's where
you will find them as well as links out to
our various social media accounts. And if you want to

(46:56):
help the show, if you want to support the show, rate and review us wherever you have the ability to do so. Huge thanks, as always, to our wonderful audio producers Alex Williams and Tari Harrison. If you would like to get in touch with us directly, to get me out of my dilemma from this episode, or to suggest a topic for a future episode, to give feedback on this episode or any other, or just to say hi,

(47:17):
let us know where you listen from. You can email us at blowthemind@howstuffworks.com. For more on this and thousands of other topics, visit howstuffworks.com.
