May 15, 2022 19 mins

Misinformation is being weaponised in the media and politics, and many fall down the conspiracy theory spiral. In what ways do our brains predispose us to believe in misinformation? How is our current information environment – especially social media – aiding the spread of ‘fake news’? And can you actually convince true believers to let go of conspiracy theories?

Hosted by journalist Lynne Malcolm for the Melbourne School of Psychological Sciences. Featuring Associate Professor Andrew Perfors and David Milner from The Shot. Our production team is: Carly Godden (producer), Amy Bugeja and Mairéad Murray (assistant producers), Arch Cuthbertson (sound engineer), and Chris Falk (music).


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Lynne Malcolm (00:02):
This podcast is made on the lands of the Wurundjeri people, the Woi Wurrung and Boon Wurrung. We'd like to acknowledge and pay respects to their Elders past, present and emerging.
From the Melbourne School of Psychological Sciences at the University of Melbourne.
This is PsychTalks. Hello, I'm Lynne Malcolm and welcome to

(00:27):
PsychTalks. This series looks at what studies in psychology
can reveal about some of today's big issues.
Each episode dives into one facet of modern life in Australia.
You'll hear from psychology experts and a lineup of fabulous
commentators as we unlock insights into what drives people and

(00:49):
makes society tick.
First up, we're entering the murky world of misinformation.

Andy Perfors (00:55):
I think the main thing to realise is that all
of us have the capacity to fall down a conspiracy
theory spiral. And especially all of us have the capacity
to be taken in by misinformation.

Lynne Malcolm (01:08):
False, misleading and inaccurate information seems to be everywhere.
In fact, last year, in 2021, the Macquarie Dictionary named 'fake news'
its word of the decade.
Today we're exploring the psychological aspects of why people buy
into misinformation and seeing if there's a more productive way

(01:30):
to navigate our encounters with those peddling mistruths.
We're starting off with Liam's story.

Liam (01:37):
His family and my family have been friends forever.
So I knew him growing up. They were all part
of the church and it wasn't until probably, I was
about 13 or 14, I think when my family left
that church, his family stayed at that church

(01:59):
And that's sort of where our families kind of split. My
folks would see them in town every now and then.
And it wasn't like, there were hard feelings or anything,
it was just, you sort of moved in different circles kind
of thing.

Lynne Malcolm (02:10):
This is Liam Brewer who grew up in the regional
city of Bendigo in Victoria.

Liam (02:16):
He stayed in the church, I left when I was
probably around 16 or 17, I stopped going to church altogether. Yeah,
so he was full on church guy, married, kids the
whole thing.
Whereas I went music and went to Melbourne and

Lynne Malcolm (02:34):
Had some fun?

Liam (02:35):
Yeah, let's get into rock and roll and heavy metal
and have a good time.
So yeah, totally different, totally different paths, but you know,
he was always a nice bloke and easy to talk
to and just a pretty genuine, nice Christian bloke.

Lynne Malcolm (02:53):
And when did you notice something was up with your
former friend?

Liam (02:56):
Because I hadn't been in contact with him heaps.
It wasn't really until Facebook came along that I started
to get back in touch with some of those people
from years before, and he just started getting more
right-wing leaning, which politically he was always sort of that way.
But then I think it was around when Trump started

(03:21):
running for president and it sort of snowballed from there.
He wasn't really a Q guy, but he definitely was
just anti-mainstream media and was picking up all this
rubbish from these quite easily debunkable other sources.
It's not like full-blown tinfoil hat, which is

(03:41):
like for me a little bit almost worse because it's
a bit more subtle. It's easier for other people to
buy into when it's not that like, hey, there are
lizard people in the royal family end of things.

Lynne Malcolm (03:56):
If you're on social media, chances are you have a
friend like Liam's among your wider network. Over the past
few decades, misinformation has increasingly infiltrated our screens, newspapers and
chat messages.

Andy Perfors (04:11):
A lot of what we're interested in studying is how
people pass on information that they believe is true, but
that's actually false in some way.

Lynne Malcolm (04:19):
This is Dr Andrew Perfors, Associate Professor of Psychology at
the Melbourne School of Psychological Sciences.

Andy Perfors (04:27):
I mean, misinformation is as old as time, but the
news environment has changed so much. We've got social media
and people have so much more power to send information
around themselves. You don't need to be someone with a
big microphone and a lot of wealth.

Lynne Malcolm (04:42):
Of course, established media outlets can also be guilty of
touting misinformation,

Dave Milner (04:48):
Especially from a mainstream media point of view. It's at
the point where the information is secondary to the message
that wants to be pushed behind it.

Lynne Malcolm (04:56):
David Milner is an award winning journalist and columnist with
The Shot.

Dave Milner (05:02):
You can twist things very easily by excluding context. That's
the major way that it's happened.

Andy Perfors (05:09):
This removing context has a big pernicious effect because it
means you can never talk about any of the complicated
nuance that really matters. You spend all of your time
trying to fill in that missing context. Even if people
end up not believing it, they've changed the discussion in
a really bad way.

Dave Milner (05:26):
Just think about what happened in Victoria. They really were
things that the Andrews government should have been criticised for.
But you have to spend so much time convincing people
that actually he's not a dictator and actually some of
this stuff is really sensible.

Lynne Malcolm (05:40):
"11:59 p.m. Wednesday Victorians will have to wear a face
mask like the lockdowns. No evidence, not a single slip
of paper to justify this make it up as you
go approach."

Dave Milner (05:50):
Organisations, let's say News Corp, as a random example I've thought of
from out of nowhere, will do things like encourage anti-scientific
ideas like vaccine hesitancy, things like: these masks don't work,
they didn't do anything to stop the spread of coronavirus.
And in this environment, it just spreads like wildfire. It

(06:14):
plays on people's fears, people's anxieties, people's preconceived ideologies,

Lynne Malcolm (06:22):
"I hate the word scared but people are scared."

Andy Perfors (06:27):
All of us have the capacity to fall down a
conspiracy theory spiral. And especially, all of us have
the capacity to be taken in by misinformation. In fact,
probably all of us do believe some misinformation right now.
Really, misinformation preys on certain cognitive biases that serve us
pretty well in the real world. Like for instance, we

(06:50):
have this thing where basically the more fluently we understand something,
the more we believe it, which makes sense.
But it means that you can sort of sell simple
ideas much more easily than more complex ideas.
And, you know, unfortunately the truth is very often
complex and so it's easier to get people to believe

(07:10):
simple misinformation than a complex truth.

Lynne Malcolm (07:14):
Dave thinks that conspiracy theories similarly offer a vision of
the world based on order and control. It's an idea
he picked up when researching the 9/11 Twin Towers
conspiracy.

Dave Milner (07:27):
Even though it's a horrible conspiracy theory that the US
President brought down these towers for his own nefarious means,
people were finding comfort in that because it removed cruel
randomness from the world.
It was by design and cruel randomness is the scariest
thing any human has to face. And I think all
conspiracy theories do basically provide that level of comfort that,

(07:49):
you know what, it's not going to your plan but
it is going to someone's plan.

Lynne Malcolm (07:53):
Going back to Liam's online encounters with his childhood friend,
Andrew believes that his experience is very typical.

Liam (08:03):
He wasn't just reading one thing and then running with it.
It's like it would be parroted by a bunch of
different people, but there wouldn't be any real
corroborating evidence or anything to really back it up. It
was just that a bunch of different people were saying the
same thing.

Andy Perfors (08:17):
We have this cognitive bias where basically the more often
we hear something, the more we believe it, even if
we know it's false so you can just repeat something
very often and you'll believe it more.
And again, that kind of makes sense when you're not
in an environment where you can artificially increase the frequency
of things really easily. But you know, misinformation propagates just

(08:39):
by saying it a lot basically. And all of us,
all of us are susceptible to this because it's an
unconscious kind of thing.

Lynne Malcolm (08:46):
According to Andrew, we're also inclined to not accept new
information when it doesn't fit within our established worldview.

Andy Perfors (08:55):
We're still trying to figure out the reason for this
because it sounds really stupid. Why would people do that?
But you know, human brains are stupid in many ways.
You know, the reason is usually that if you believe
something, you believe it as part of a whole web
of theories about the world. And so even if you
hear this bit is wrong, if you don't have something
to replace it with or something that also stitches together

(09:16):
all your other theories about the world.
Then your brain still kind of wants to fill something
in there. And so to correct something is essentially a process,
not of just fixing that information, but also fixing all
the links between all the rest of our theories about
the world.

Liam (09:33):
Whenever he'd put up an article, I'd try and put
up something against it that had reliable sources and yeah,
he'd just dismiss it as being rubbish because it's mainstream media.
How do you expect to debate anybody or have a
reasonable conversation with someone?
If you're just going to dismiss anything that doesn't fit

(09:54):
what you want it to be? He'd sort of twist
words around and sort of project that back on other people.
It's like, oh, you're just trying to manipulate the narrative,
but that's what you're doing.
That's exactly what you were doing. And he would never
admit to it, and he'd just drop that thread and
start a new one.

Lynne Malcolm (10:12):
We've heard how we're all prone to believe in and
cling to misinformation because of certain cognitive biases built into
the human psyche.
Andrew thinks that those swayed by conspiracy theories tend to
be first drawn in at a difficult point in their life.

Andy Perfors (10:32):
People who fall for conspiracy theories, they've got something extra
going on, which is generally a big emotional need to
believe them and that is what causes a whole host
of behaviours where they seek out a lot of this
confirming information.

Liam (10:46):
He just seemed really angry at everything.
And that sort of came out. The way that he
was using dialogue was really aggressive and the way that
if anybody spoke against what he was saying, he would
go full attack mode on them.

Andy Perfors (11:02):
The first emotions are generally feeling alienated, angry, wanting
someone to blame. And again, this is something that all
of us are susceptible to at some times in our lives.
And if you talk to people who have become trapped
in conspiracy theories or cults or any of this, it's
it's almost always someone who's at a really fragile time

(11:22):
in their life.
They've had a breakup, they've lost their job, they've lost
their social and emotional support.
And so these theories give them a sense of specialness:
they understand the truth.

Liam (11:36):
I hadn't really noticed anything with him when I first
sort of got in touch with him, and then at
some point him and his wife split.
And I think that's sort of where there started to
be a shift with him. If you look at the
pictures and the other stuff that he posts like one
of his kids is doing amazingly in soccer and he's
a pretty good coach
and so there's all these really nice happy family photos

(11:59):
and everything is lovey dovey and life is great.
And then just this vitriol coming from him on Facebook
at people. It was like, yeah, something's up in there.

Lynne Malcolm (12:11):
Curbing the spread of misinformation is no small challenge. As
the technology behind digital platforms has become more sophisticated,
so too have the means of disseminating mistruths.

Dave Milner (12:25):
My background was I did tech and video games journalism
for a long time and Youtube is a big centre
for that sort of discussion.

Lynne Malcolm (12:33):
We're back with journalist Dave Milner.

Dave Milner:
It really only takes a few random recommendations via Youtube's
algorithm to go from a video about Minecraft to extreme,
outright white supremacist content.
It's three or four clicks and the kid is unwittingly
down that rabbit hole.

Andy Perfors (12:52):
You also look at the algorithms for social media sharing
like Facebook, Twitter and stuff. And what they do is
they reward people with extreme takes. Those are the ones
that get lots of engagement, and they reward engagement in
a certain way. So people with mild takes, a 'well, let's listen',
or a really long, complicated thing that doesn't fit on Twitter,
those things aren't talked about. They're not emphasised as much.

(13:15):
So just the way that the algorithms work actually distorts
the information itself and you have to take a stance
on some level saying this is not good content.
And almost no one wants to do that because it
feels like that's against free speech.
But of course having algorithms that preference some content over
another is also against free speech. But it feels

(13:37):
like it's less because it's not us making the decision.
It still is, though, and I think that's something that
we're really only starting to wrestle with as a society.
I think we just have to admit that we have
to make some decisions, because if not, decisions are happening
to us anyway.

Lynne Malcolm (13:52):
If big tech can't be relied on to properly police
the spread of misinformation, how do we best protect ourselves
and others from being duped by it?

Dave Milner (14:03):
I think what's gone wrong recently is a
lack of nuance. It's this idea that all of this
is fake news or none of it is, when actually
people who have been pushing for critical reading of the
media have just been pushing for taking every single source
on its own merits and
working out what's nonsense and what's editorial and what's opinion

(14:23):
and being able to sift through that, and this just
has to come through
education. It has to start quite young as well. And we
need to be talking about this in, if not primary schools,
definitely high school.

Andy Perfors (14:35):
If you look at the sort of studies that have
been done on this, there's so much research trying to
figure out how to make people more resistant to misinformation,
much of it unsuccessful.
And the things that I've seen get the
most success are what we call inoculation, which is basically
this kind of media literacy training.
So not teaching people about specific things or specific misinformation,

(14:59):
but teaching them about the way that someone who is
trying to fool you tries to fool you, and what
to be alert for, so that you can kind
of work backwards and say, hey, this
is a really click-baity title.
It's really written in a way to take advantage of
my emotions. It might be trying to, you know,
get me to believe something that's not true.

Liam (15:25):
It felt like any time I tried to steer him
in a different direction, especially using evidence, he'd just get
his back up and get defensive and dismissive, so it became
more about me putting different information on his threads
that other people who were reading could see.

Lynne Malcolm (15:42):
And finally, the million-dollar question: if you see that
someone's been conned by false information, should you try to
point this out to them, as in Liam's case?

Liam (15:53):
Like, I dunno, I tried a few things, but it
just seemed like nothing was really going to push him
in any direction.
He was already pretty well set.

Lynne Malcolm (16:01):
And how did things between you two end up?

Liam (16:04):
It was really difficult to try and get through to him.
So I just sort of got to a point where
I had to stop because it was doing my head in.
I was just spending way too much time
being angry and it wasn't going to get me anywhere.
So what am I doing? This is pointless, which is
a shame.

Andy Perfors (16:23):
I think there's a different approach. If it's just misinformation,
if someone believes the wrong thing
and they don't seem very wedded to it,
then I think it's fine to say, is that
really true? You know, like I've had friends do that
to me and I've always really appreciated it.
It gets a lot harder if
this piece of misinformation isn't an isolated thing, if it's part

(16:45):
of a web of how they view the world.
If they've put a lot of their identity into it,
then by you correcting it, basically all you're
doing is ensuring that they will not listen to you
in the future, and they will distrust you more.

Dave Milner (16:59):
It's just such a difficult situation, and I know in
my personal life, when I've come up against people who
have started going down these rabbit holes,
I've definitely made mistakes with the slightly belligerent
appeal to logic, the 'come on, please'. It just always
repels them.
It never draws them back to the realm of sane

(17:19):
reality that you're trying to achieve.

Andy Perfors (17:22):
Trying to get at the emotional reason often means
not talking about the thing, but sort of addressing that they're
really lonely, they're really angry, and helping them find some
other ways to meet this emotional need other than whatever
the conspiracy theory is giving them.
And the other thing is I think just asking them
about it, but not in a sort of patronising or

(17:42):
sort of, I'm going to prove you wrong way, but
just like,
Oh, you said this, so would that mean that? Like, just
like I want to find out about it, because there's
this thing called the illusion of explanatory depth. We
think we understand these things, but once we actually try
to explain it, like, there's not much there.
And you can do this here too; misinformation and
conspiracy theories are the same way, right? So they haven't

(18:05):
actually tried to integrate all these bits in their head.
And so if you can say, so you said this,
does that mean that, or whatever, and just make them
walk through it themselves,
then sometimes, if it's an inconsistency you notice yourself,
it sticks with you a lot more.

Lynne Malcolm (18:20):
But attempting to change the mind of someone who is
heavily attached to misinformation or conspiracy theories is typically a
very long and uphill battle.

Dave Milner (18:32):
This is a difficult thing to decide to engage with.
It could potentially go on for months or years,
without any sense of achievement.

Andy Perfors (18:40):
I would also say you have almost no hope if
you're not close to them, and it's probably not worth
it emotionally to you if you're not close to them.
You know, arguing on Facebook is fine, and maybe you
want to do it for the other people who might
be reading it, but you're not going to persuade your Facebook
friend of anything.
It might be worth it if it's your parent, though,
and you live with them and you're willing to do this.

(19:02):
But it's a huge emotional cost.

Lynne Malcolm (19:05):
You've been listening to PsychTalks with me, Lynne Malcolm. I'd
like to thank our guests for today, Associate Professor Andrew
Perfors and David Milner,
and Liam Brewer for sharing his story with us.
This episode was made possible by the Melbourne School of
Psychological Sciences at the University of Melbourne. It was written

(19:26):
and produced by Carly Godden with Amy Bugeja and Mairéad Murray
providing editorial and production assistance.
Arch Cuthbertson was our sound engineer, and the show's music
was composed by Chris Falk.
For more episodes of PsychTalks find us wherever you get
your podcasts. Bye for now.