
October 7, 2025 · 28 mins

Misinformation and disinformation now pose some of the biggest global risks to democracy and public trust.

Associate Professor Piers Howe explains how misinformation and disinformation spread, and the subtle but influential ways they can shift public opinion, disrupt elections and damage society. With the rise of AI and increasingly sophisticated campaigns, he argues it's more important than ever to stay informed, think critically and understand the science behind persuasion.



Episode Transcript

Intro (00:00):
This podcast was made on the lands of the Wurundjeri
Woi Wurrung and the Bunurong peoples. We'd like to pay
respects to their Elders past and present, and emerging.
From the Melbourne School of Psychological Sciences at the University
of Melbourne, this is PsychTalks.

Nick (00:22):
Welcome to PsychTalks, the podcast where we dive into the
fascinating research happening here at the University of Melbourne's School
of Psychological Sciences. I'm Professor Nick Haslam, and I'm joined
by my amazing co-host, Associate Professor Cassie Hayward.

Cassie (00:34):
Hi, Nick. Great to be here. Today, we're tackling an
incredibly important issue, the rise of misinformation and disinformation. Now,
this podcast has previously looked at why we can fall
victim to conspiracy theories and misinformation way back in season one.
But we wanted to zoom in on the developments in
the research since then. We've seen how the spread and

(00:54):
influence of misinformation and disinformation have only increased, posing real
threats to democratic processes, public health, and social trust.

Nick (01:02):
And joining us to unpack it all is Associate Professor
Piers Howe, who leads research at the Information and Influence Lab.

Cassie (01:13):
Piers, welcome to PsychTalks. Um, could you start by
telling us a bit about your background, your current research,
and I guess what drew you to study misinformation and disinformation?

Piers (01:24):
Yes, indeed, and thank you for having me on PsychTalks.
So by training, I am a cognitive scientist, so I
am very interested in studying, um, information processing and how
people make decisions. So naturally I got drawn to this topic.
But obviously, there's also what's been going on in the news and
um what has been going on elsewhere. For example, last

(01:46):
year the World Economic Forum listed mis- and disinformation as the
number one short-term global threat, and that kind of gets
your attention. So there's been lots of reasons why I've
been focusing on this more and more.
In my lab, the um Information and Influence Lab, we
focus on creating interventions both to reduce the spread of

(02:08):
false information primarily on social media, but also to try
and mitigate the harmful effects of mis- and disinformation.
So perhaps to reduce uh support for undemocratic um practices
or um even partisan violence.

Nick (02:24):
So Piers, you're using terms like misinformation and disinformation and
you also hear about uh fake news.
Uh, terms like that thrown around all the time. Can
you break down what they mean and how they differ?

Piers (02:35):
Yes, sure. So, um, misinformation is false information that is
communicated without the intent to deceive, though perhaps by people
who haven't bothered to check whether it's true or not. Conversely,
disinformation is communicated with the specific intent to deceive and
the person communicating it knows it's false or at least misleading.

(02:58):
Um, fake news is not, not a particularly scientific term, um,
popularised by a certain person and it generally means something
which is put in a news format but hasn't undergone
the editorial standards of a mainstream newspaper. So the attempt
is to try and give something which hasn't gone through
the editorial process the appearance that it has, to give

(03:20):
it credibility when it shouldn't receive as much credibility.

Nick (03:22):
Uh, I thought it was just news you didn't agree with.

Piers (03:26):
[Laughs] There has been an alternative definition put forward by
a certain president, but, um, no, that's not its definition.

Cassie (03:34):
With all of those, disinformation, misinformation, and fake news,
one impact that's been talked about is the impact they
can have on elections. While we're talking about certain politicians, um,
is there actual evidence that election outcomes have been manipulated
or influenced, or is it kind of a more subtle

(03:54):
impact on population beliefs?

Piers (03:56):
So there's increasing evidence that coordinated disinformation campaigns can change
voting behaviour by a few percentage points. So there is
theoretically the possibility that they could swing a close election.
Proving that, of course, would be hard because it would
only occur for a close election.

(04:17):
However, the bigger threat is perhaps the um decrease in
trust regarding election results. And there was a particularly pertinent
example of this in the 2024 Romanian presidential election where
the, um, first round was nullified, in part
because of fears of the influence of

(04:39):
a Russian disinformation campaign. So there's an example of, um,
false information, disinformation having a very real and tangible effect.

Cassie (04:47):
Just on, on close elections, I mean, we've just had
our Australian election and a lot of the seats were
very close. Coupled with the ability to specifically target certain
electorates with social media messages, it can't be too far
off the radar that it's possible to have those, um,
impacts in those close seats through targeted messaging from either

(05:10):
real or, or fake news outlets or information outlets.

Piers (05:15):
I would have to agree. So there was a lot
of um misinformation and disinformation going around. I should say
from both sides. Um, maybe it roughly balanced out, maybe
it didn't quite balance out. It would be surprising if
at least some of those results weren't swung by mis-
and disinformation, especially due to coordinated campaigns, but we can't

(05:36):
prove that, so we have to, as scientists, be a
little bit cautious about claiming that.

Nick (05:40):
So it could potentially swing closely contested elections, um, and
that's why we care about it because it might have
all sorts of influence for who's in power. But is
there any evidence it's actually caused other harms outside the
political system?

Piers (05:53):
So I would regard mis- and disinformation as a "meta-risk".
The principal problem is that it reduces our ability to
respond to other risks.
Um, the classic example is our inability, or slow response,
to healthcare risks. So for example, in the 1950s, um,
in the USA but also in the UK there was

(06:14):
growing evidence linking cigarette smoking to cancer. And in response
to this, the top six American tobacco companies, um, um,
came together and they funded the PR firm Hill &
Knowlton
to create a decades-long disinformation campaign, which was perhaps one
of the most successful um disinformation campaigns in history. Um, historically,

(06:39):
looking back, it seems to have delayed effective legislation by
at least 10 years, and the legislation that did come
in was very much watered down. Um, it really boosted
their profits and, depending on whose modelling you look at,
this campaign alone, and it wasn't just a
single event, it started off on the
4th of January 1954 with this um with this advert,

(07:04):
'A Frank Statement to Cigarette Smokers', which was actually viewed
by 43 million Americans. So it, it was a very
well funded, very big campaign.
But um it probably cost the lives of somewhere between
3 and 7 million Americans just by reducing the legislation
which would have hampered cigarette sales. So that's one incredible example.

(07:25):
The reason why I picked on that example is it
literally became a playbook that other commercial companies used because
it had the, forgive me for saying this, the genius
insight
that you don't have to persuade someone that something's not true.
You just have to cast enough doubt on whether it is true,

(07:45):
and then people will delay their actions, especially if they
don't want to act. So for example, people are smoking,
they're addicted, they don't want to give up smoking, and
so all you have to do is give them
enough doubt and then they won't give up smoking. Legislators
won't impose laws because they're getting money from taxes from this.
So they don't want to do that.
So this literally created a playbook on how to most

(08:08):
effectively create doubt and I'm talking about things like, fund
scientific research that just throws doubt on the issue, um,
hire fake experts to testify that the evidence isn't yet conclusive.
Things like that, very effective, and other companies, um, the
petrochemical industry, for example, various polluting industries, even the pharmaceutical industry

(08:29):
trying to deny that certain medications were addictive, literally
used the playbook going forward.
Um, a more modern example would be, um, COVID-19 in
the US. So between May of 2021 and September 2022,
over 230,000 Americans died unnecessarily due to COVID. So these

(08:52):
were people who were offered the vaccine,
refused the vaccine, went on to die, and this takes
into account that not all of them would have lived
had they got the vaccine. Now, misinformation wasn't the only
reason why they didn't take the vaccine. A lot of
these people didn't want to take vaccines anyway, but it
was an enabling factor. It cast enough doubt that it

(09:14):
helped them decide not to take the vaccine and ultimately die. So,
we often think of misinformation's primary aim as trying to
convince someone of something false. That's actually not how it's
generally used. It's generally used to create doubt which either
delays actions or allows people to act in a way
that they want to act anyway. That's a second function

(09:37):
of mis- and disinformation. The third one, used by foreign
disinformation campaigns, is purposely designed
to increase social divisions and weaken a country, and again,
to reduce the ability of that country to take action
against a foreign threat.

Cassie (09:53):
Just on your playbook there, you can see that happening
in the US at the moment with funding of studies
to look at the link between autism and vaccines, and
just the fact that they're funding those studies throws doubt
on what is pretty much established as no link.
And then you see the, the vaping companies doing pretty
much the same playbook as the tobacco companies, often the

(10:13):
same companies, doing the same thing with vaping. So,
you must see a lot of these examples just every
day at the moment.

Piers (10:21):
Well, yes, that's exactly right. And of course, what else
you're seeing in the US is that not only is
this, um,
dubious research being funded, but
the proper research is being defunded and you're getting a
lot of statements being um made to support um unproven

(10:41):
scientific techniques and you're getting a lot of true scientific
information removed from US government websites, again, to create confusion.
It's primarily about creating confusion.

Nick (10:54):
Which presumably makes it harder to fight because it's not
just telling lies, right? It's, uh, increasing doubt and there's
always some doubt about pretty much any proposition, uh, I guess, uh,
or removing truth rather than spreading lies. So that presumably
makes it a lot harder to fight?

Piers (11:10):
Yes, so that was why it was the genius insight
of John W. Hill back in the 1950s, that it's
a much easier battle to fight. It's
impossible to prove that vaccines cause autism, because they don't.
But it's much easier to pour doubt on the fact
that they've been proved not to cause autism and that's,

(11:31):
that's what's going on.

Nick (11:33):
You've mentioned in your work that misinformation can shift societal
narratives through things like the illusory truth effect or normalising fringe ideas.
Can you explain how this works and why it's so dangerous?

Piers (11:44):
So, the illusory truth effect is the psychological phenomenon that
the more you hear something, even if it's not true
and even if it's not supported by evidence, the more
likely you are to believe it, or at least to
begin to believe it's plausible even if you can't believe
it entirely. So that's coming back to this doubt theme.
You begin to doubt that it's impossible.

(12:04):
And the reason why this is an issue is
this concept of the Overton window, which is the range
of um issues that the public think are plausible and
so are willing to discuss. And using the illusory truth effect,
if you
just keep on repeating extreme views, you can alter that
Overton window to include um more and more extreme content

(12:28):
and thus shift the narrative of the discussions. And
by shifting the narrative, by shifting what people are willing
to discuss and entertain, um, you can often shift their
decisions because one of the heuristics that people use is
they tend to try and
stick on the middle decision. So they hear a range

(12:49):
of ideas and they try and pick the central one
because as a heuristic, that's generally quite a um a
good um thing to do.
So a heuristic is um a mental shortcut.
It's a way of making an educated guess. In this case,
an educated guess of whether something is true or not,

(13:10):
but it is not a reliable method. It's a quick
and dirty method which usually gives you the right answer,
but not always.
People who are pushing disinformation campaigns know that um we as
humans use heuristics and so if they can shift one end,
they can shift what reasonable people trying to be reasonable
but being time poor do. And that's one of the

(13:31):
reasons why um humans fall for disinformation. We're all busy,
we all have lives.
We can't actually research every topic we hear about and
so we need to fall back on heuristics simply to
get through our lives.
When I first started in this field, there were media
literacy studies which were actually trying to teach people how

(13:53):
to spot false information and they were saying things like,
you can spot a false advert by looking for misspellings,
looking for unprofessional formatting. Those are heuristics, and nowadays a
lot of the false information is very, very professionally done.
Um, so
humans aren't mean, they aren't out to, um, deceive

(14:13):
other people, mostly. Some are, but most aren't. But they're
time-poor and forced to use heuristics, which can be exploited.

Cassie (14:19):
Just on the heuristics thing, and I don't want to
get too much into the anti-vax weeds because we'll get
hate mail. But, um, when we're talking about how anti-vax
campaigners talk about the issues, they will be very direct
and assertive about their claims, and they will say, vaccines
cause autism.
And then you look on the other side, and scientists

(14:39):
aren't trained to talk like that. They talk in words like,
there's no evidence that vaccines cause autism. So it's a very,
from a listener's point of view, one seems very sure
and one seems very kind of weak, and you can
kind of see how people get led down certain paths
from those heuristics just in terms of how people talk

(15:00):
about things.
Would you suggest that scientists start to talk more certainly
about their findings? Or just in terms of the, the
damage that it does when we have the other side
of a lot of these issues talking very, kind of
with a lot of assurance in their claims.

Piers (15:20):
So what you say is very true. It's been very
well studied in the literature that just giving facts
and figures, even when you can't justify them, makes the
message sound more plausible.
But as scientists, we can't do that. We have to
stick with the truth. We can't become um what would

(15:40):
amount to being um partisan, wouldn't it? Because then we'd
be trying to push a political message because once we
go beyond what the science is, then we're getting political.
And that's probably going to backfire. So no, I wouldn't
recommend scientists do that.
But what I would recommend scientists do, and this is actually
one of the things we're doing in our lab,

(16:00):
is explaining what the truth looks like and trying to
get people not to use heuristics and not to assume
that something is credible every time they see a number. On my
commute home, I
go past a barbershop which um says that um,
character is, um, 84% innate and 16%

(16:26):
the haircut, and they fix the 16%. Now that's a
joke and we can all see it's a joke, but, um,
we need to train people, and this is one of
the things my lab is trying
to do, so they don't fall for these simple tricks
that a number means you're credible.

Cassie (16:42):
What are some other things for our listeners who want
to avoid being pulled by disinformation campaigns? What practical steps
can they take or what things can they do to
avoid falling down these rabbit holes?

Piers (16:53):
So, the best advice is perhaps the most boring, which
is watch what information sources you use. So the majority
of Australians now um get the majority of the um
news from social media. It's just popping up in their feed.
Social media is a very unreliable source, so consciously decide not

(17:18):
to get your information from an unreliable source, because we
all suffer from the illusory truth effect.
And another thing we all suffer from is believing we
don't suffer from the illusory truth effect. If you
just see it more and more often, you will shift
what you believe is plausible. That's just how humans are.
So use reliable news sources, go out of your way

(17:39):
to subscribe to these reliable mainstream news sources, ABC, BBC,
New York Times, The Age, The Australian. Distinguish between when
these reliable sources are reporting factually and when they're giving opinions.
Um, a lot of people don't make that distinction. In fact,
a surprisingly large number of people cannot distinguish between

(18:01):
a factual newspaper report and an advert put in
that newspaper with the word 'advert' at the top,
but put in the format of a newspaper article. So
be clear which bit of the source you're reading and
whether that bit is actually reliable.
But also, even the reliable news sources are now beginning
to use clickbaity headlines. What pops up on social media

(18:21):
is typically just the headline. Most people share that without
even clicking on the link.
Um, so they don't even know what they're sharing because
the actual article, sometimes these, these um links go to
reputable articles, but the actual article, even if it's reputable, might
not quite encapsulate that headline. That headline is short. Um,

(18:42):
and in some cases, some social media companies actually change
the headline as well, which is, uh, which, which can
be a problem. Um.
Don't just put things into search engines because the AI
will spit out an amalgamated result which may not correspond
to reality in any way.
Um, reliable sources also include things like government websites, but also, um, Wikipedia.

(19:06):
There's a bit of a backlash at the moment in
our schools against Wikipedia. Wikipedia can be a very, very
valuable source, but it sometimes misses the mark. And at
the top of the Wikipedia page, there's a talk tab
and if you go there, you can get a rating
of how reliable that particular Wikipedia page is.
And although the article itself may be unreliable, that rating

(19:29):
is usually very good indeed. And I would argue because
of the illusory truth effect, if that rating is low, don't
read it. Don't think, oh, OK, this isn't reliable,
but I'm gonna read it anyway and I'll just, you know,
be sceptical, because you will end up fooling yourself.

Cassie (19:45):
A lot of those tactics of critical thinking are tactics
that the misinformation side will often say that they're doing, right?
They'll say, don't fall for
the, you know, don't be a sheep, don't follow this,
this is the real truth. And they kind of believe
that they're critical thinkers and doing all those things. They're
just going to the real source rather than the fake news. So,

(20:07):
how do you get around people thinking that they are
doing all those things? It's just that they're being led
to the wrong sources.

Piers (20:16):
So that's the million dollar question.
And I'll just be straight, I don't have a good answer,
and the field doesn't have a good answer. Our answer
used to be go to third party fact checkers. Now
we have Breitbart doing fact checking and those fact checks
bear no resemblance at all to any of the other
fact checkers.

(20:37):
Um, but people are quoting these Breitbart fact checks. They're quoting, um, um,
Fox News. So you have an alternative information ecosystem and
we're not really sure how to deal with it. We
thought that you could go to the ultimate source of truth,
which would be science. Well, hasn't that been proved wrong?

(20:58):
I remember a junior member of faculty coming up to
me being terribly, terribly excited.
Because he knew I studied misinformation and he had
the solution. The solution was very simple for him. Australia
needed an independent scientific body which would research um topics
which were important to elections and give the facts to

(21:19):
the Australian people. So of course I mentioned CSIRO, I
mentioned the nuclear report, and I mentioned that it was
pretty much ignored by a certain party.
We don't have a good answer. So my lab is
trying other ways. What you're talking
about is too hard for us to tackle. But there are
other things you can do. There's the foot in the

(21:40):
door technique. You can talk about other values and you
can build up getting people more and more reasonable, and
then eventually they can perhaps realise that Fox News
isn't always so factually accurate.

Nick (21:54):
So looking ahead, do you think the fight against misinformation
and disinformation will be getting easier or harder with AI and
other technological advances?

Piers (22:03):
It's gonna change, but I don't think it's gonna get
either easier or harder. So AI is having a huge,
huge impact. So we're now getting far more sophisticated botnets,
which are coordinating in far more human-like manners, um, and, um,
and also even in some cases acting as sleepers.

(22:24):
So in the past, you would be befriended by someone
who you thought was a human but was actually a
bot and would immediately try to persuade you to do something,
vote for this person or don't vote for that person.
Now they can act like your friends for maybe 6
months and then try and persuade you to do something.
So it's getting pretty sophisticated.
Um, sending out personalised messages is getting increasingly easy with

(22:46):
these large language models. The game is changing very much,
but also we're developing techniques using AI to spot these botnets,
to spot this inauthentic activity. And also, there's
quite a lot of work done by, um, Thomas Costello,
David Rand and, um, Gordon Pennycook,

(23:08):
who are using, um, large language models
to try and talk people out of conspiracies. Again though,
when someone really goes down the rabbit hole, it can
be quite hard. But
um, these large language models are more effective than a
human in many cases because they can engage at the
level of detail of a conspiracy theorist. So people, um,

(23:30):
who have gone down the rabbit hole, they often know a lot
of detail about the conspiracy theories, and when they, forgive me,
when a normal human starts talking to them, we
haven't spent 3 years studying this particular conspiracy theory, so
we don't know this level of detail, but a large
language model can work at that level of detail and
give um some information which can be accepted. I have

(23:53):
to say the effect sizes so far aren't that large,
but at least they're sustained.
Um, so we shall see.

Cassie (24:00):
I guess the other thing with these conspiracy theory groups is,
you know, we've talked about how to talk them out
of it, but sometimes it's not about the facts, it's
they've found their group, they've found a sense of belonging,
that identity. Whenever those beliefs become part of that person's identity,
then it becomes less about the facts of the conspiracy
and more that they're getting all of these social needs

(24:22):
from
that group, and it's very hard to just convince them
out of that with facts, right? It becomes a harder
sell to pull them away from what might be their
only social connection, which I think is another challenge with
a lot of these conspiracy groups.

Piers (24:36):
Oh, very much so. So,
some of the standard advice for how to deal with
people with conspiracies is well, firstly, don't try and refute
what they're saying because that's what they're doing. And
if that had worked, they would no
longer be into conspiracies. So you know that's a dead end.
But try um and find out why they believe in
it and also try and find out what need it's

(24:57):
satisfying in their life. So what caused them to go
down
um, that rabbit hole. So my mother-in-law, um, became a
conspiracy theorist. Um, she very much went down several rabbit
holes and conspiracy theorists typically don't believe in just one
conspiracy theory. There's often, um, a series of related conspiracy theories.
And in her case, um, she was feeling quite lonely,

(25:20):
quite isolated, and as you said, it provided a support network. Um,
I work with people, um,
in Melbourne Connect who actually study um um conspiracy beliefs.
And one of the things that shocked me was them
reporting how supportive um the support networks are. Um, and

(25:40):
this was someone um studying the manosphere and these, these, um,
these men, these um incels, involuntary celibates, um, would
go on these spaces and tell everyone their um problems
and they would actually receive support, and it was,
in some sense, a positive experience, had the group not
actually been focused on being misogynistic. Um, so it's gonna

(26:04):
be very hard to talk someone out of that because
these people clearly didn't have a social life outside this
and were unable to create normal connections with human beings.
So it can be hard.

Nick (26:14):
OK, Piers, so what's one final takeaway or piece of
advice you'd like to leave our listeners with as they
navigate today's very, very challenging information landscape?

Piers (26:23):
So, we were discussing this a little bit earlier, Nick,
and um I think you would agree with me when,
when I say there's a bit of a feeling of despondency.
A lot of us are feeling a bit dragged
down about international events, about perhaps domestic events, feeling all
a bit overwhelmed. So I'd actually like to end on
a somewhat hopeful note.

(26:44):
That you're actually more powerful than you realise. You can
actually access reliable information more easily than perhaps you realise, and
you can actually fend off these mis- and disinformation campaigns
more easily than you might think.
And so it would just be to encourage people to

(27:04):
actually take the time and think, is this really true?
Is this not? Try lateral reading: when you've received some information,
before you read it all, start checking, do other reliable
sources say this sort of thing? Um.
It's actually quite easy to spot a lot of misinformation
if you just take a few seconds to do so.

(27:25):
So I would end on a positive note, don't get despondent.
This too shall pass and um lots of us are
working on solutions to it.

Cassie (27:35):
Fascinating. Thank you for joining us today, Piers, and I
think you've given our listeners and us a lot to
think about and take away.

Nick (27:42):
Thank you very much.

Cassie (27:45):
And that brings us to the end of today's conversation.
A huge thank you to Associate Professor Piers Howe for
helping us understand how misinformation works and what we can
do about it.

Nick (27:55):
This episode of PsychTalks was produced by Carly Godden with
production support from Mairead Murray and Gemma Papprill. Sound engineering
by Jack Palmer. Thanks so much for joining us, and
if you don't already, please subscribe so you can catch
every episode in the series. We'll see you next time.