
May 21, 2025 40 mins

This week, we're doing a deep dive into the modern misinformation crisis. In a culture where "rage bait" is king, learn how algorithms amplify rage, fear and extreme positions, along with tips on how to protect yourself from falling for it. Guests include Kara Brisson-Boivin, Director of Research at MediaSmarts, Canada's Centre for Digital Media Literacy, along with professor and author Timothy Caulfield.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
Until recently, Mary-Kate Cornett was a typical college freshman: going to class, hanging out with friends, posting pictures on Instagram. And then, in the span of 24 hours, her entire world flipped. She wakes up to dozens of missed calls. Her phone is blowing up with messages. Nasty, vulgar,

(00:27):
terrifying. A rumor's gone viral, something outrageous. This is what is being reported by everybody on the internet. Broadcasters like ESPN are amplifying rumors that she had an affair with her boyfriend's father.

(00:47):
It's not true, but it doesn't matter. Her face is everywhere. Her name is trending. The internet has already decided what's true and what's not. What started out as an anonymous post on a college app exploded into memes, hate messages and even a swatting incident at her mom's house.

Speaker 3 (01:07):
Having the entire internet, half of the country, hating you and calling you disgusting things, telling you to kill yourself, telling you that you're a horrible person, that you deserve to die, that's a really hard thing for a girl to go through.

Speaker 1 (01:20):
And why? Because a few big names in sports media thought it was funny. Just content. But for Mary-Kate Cornett it destroyed her sense of safety, her peace, her life. Just one more example of what happens when internet lies become entertainment and a person becomes collateral damage.

(01:41):
This is how misinformation takes hold. It doesn't always have to be strategic or political. It can happen to anyone, and we're all vulnerable to believing it too. Fueled by emotional reactions like shock, fear, anger or scandal, false stories can spread faster than the truth.

Speaker 4 (02:04):
So, things that really get at our base emotions of anger, anxiety, fear, those are the kinds of information that we are most likely to spread.

Speaker 1 (02:13):
This week on What's Up With the Internet, we're going to talk about how misinformation takes hold and spreads. As always, I'm your host, Takara Small, and the podcast is brought to you by CIRA, the Canadian Internet Registration Authority, the nonprofit building a trusted internet for Canadians. Spreading lies has always been part of human behavior, but

(02:37):
until very recently, our networks were smaller and there were certain gatekeepers that could either protect us from our worst impulses or feed them. In either case, we had a shared idea of reality, but that's changing. Publishers and broadcasters no longer control what we consume,

(02:59):
for better or worse. We're all the media now. So we all need to get better at understanding how false information spreads and how we can combat that.
To start the conversation, we have Dr. Kara Brisson-Boivin on the show. She holds a PhD in sociology and leads the research team at

(03:21):
MediaSmarts. MediaSmarts is a Canadian charity focused on digital media literacy, and it just published some new research on misinformation. Kara began by telling us all about it.

Speaker 4 (03:35):
So we've just finished a pretty big, in fact I think one of the biggest, studies in the country on misinformation. We did a mixed-method study. So we did a survey with 5,000 participants across the country, and it included interactive activities where we asked them to verify different examples of

(03:56):
information, and then we showed them some of MediaSmarts' Break the Fake campaign videos. So these are short educational videos that are designed to teach folks how to recognize and respond to misinformation. And then, after the study, we worked closely with 30 participants in focus groups, where we dove a bit deeper into

(04:19):
some of this content. We asked them to share examples of information that they've come across or shared in the last couple of weeks. We asked them to rank some examples of information on things they would share privately, publicly or not at all.
And ultimately, the goal of this study over the two phases was to be able to understand what kind of messaging works best in

(04:43):
an intervention like the Break the Fake program. We wanted to be able to understand how important it was to emphasize how-to steps, so sort of how to recognize and respond to misinformation, and we also wanted to understand how important it was to motivate people, so the kind of why it's

(05:03):
important to check, you know, for misinformation, so things like it impacts the people around you, like your family and friends.
And so the videos were designed, some of them very much focused on the how messaging, some of them focused on the why messaging, and some of them combined the why and the how. And through the survey and the focus groups, we were able to

(05:25):
explore the different messages of the videos, and we learned some really fascinating things about people's information seeking, sharing and processing habits, and what makes a very good intervention. What are the ingredients of a successful intervention to help support people in, again, recognizing misinformation, but

(05:48):
also what would help break the fake in Canada.

Speaker 1 (05:52):
So how will that research influence your work going forward?

Speaker 4 (05:57):
Yes. So we learned, perhaps not surprisingly, that people did struggle to determine what was true and false online. More often than not, people relied on these kinds of mental models that most of us do rely on when we're exploring or seeking information online. Things like guessing, sort of

(06:18):
gut reactions to things that appear reliable or unreliable. People relied on their previous experience and knowledge. One of the examples we gave in the study was an image of a spider. It looked very colorful, and we got some fascinating responses from people around, you know, I have never seen a spider like this, therefore it must not exist.

(06:39):
And actually the spider did exist, it was real. And so that's just an example of how we can tend to rely on our own experience or our own knowledge, and more often than not, relying on these kinds of mental models is not as reliable as, you know, the steps that we outline in the Break the Fake program, which is to check the source, verify the source,

(07:02):
you can use fact-checking tools. So those kinds of steps are going to ensure a much more reliable answer than some of our own mental models.
And so, moving forward, we're going to build on the knowledge we've gained through this study to continue to build resources in the Break the Fake program, as well as to disseminate and

(07:25):
share the results of this study so that other community organizations, researchers and policymakers who are working in the field of misinformation and doing work to mitigate misinformation in Canada can also build interventions that are based on the evidence that we've gathered in this study and the lessons we've learned around what makes a successful

(07:47):
intervention.
I will say, I think a really important good-news piece coming out of this study is that education works. We found that, by virtue of just participating in the study, participants were much more likely to engage in critical thinking, to say that they will, you know, fact-check information, you know, pause and think before they share content,

(08:11):
think about who they're sharing it with and how. So just, by virtue of, sort of what we call in the field, priming people to think critically about information had a positive effect on people's behavior. And I think that's really important in the context we're living in right now, where people are also feeling overwhelmed by the amount of

(08:31):
information that we're seeing and experiencing on a daily basis. It's very good news to know that priming people to think critically about their information seeking and processing behaviors has a positive result.

Speaker 1 (08:47):
You've touched on this a bit, but I'm curious if you can break down how misinformation actually spreads and what role social media plays in that.

Speaker 4 (08:57):
Yeah, I think it's really important, and one of the educational pieces we need to do a bit more work around, you know, is this idea of how it spreads. It's important for people to understand that, in an online context, we're operating in a networked environment, and that means that all of the different platforms that we engage in and

(09:17):
the spaces where we share information are connected. And so that's why often, when you see something go viral, it could have started or gone viral on a platform that you're not even a member of. It can be on a social platform that you don't have an account on, but you end up seeing it on a different platform.

(09:37):
So, for example, if something starts going viral on TikTok but I don't have a TikTok account, it's very likely I could see that on YouTube, where I do have an account, or on Instagram. And that's because of the way that these platforms are networked and connected. It's also because, while we are consumers of information, we are also producers of information.

(09:59):
When we choose to share content, we are also engaging in the reproduction and production of that content. And so, again, there's this educational piece to remind people that we have a responsibility when it comes to consuming and sharing information in this network, because of that ripple effect

(10:20):
and because of the way these environments are connected. And so topics and conversations and debates and so on can, I know, shift amongst these different platforms and will sort of rise and fall in popularity because of the way that they're connected.

Speaker 1 (10:38):
I wonder what role groups that are motivated by ideology play in this. You know, not political, not national, but very specific feelings on societal topics.

Speaker 4 (10:52):
So research confirms that ideologically motivated content is one of the most predominant forms of content to spread misinformation. So any kind of content that is tied to ideologically motivated information and/or emotionally motivated information, so things

(11:13):
that really get at our base emotions of anger, anxiety, fear, those are the kinds of information that we are most likely to spread. And so different groups know this, and they will use that as a way to push misinformation. So tying it to any kind of partisanship or ideology is a way to encourage the spread of misinformation.

Speaker 1 (11:35):
Are there any examples that have stood out to you over the last few years, ones that highlighted this ideologically motivated misinformation?

Speaker 4 (11:46):
Well, I think we saw this clearly in a couple of different ways during COVID.
I mean, we saw the ways in which anti-vaccine ideologies were being tied to other forms of hate and extremism in ways that previously were not, which again both speaks to the power

(12:06):
of ideologically and emotionally motivated content and also the networked nature of the online environment, if you can connect both those kinds of ingredients, the networked nature and the kind of information that is being spread. We definitely saw an increase of that during COVID, where

(12:27):
suddenly you're seeing these movements, again anti-vaccine, some neo-Nazi groups, that were previously distinct suddenly coming together in very odd ways. And again, I think that was in part because of the way the information environment has shifted, and now the ways that these ideologies are coming together have become difficult

(12:51):
to disentangle, which makes it hard to recognize sometimes when there's misinformation.

Speaker 1 (12:58):
And what have you found that foreign actors do in order to influence and interfere with Canadian public opinion or even elections? What works? What are you seeing on the ground?

Speaker 4 (13:10):
You know, one tactic specific to foreign actors is, in particular, reaching diaspora communities in Canada through sort of allophone media. So this could be traditional media that's printed or broadcast in languages other than English or French. Sometimes you see it in social media and networking apps that have sort of a large user base in a particular language.

(13:31):
This allows them to reach audiences that are less likely to see counter-messages. They can also sometimes go under the radar of, sort of, efforts to detect them. So I think that's one tactic. I think most researchers agree that organized misinformation campaigns have had sort of little direct effect

(13:52):
on elections in Canada so far, but campaigns related to COVID, for example, and election integrity were fairly widely seen by voters in 2021. And the latter theme has, you know, since been seen in provincial elections as well.
So they're both examples of misinformation campaigns that, to a large extent, start in American politics, and then, when

(14:13):
they've been adopted by Canadian sort of disinformation actors, that can sort of reinforce their continued prevalence in sort of a North American discourse. Again, because these networks or these online platforms are networked, they're not necessarily national or provincial in the same way that traditional or localized media

(14:36):
can be. And so, again, that just really muddies the water in terms of how misinformation flows and how things like elections or public health crises are impacted.

Speaker 1 (14:48):
And that was Dr. Kara Brisson-Boivin. To build on that, we have another great guest, Tim Caulfield. Tim is a professor of law at the University of Alberta, and he's also an expert voice on misinformation, specifically with regard to health. He recently published a new book called The Certainty

(15:09):
Illusion: What You Don't Know and Why It Matters, and he began by telling us what motivated him to write it.

Speaker 2 (15:17):
Holy cow, you don't know what the world is going to be like when you start writing a book. When I started writing the book, it was really going to be a kind of an examination of how we got in this information

(15:37):
environment mess. That was really the goal, and to kind of do a deep dive, in a fun and interesting way, into the research. What does the emerging research tell us about the current situation? And by the current situation I mean this spread of untruths, the constant spin and rage that bombards us daily.

(16:03):
Right, you know, how did we get here? But, you know, near the end of the book and since the book has been published, I never imagined it would be even worse, right? Even worse. And it's fascinating because, it's going to sound like hyperbole, but I really think that, you know,

(16:23):
the spread of misinformation, this chaotic information environment, it's become one of the defining characteristics of our time. And so, yeah, the motivation was pretty deep, and I'm glad I tackled it. But it also seems like I had no appreciation that this is where

(16:49):
we would be right now, in this moment.

Speaker 1 (16:52):
You've been writing about this subject, misinformation, for quite a while, and I'm really curious if there's anything you learned throughout the process of writing this particular book, considering the unique times we're living in.

Speaker 2 (17:08):
Yeah, you know, I have been exploring this really for decades. And, you know, when you're a journalist, I find that when you write a book, as opposed to a shorter article, or do a research project, it causes a lot of reflection.

(17:29):
How do these forces impact me? Have they impacted me? So, for example, in the book I write a lot about our dysfunctional research environment, right? You know, all the pressures on the research community that create hype and spin and less-than-ideal

(17:50):
representations of science. And it was sort of like, hold on, hold on, did I do that too? And you know what? I think early days I probably did. But that's how incentives work, right? It's not necessarily conscious. It's these subtle pressures that shape your behaviors, and I think it's really important to pause and reflect

(18:12):
on that. And, you know, I try to do that in the book a little bit. So, for sure, that's one of the things I, you know, kind of learned on a personal level.
And the other one that I found fascinating, you know, so there's a whole section on online reviews, the five-star reviews that rule our universe. So this is a topic we hadn't done much research on, and

(18:33):
since I've finished the book, we've actually done some empirical research on that topic. But I loved it because I was learning so much, and I got to talk to experts around the world about this topic. But also I got really an appreciation of the degree to which these online reviews really do shape our behaviors.

(18:57):
I mean, it's incredible. Again, this is going to sound like hyperbole, but I don't think it is. You know, trillions of dollars are moved as a result of online reviews. You know, no one makes a purchasing decision without kind of reflecting on these reviews, these five-star reviews. And,

(19:23):
you know, as you know, it seems like almost everyone knows they're manipulated, they're fake, you know, they're subject to all the cognitive biases that impact all the decisions that we make throughout the world. And despite the fact we all know that, we still rely on them. So I found that, you know, absolutely fascinating and, as I said, we've actually gone on to do some empirical research on that very topic.

Speaker 1 (19:43):
What I found interesting in your book was when you were talking about all of the information we're bombarded with, and how our human brains really weren't set up to process the gigabytes that hit us almost on a daily basis. So I want you to dive into how misinformation spreads and walk us through the process of it.

Speaker 2 (20:04):
Well, it is an incredibly complex phenomenon. You know, culturally, socially, psychologically, it's a complex phenomenon. I think often when you talk about misinformation, people maybe think, or at least some people think, that, well, you're just talking about people lying and people being fooled by those lies, or people, you know, exaggerating and being fooled by

(20:25):
those exaggerations, when, you know, in reality there's so much at play, right? You know, there's the misinformation continuum, as I call it. You know, at the one end you have the individuals that are spreading disinformation. They know it's a lie and they're spreading it, you know, to sell something, for a particular agenda. State actors do this, right, just to create information chaos.

(20:48):
And then, on the other end of the continuum, you have individuals that spread misinformation, and perhaps they don't know it's misinformation. They're just doing it for themselves and their family and their loved ones. You know, they have no ill intent. So there's a lot of complexity there.
Having said that, we know, we know that so much of the misinformation that is spreading right now is driven, no

(21:12):
surprise here, by the online environment, by the incentives baked into that online environment. So, in other words, things that play to our cognitive biases, right? They play to our confirmation bias, you know, lies that play to things that we want to believe, for example. They play to our fear and the things that enrage

(21:38):
us, to our grievances.
Indeed, there's a growing body of evidence that tells us that much of this stuff that comes across our radar is designed to play to all those emotions. Right, there's a big study that came out after I finished the book that really highlighted the degree to which misinformation

(22:00):
is tethered to rage and to fear, and that, of course, is because it plays to our negativity bias. We remember the bad stuff, we remember the scary stuff, we remember the stuff that enrages us. And the reality is, so much of the stuff that we see online, so much of our information environment, is shared.

(22:21):
It's on our radar because someone shared it. And 74% of that content, somewhere between 73% and 75%, has been shared without clicking through. In other words, someone saw a headline or a meme or a picture that played to their emotions, played to their rage, played to their preconceived notions, and shared it. And part of the

(22:43):
reason is because it's just such a chaotic information environment. So I think that the rage and the anger, the playing to our emotions, and all of the incentives that are baked into our online environment, especially social media, but really even search. As someone who's been studying this for a long time, one of the

(23:03):
things that's really changed is the degree to which misinformation, lies, you know, untruths, whatever you want to call them, the degree to which they're tethered to politics, to our political identity, of course.

(23:25):
Of course, politics has always been lurking in the background, but now it's at the absolute fore, right? You know, you pick a topic and it has a political dimension to it. I study mostly health misinformation. I can't believe the degree to which, you pick a topic, it's become political.

(23:45):
You know, whether you believe that bit of misinformation or not can almost always be predicted by your political identity, and it wasn't always like that. But, holy cow, it is now.

Speaker 1 (24:00):
Okay, well, let's dive into the science then behind this phenomenon. In your book you talk about the science illusion. How does science get misused to spread lies?
Speaker 2 (24:13):
Yeah, so in a big part of the book I do examine science and, you know, the scientific process and our research institutions, and one of the reasons I do that is because, you know, I'm a science geek and I really believe that science is the, you know, that candle in the darkness that is, you know, going to lead us out of the darkness.

(24:35):
We can't let it become extinguished, we can't let it become twisted, right? And unfortunately, the way it's set up right now, it is less than ideal. But let's talk about how good science is misused in our information environment, and this has become incredibly common, right?

(24:55):
You see it everywhere In thebook.
I joke that the Enlightenmenthas won.
Right, but it's won as a brand,not necessarily in substance,
right?
No one says, oh, my product hasless science behind it.
You know, no politician says myposition actually has less

(25:19):
evidence behind it.
The idea of science has beenembraced by the public, I think
in general, and because it givescredibility and because it
makes whatever they're sayingseem more legitimate.
So it's a process I call scienceploitation. You take a little bit of real science, real exciting science,

(25:42):
and you use that language to sell bunk, and, as I said, it's almost become the norm. It's absolutely everywhere. One of my favorite examples of scienceploitation is the microbiome. You know, if we got in a time machine and went back five, six, seven years and we went out in the street and asked

(26:03):
people about the microbiome, some people might have had a vague idea what that is about. Now, the word microbiome is on the side of shampoo bottles, right? It's on every soap. It's everywhere, right? It's on our, you know, yogurt and supplements. And that's scienceploitation.

(26:25):
Everyone knows that there's this exciting research going on around the microbiome, and that's real research. I've been involved in that stuff. But they're just leveraging that genuine excitement to create a veneer of legitimacy.
The other great example is regenerative medicine, stem cells, that entire area. It's everywhere. I mean, there's face cream that has stem cell on the label, even though it has nothing to do with stem cells. We're doing a research project right now where we're looking at stem cell supplements, right? And again, they're just leveraging the idea, the excitement, the legitimate and deserved excitement around stem cell science, to sell products. But this happens everywhere, right?

(27:09):
The word quantum is a really good example. Everything's quantum now: quantum healing, quantum thinking, you know, quantum yoga. And that's done just to use the brand of science to sell products, to sell ideas.

Speaker 1 (27:27):
So what role does social media play, then? Because anyone who uses these platforms, I think they understand social media algorithms dictate much of modern life, but it can still be confusing.

Speaker 2 (27:41):
Oh, I think it plays a huge role. Again, not an overstatement to say that social media is one of the things that has completely altered how we get our information. I mean, this is an obvious statement, but I think what is often less known is the degree to which the algorithms dictate,

(28:02):
you know, our information environment. You know, as I said at the beginning, a lot of those algorithms are designed to play to our emotions, right? Lots of studies have shown that the more extreme a position is, the more likely it is to get clicks, to get attention, right? Which is, you know, the currency on social media and

(28:25):
this attention economy, right? And what's fascinating about that? There's a study that came out not that long ago that found that, yes, the more extreme a position, the more likely you are to get clicks. But we all live in echo chambers, so, very quickly, that extreme position isn't so extreme anymore, so you have to become more extreme.

(28:46):
So this cycle develops that, you know, sort of incentivizes extreme fringe ideas but also helps to facilitate this polarization, right? Because it's going to push echo chambers apart from each other, in sort of a race towards the most extreme position

(29:07):
that's going to get clicks. And we see that happen on virtually every single topic. And that's because of the algorithms that push this information in front of us.
And, by the way, it happens also with search engines, right? We did a study on cancer books, and the idea here was, you're an

(29:30):
individual, God forbid you get a cancer diagnosis, or someone in your family, a loved one, gets a cancer diagnosis. Kind of a logical thing to do would be to look for a book on that. So you go to Amazon and you, you know, search cancer, and we found that 49% of the books that are returned are filled with misinformation. And, by the way, some of it's hardcore

(29:50):
misinformation. This is not like, oh, you know, is this kind of okay? No, it's like, you know, carrot juice will cure your cancer kind of stuff. And 70% of the books on the first page, which is usually all anyone looks at, filled with misinformation, right? So that's the algorithm, you know, responding to the, you know,

(30:11):
the algorithm knows these are the books, these are the ideas that are going to get traction, that are going to get clicks, that are going to drive the information economy. And, unfortunately, it works. And, unfortunately, it's having a massive impact on society.

Speaker 1 (30:29):
You've talked about people like Dr. Oz and Gwyneth Paltrow before. What role do influencers and celebrities play in spreading health misinformation?

Speaker 2 (30:39):
And it's interesting, because throughout my career, you know, this topic has come up. You know, I wrote a book with Gwyneth in the title, and people go, oh yeah, I know celebrities and influencers, you know, say ridiculous things, no one really believes it. But we know that people do. We know that celebrities shape and have an influence on

(31:02):
people's beliefs, and they also help just to spread misinformation, even if it doesn't start with them. There was an interesting study done at Oxford, a very straightforward study, and more sophisticated studies have been done since then.

(31:24):
But this was early in the pandemic, and they looked at hundreds of bits of misinformation, and they wanted to get kind of the origin story of each bit of misinformation. And they found about 20% started with a prominent individual, as they called it, you know, a celebrity, a sports star, that kind of thing.
That's a pretty high number in itself, right? But then what

(31:45):
they found out is 69% of what we share on social media, so the normies, not the celebrities, comes from a celebrity, right? So it really gives you a sense of how a celebrity almost has a magnifying effect. Other studies have shown that, you know, if you relate to

(32:06):
that individual, they speak to your values, you are more likely to believe them. The mere fact that they have a megaphone, that obviously matters, because it plays to something called the availability bias: if you just hear something enough, it starts to feel true.
You know, the illusory truth phenomenon.

(32:28):
You know, just hear it enough, it feels like it might be true. And celebrities can play a big role in that phenomenon. But also, and I think this is underplayed, a celebrity telling a little bit of misinformation, that's an anecdote, that's a testimonial, that's a story.

(32:49):
Even if it's just a post on social media, that's a story from a celebrity. So we know that stories, testimonials, anecdotes can overwhelm our ability to think scientifically, especially if it's a scary story. So think about it. One of my favorite examples of this is Nicki Minaj.

(33:09):
I don't know if you remember this, but Nicki Minaj's cousin's friend's testicles. Yeah, those testicles ruined a week of my life, because I was doing interviews about those testicles. But think about that. The story that Nicki Minaj posted was that these testicles became inflamed because of the COVID vaccine. And those testicles, you know, they overwhelmed and beat

(33:32):
out hundreds of millions of data points on safety and efficacy, because it was a scary story from a celebrity. And, by the way, there has been research that has shown that, yes, scary stories from celebrities kind of win on the Internet. So that's, I think, another really important phenomenon, and why celebrities and pop culture and influencers can have such a

(33:58):
big impact.
But fast forward to today andyou know these influencers.
I don't think they're likethese niche voices anymore.
They have become the mediaright.
Think of Joe Rogan.
He, you know, I think once upona time he was kind of viewed as
a podcaster and influencer.
Now he's one of the mostinfluential voices in North

(34:19):
America.
He's a podcaster.
So I think that's, you know,now, that the media environment
has shifted and these sort ofonline influencers that were
once, you know, microcelebrities you might even had
called them now they're fullblown celebrities, massively
influential, and part of that isbecause they play to niches,

(34:41):
they play to an audience in a way unlike any time in the past, right?
So that is, you know, another phenomenon, I think, that has made influencers and celebrities more broadly so incredibly powerful.

Speaker 1 (34:58):
You kind of touched on the COVID pandemic with Nicki Minaj's anecdote.
I'm wondering, though, what the COVID-19 pandemic did for misinformation.
Was that a misinformation boom that we experienced?

Speaker 2 (35:14):
Yes, there's no doubt about it.
You know, I think it did a couple of things.
You know, it's interesting, because in the early days, I think many people in the science communication community were hopeful.
Right, because you remember that.
It's hard to... it's easy to forget, I'll put it that way.
It's easy to forget how, you know, together we were at the very beginning of the pandemic, and I think everyone thought, okay, this is going to

(35:36):
be a golden time for science communicators, for public health, you know, because everyone's going to appreciate how important it is.
And it didn't last long, though, right?
It really didn't last long.
It quickly devolved into almost an opportunity for misinformation mongers, and we know that happened.
Right, we know that, for example, belief in things like
(35:59):
the efficacy of ivermectin increased throughout the pandemic, and it's continuing to increase.
So, you know, if you look at the studies, things that are demonstrably false are believed more now than at the beginning

(36:20):
of the pandemic.
And ivermectin is just... you know, the efficacy of ivermectin is just one example of that.
And part of the reason, and I've, you know, touched on this already, is because so many of these topics became political talking points.
Right, they became wedge issues, they became ideological flags.
Your position on the efficacy of ivermectin is an ideological

(36:42):
flag, right?
I can almost, with some degree of certainty, predict your views on other topics based on your view on ivermectin.
And this happens across the ideological spectrum.
I want to emphasize that right away, because I'm sure people, you know, are going to say he's being so partisan.
This happens across the ideological spectrum.

(37:04):
But that's another reason, I think, that the pandemic sort of accelerated the spread of misinformation.
And the other thing that happened is that it weaponized distrust.
So we're often hearing about distrust now, and how we have to earn the trust of the public.

(37:28):
But I don't think it should be forgotten that, and research backs this up, much of the distrust, if not a large portion of it, that exists right now was created by the spread of misinformation.
And we know that because that distrust falls exactly along

(37:49):
political lines, exactly right.
And if it was just sort of a general distrust created by, you know, mishandling of topics, et cetera, you'd see a more uniform distribution.
But you see it entirely at one end of the political spectrum.
And let me just give you one example.

(38:09):
In the United States, for example, the growing distrust of childhood vaccination has become, you know, entirely part of the populist playbook.
So that's another thing that flowed from the pandemic.

(38:33):
And if I could just say one more thing, the other thing that's happened is this war on the fight against misinformation.
Right, you know, we've seen the idea of researching misinformation, trying to counter misinformation, trying to understand misinformation, that's been politicized, right, and you're the enemy.
If you're fighting misinformation, somehow you're

(38:55):
pushing censorship or you're against freedom of expression, which couldn't be further from the truth.
But that narrative has worked, and just very recently, in the United States, they've started cancelling research projects on misinformation, because that narrative has been so successful and because that narrative has been so

(39:15):
politicized.
And that was Tim Caulfield speaking to us.

Speaker 1 (39:20):
We're going to hear more from Tim next week when we
look at the human factor in allof this.

Speaker 2 (39:26):
If it feels like your team just scored a touchdown, that should be a reason to pause.
That should be a red flag.
Or if it feels like the other team just scored a touchdown and you feel very angry, that should be a red flag.

Speaker 1 (39:42):
Keep an eye out for that, and if you're enjoying this podcast, then please leave us a review.
You can reach me online at Takara Small on Bluesky Social and Instagram, or you can email us at podcast at cira.ca.
Thanks for listening, and we'll see you again next week.