
June 11, 2025 51 mins

From fake news outlets to AI-trained propaganda, hostile actors are sowing division and distrust. In this episode, we explore how foreign interference and AI-driven disinformation are threatening trust in institutions including schools, our health services and media. We unpack insights from Canada’s recent public inquiry, examine why our country is a prime target and look at global strategies for digital resilience. 

We're joined by journalist and tech commentator Sue Gardner; Halyna Padalko, a foreign disinformation manipulation expert and fellow at the Centre for International Governance Innovation; and Helen Hayes, a senior fellow at the Centre for Media, Technology and Democracy at McGill University.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Takara Small (00:03):
Like any big problem in society, finding a solution requires a number of approaches. So what about our elected problem solvers? How are they doing? Mis- and disinformation are often targeted at our public institutions directly. Manipulated engagement on social media amplifies extreme

(00:24):
positions and has seen trust in institutions dwindling. Earlier this year, Canada's public inquiry studying foreign election interference released its final report, and it held a stark warning.

Helen Hayes (00:37):
The commissioner said that foreign actors are using sophisticated technology to sow disinformation, calling that the biggest threat to our democracy.

Takara Small (00:46):
The report found they haven't yet changed an election, but this isn't a problem that will go away anytime soon. Disinformation campaigns are becoming more sophisticated, and our safeguards need to be more robust.

Halyna Padalko (01:00):
Very, I need to admit, creative way to do
disinformation.

Takara Small (01:05):
So this phenomenon called LLM grooming- and it's not just the federal government. Trust in institutions like schools, our health service and media have all been harmed by mis- and disinformation in recent years too. So on this episode of What's Up With the Internet, we're going to take a look at how they're responding to this threat.

(01:25):
I'm your host, Takara Small, and this podcast is brought to you by CIRA, the Canadian Internet Registration Authority, the nonprofit building a trusted internet for Canadians. We've got some great guests in this episode, but first up is Sue Gardner.

(01:45):
You may remember Sue from our first episode. She's a journalist and tech commentator who used to run CBC Online. Among many other things, Sue has been telling us all about the scale of the task facing the government in regulating this space.

Sue Gardner (01:59):
Like, I guess the headline is: it's extremely difficult, right? Like, it's extremely difficult. It's like, oh my god. So Canada has had some investigations into foreign interference, right? Canada has brought into play a couple different pieces of

(02:19):
legislation designed, at least tangentially, to affect some of this stuff, right? So Bill C-27, you know, was aimed at, in part, ensuring that AI technology got used responsibly. So there are implications there. Or potentially C-63, probably the big one, right? Online Harms would have imposed a duty of care on the large

(02:45):
social media platforms. That would have had implications for hate speech, for extremism, for terrorism; broader implications, probably, too. It also proposed to establish the Digital Safety Commission, which would have been a sort of a centre for expertise, thinking about this stuff and grappling with it and trying to get out in front of it, not just misinfo and disinfo, but including

(03:07):
misinfo and disinfo. All of that got washed away, right, when the government was prorogued, so all of that is gone. It would need to be brought back again if wanted, right? And so the answer is we're not really anywhere, really, and the only thing that's comforting about that, I suppose, is that

(03:30):
that is true for most everybody, right? I mean, that's not entirely true, right? The EU, Germany.

Takara Small (03:38):
Yes, they've made great strides.

Sue Gardner (03:41):
Yes, yes, yes, and so we are probably behind.
Would you say that?

Takara Small (03:47):
I would say that. Like, in the grand scheme of things, internationally, I would say we're behind, and so there hasn't been a lot of momentum in the US to tackle this issue. So Canada, as a result, has kind of just, you know, been on the

(04:07):
sidelines, taking a look at what's happening in the EU, taking a look at what's happening in the US, and kind of hedged their bets, right. And I feel like that is very much writ large.

Sue Gardner (04:29):
That is a serious problem for us right now, right? Because you're right, like, generally speaking, we have followed the US, generally, right? Even in terms of our public broadcaster. Like, our public broadcaster is kind of halfway between the American system and the European, sort of pure classic public broadcaster system, right? So in many ways that's true. And as the US veers off into whatever it is, whatever road it is going down, we are going to have to, as a country, obviously, like, figure out what do we do now, right? If we can't just be

(04:51):
on their coattails and sort of a mini-me version of them, what therefore are we, and how do we pick it, and how do we make our way to it? Right? So it's a bigger question too.

Takara Small (05:01):
I mean, just an aside on that. It's so interesting because, as you know, again, the relationship with the US and tariffs and everything, it's evolving day by day. There's been a petition, I don't know if you saw it, you know, asking Canada to consider joining the EU, and to me what that symbolizes is the fact that once again Canada

(05:22):
doesn't necessarily, or Canadians don't necessarily, feel like Canada can be on its own. It has to have a leader of sorts; it needs to have someone directing where to go, how to feel about certain issues. And you know, a petition, even in jest, I mean it had hundreds of signatures, but even in jest, says a lot about, perhaps, the role that Canadians feel they should play on the global stage.

(05:43):
It's not that of a leader; it's maybe that of a quiet supporter.

Sue Gardner (05:47):
Yeah, and I don't feel like that's necessarily unrealistic or incorrect, right? We are a small country, and we've always kind of, to some degree, appeared at least to punch a little bit above our weight because of our relationship with the United States. But that is predicated on us having a strong, tight relationship with the United States.

(06:08):
If we don't have that, we're very, very small, right? So I think a realistic appreciation of our heft is probably not a bad thing.

Takara Small (06:19):
When we talk about regulations, it can be quite a passionate discussion for people, I think, on many sides of the political spectrum, because when you think about regulations, particularly when it comes to social media and online, what someone considers freedom of speech another might consider censorship. So how do you go about regulating misinformation in

(06:43):
an online world where being able to convey your feelings and opinions is taken as a freedom?

Sue Gardner (06:53):
yeah, and I am probably not the most patient
person, um, when it comes tothis conversation, because I
feel like so much of it isdisingenuous or at least
ahistoric, because, like, ofcourse, there are challenges in

(07:13):
terms of regulating, you know, speech, right? Like, yes, right, you know. And there are real questions, like: who decides what is true? Who decides what is okay versus what is outside the Overton window and not okay? How do you avoid government overreach? How do you avoid: oh, I like it today with this government, but tomorrow, with

(07:34):
that government, it's going to be very oppressive and terrible? Right, like, those are real, obvious, time-tested questions, right? But I think that what we're forgetting when we have those conversations, and some of us are forgetting this on purpose, I think that what we're forgetting is that all that needs to be understood in context. And in democratic countries today, in liberal democracies,

(07:57):
today, we have a hilariously larger amount of free speech than we have ever had before. Right? Like, everybody got handed a printing press, everybody got handed a microphone, right? It is today. Now it is totally okay to shout fire in a crowded theater, and

(08:17):
we have people doing it every single day, all day long, right? So it's a different landscape. You know, in a lot of ways these changes have been super fantastic, right? Because, you know, it used to be that there were a lot of people shut out of the social discourse for a variety of reasons, right? Some people were, like, invisible to

(08:40):
the gatekeepers. The gatekeepers just didn't know about them or care about them, and some people the gatekeepers disagreed with. Those people now get to speak, and that is great and that is real and it is new. But it's also true that we have never had 100%, entirely unconstrained freedom of speech, even in the United States,

(09:00):
right? It's just not a thing. Like, there are libel laws, there are truth-in-advertising laws. Lots of countries have hate speech laws. You know, we have always been historically very careful during election campaigns because it's such a time of vulnerability, etc., etc., etc. And so here we find ourselves, right, where most of the world

(09:25):
you know, we talked a little while ago about the EU, right? Like, most of the world has taken a more moderate approach when it comes to freedom of speech than the United States. And here we find ourselves with the limits of our speech being largely set by American social media companies, unless we take steps to make it otherwise.

(09:45):
Right? They have been setting the terms of the discourse, and they have also been setting the terms of the discourse about the discourse, right? So Silicon Valley has kind of set the stage, they've set the table, they've set the terms of the discussion, right? And so now, suddenly, newly, you know, all of the world is

(10:05):
operating in a much more US-like environment, but, you know, a hyper-US environment when it comes to freedom of speech, relative to where we were previously. And again, with the caveat that some countries have been advancing, Germany especially, right, advancing to impose their own cultural norms on internet platforms, right?

(10:26):
And so that's the context that I feel like that needs to be understood in. We have so much more freedom to speak and to be heard, right? And, generally speaking, that is good. But also, and equally importantly, we have

(10:47):
always constrained our speech. And you know the normal trajectory: technology makes it possible, we run off and do things, harms emerge, and then we constrain those harms. That's normal.
And so I feel like, you know, we're having these conversations about misinformation, freedom of speech. In some ways we're having them as though it's the first time we've ever had them, and like they're shocking and horrible to us, like we're furious about them, and I don't understand

(11:08):
that, because I feel like these are extremely normal conversations that we have always had, right? We've always had them. Like Canada, right? We made hate speech illegal a number of years ago, and then we decided to not have it be illegal, and then we decided that we're now going to consider again having it be illegal. Those are good conversations.

(11:29):
That's a healthy society talking to itself about what it wants to be like, right, and what the trade-offs are of various approaches. I think that's good.

Takara Small (11:40):
That was Sue Gardner. So what about foreign interference? It's well known that hostile states spread disinformation in Canada and other democracies as a form of social engineering to cause instability. Earlier we mentioned the public inquiry which the Canadian government established to investigate foreign interference

(12:01):
in Canada. Commissioner Marie-Josée Hogue was in charge of it, and when she announced her findings she had this to say: Foreign actors are no longer content to use traditional means to interfere.

Marie-Josée Hogue (12:16):
They are also using sophisticated technological means and increasingly sowing disinformation. In traditional media, but above all on social media, distinguishing what is true from what is false is becoming

(12:37):
increasingly difficult, and the consequences are, in my view, extraordinarily harmful. The impact of traditional methods should not be underestimated, but the greatest threat, the one that I believe threatens the very existence of our democracy, is disinformation.

(13:01):
This threat is all the more nefarious because the means available to counter it are limited and very difficult to implement. Nevertheless, we must not give up, but rather attack it forcefully, all together.

(13:21):
This cannot be the sole purview of government. It will not work.

Takara Small (13:28):
So, to learn more about the threats and how the government is responding, we spoke to Halyna Padalko. Halyna is an expert on foreign disinformation manipulation, and she's a fellow at CIGI in Waterloo; that's the Centre for International Governance Innovation. Halyna has a particular focus on large language models and how they're being used to bring disinformation to a wider scale.

Halyna Padalko (13:51):
Canada is actually targeted in multiple ways, as you correctly said. On an institutional level, government is targeted. Academia, media, diaspora communities are under constant attack. Also, on the level of processes, Canada is targeted, for example,

(14:14):
like election interference. But also, in this context, it's very important to know that on an individual level there is also constant targeting of decision makers. Even researchers like me who study disinformation, and the journalists who are doing the job of debunking disinformation and actually enlightening the public on that issue, are

(14:35):
targeted. Activists are under constant attack for this job that they are doing. Maybe you're interested to know, and I think it is important, to set the stage and explain why authoritarian countries and anti-democratic regimes are doing it.

(14:58):
There are several kinds of strategic goals that foreign countries seek to achieve through such interference, active measures or information warfare, whatever you call it; the purpose and the strategic goal is the same. This includes, first of all, polarizing public debate to inflame these divisions in the society. Such topics could be, like, vaccination, immigration,

(15:21):
Indigenous issues. And how do they work? They just scan the landscape for these existing divisions, looking for weak spots, and just amplify them. There was really cool research a few years ago that showed that Russian state-affiliated media significantly amplified coverage of the Freedom Convoy protests

(15:44):
in 2022, publishing information about this event 15 or even 20 times more often than outlets like BBC, CNN or Deutsche Welle. And this amplification actually aimed to artificially increase

(16:05):
the significance of this event and therefore deepen the polarization among Canadians.
Another goal of the authoritarians is to erode public confidence in democracy and institutions by convincing people that these very public institutions are no longer fit for purpose, and that's how they sow

(16:28):
this internal distrust. There are also a lot of external goals, for example, to weaken Canada's alignment with allies in NATO, in the G7, but that's a whole other topic. And what is also important to understand in that regard is why Canada is actually a target and also will continue to

(16:51):
be a target for authoritarians. First of all, due to the strong democratic tradition of our country, which authoritarian regimes will never be okay with and will actually find threatening. Similarly with, for example, Russia's invasion of Ukraine:

(17:12):
they were not okay having democratic neighbors. That's why it will always be a threat. And, ironically, these non-democratic regimes exploit Canada's democratic nature, its freedom of speech, like academic freedom and other fundamental rights, using these very freedoms against our country.

(17:34):
Our colleagues also did incredible research that shows that they are weaponizing the radical left and the radical right in Canada to undermine support for Ukraine and to seed other narratives.
And unfortunately, which is a real pity, democratic

(17:55):
nations are limited in how they can respond to these threats, because asymmetric or covert retaliation would actually compromise the very democratic nature of Canada.
The second reason why Canada will be a target is that

(18:15):
Canada is actually considered one of the most covertly infiltrated liberal democracies in the world. There is fragmentation of the information ecosystem, where all these national communities are hanging out, engaging with the type of

(18:43):
content they are consuming, and this actually provides authoritarian actors with very, very fertile ground to meddle in Canada's internal affairs, weaponizing information channels and messengers, social media, to orchestrate this influence campaign.

(19:04):
And the last point that I want to mention in that regard, why Canada will be a target, is that Canada is very closely integrated with the US information ecosystem, because of the same language, because of the same cultural space, same stars, bloggers, influencers, and it also makes Canada vulnerable to the interference that is happening in the US by

(19:29):
other countries. And I don't know if you remember this huge scandal that happened last year in the US. It involved two RT employees, and they were indicted for covertly funding and directing the company Tenet Media, actually founded by

(19:51):
Canadians, for publishing thousands of videos in the interest of the Kremlin. And there was research by the Media Ecosystem Observatory; it's a laboratory at McGill.
They were actually analyzing the content of these bloggers and

(20:12):
how it intertwined with the Canadian informational media ecosystem, and the findings were absolutely incredible. First of all, one quarter of Canadians were aware of Tenet Media-affiliated creators or bloggers.
They were engaging with their content. And the other findings

(20:34):
showed that among 2,000 podcast episodes that they were relentlessly analyzing, almost a quarter of them, like 500, mentioned Canada, very often in a highly polarized and negative

(20:55):
tone, and topics for discussion were frequently criticizing immigration policies, the social justice movement and, like, reinforcing these divisions. So it's also kind of like that spillover effect on Canada.
So it's like a long answer for a short question, but it is very important to fundamentally understand why Canada is

(21:16):
targeted and will be targeted, and why authoritarian countries are generally doing this.

Takara Small (21:25):
Taking into account all of these vulnerabilities that exist, what are some of the key government policies which are supposed to combat mis- and disinformation?

Halyna Padalko (21:36):
Actually, most of the successful policies that the Canadian government is doing are related to elections. And, as was well noted in the public inquiry into foreign interference, Canada has not experienced a large-scale attempt at foreign interference in its elections. So, to highlight best practices, I would love to mention several

(22:01):
things. First of all, the SITE Task Force, which stands for Security and Intelligence Threats to Elections. It is a task force, like a whole-of-government working group, responsible for assessing and sharing intelligence during the election period. It consists of experts from CSIS, the Communications

(22:25):
Security Establishment, Global Affairs Canada and the RCMP. There is also the Critical Election Incident Public Protocol. It is a panel of senior public servants empowered to alert the public about incidents that actually could threaten the

(22:46):
integrity of federal elections.
So they're doing kind of this pre-bunking work, which is really important. I'm sure your listeners know about the Rapid Response Mechanism under Global Affairs Canada. This mechanism strengthens G7 coordination in identifying and responding to foreign threats to democracies.

(23:10):
They are sharing information, doing joint analysis and also trying to find some opportunities for coordinated responses together with other democracies. There is also the so-called Digital Citizen Initiative. It's like a multi-component, let's say, strategy, and they are

(23:32):
supporting digital literacy, counter-information and counter-disinformation projects among NGOs. I would personally love to highlight academic initiatives like the Canadian Digital Media Research Network, and I already mentioned the Media

(23:55):
Ecosystem Observatory, because the Media Ecosystem Observatory is running this network. They are supported by Canadian Heritage and do an absolutely incredible job in fortifying resilience within the Canadian information ecosystem, trying to identify some

(24:17):
vulnerabilities and doing amazing analysis that helps policymakers make decisions, and actually helps others, just ordinary citizens, understand what is happening in the informational ecosystem.
There's also the Communications Security Establishment and the CCCS, the Canadian Centre for Cyber Security. These are agencies that are actually responsible for threat

(24:38):
assessment and cyber monitoring, and they do provide the Canadian government with IT security and foreign signals intelligence and that sort of stuff.
Additionally, there are several legislative frameworks, as you're probably aware, and we'll probably talk about that a little bit later, like Bill C-26, Bill C-27, C-63; all of

(25:07):
them are working to somehow address the issue of informational integrity and security on the internet.

Takara Small (25:16):
You focus a lot on artificial intelligence, large language models and disinformation. Can you break down for us what that research looks like?

Halyna Padalko (25:26):
AI, like a lot of other technologies, is dual use. So actually, depending on the actor, it can be used to combat disinformation, using it for analysis of big data sets, or actually to scale it up by significantly reducing

(25:46):
the cost of producing and disseminating this false and misleading, manipulative content. In my latest report for CIGI, which is about AI and information manipulation, I'm analyzing Russia's interference in the US election, a very, very narrow and particular case.

(26:08):
I analyzed how AI was used by Russia to conduct disinformation campaigns in the US, and I used a dataset from the Atlantic Council's DFRLab and categorized the incidents using the FIMI typology. So this research gave me a really cool understanding of how

(26:30):
AI has been used for information manipulation. But after that, some really cool developments in that field happened; I will talk about that in a moment. So what did I find out in my report? The most common tactic, used in 65 percent of cases, was narrative manipulation.

(26:52):
So AI was used to craft and distort narratives by producing fabricated news stories, images, videos, sometimes even testimonies of fake whistleblowers, to create this false and misleading content. The most powerful

(27:14):
Actually, these examples were combining real and synthetic material, and it's like a very old-school Soviet-style approach where they were combining real and forged material to deceive people. So not something super new for us, but with these new

(27:34):
technologies, these tactics are evolving. So in some cases, actually, what the research pointed out was that audio fakes were more compelling than video. In others, even simple cheap fakes were even more efficient

(27:55):
than AI fakes. The second type of tactic was identity falsification. It was in 55% of the cases. This involved using AI to forge identities, create fake personas or mimic trusted sources to deceive audiences.

(28:16):
Maybe you've heard of this operation called Doppelganger. In this operation, Russian actors used AI to deliberately create believable clones and copies of reputable news outlets in the EU and then in the US, and recently Canada also was targeted by

(28:41):
this Doppelganger operation, so it is related to identity falsification.
Another group, as I identify them, was AI for amplification: using artificial intelligence to automate content distribution and manipulate social media

(29:01):
algorithms. This tactic relies on AI, actually AI and bots, working together in tandem to boost the reach and visibility of this manipulative content.
And the last category is strategic targeting, where influence operations were precisely tailored to specific

(29:25):
audiences, like with the languages, with even some local variations of different languages, and for specific demographics. So apparently AI enables these campaigns to be very adaptive, responsive, efficient and, what is really concerning, super cheap.

(29:45):
But after all this analysis finished, super recently, like one month ago, we all learned about LLM grooming. Have you heard of that? I haven't. So that's a really cool phenomenon: Russians are deliberately feeding content into large language models, so

(30:09):
they were creating the network called Pravda. It's a series of websites that are not actually designed for human consumption. They're not really user-friendly, they don't have a search bar, and articles don't really resemble typical journalistic content. It's just pieces of some Telegram posts and other content,

(30:34):
and they actually post, like, 2,000 of them in a day. Generally, these sites cumulatively produced 3.7 million articles in the past year.
So why do they do that? Their primary goal is to seed the training datasets of big large language models, because the chatbots are scraping the

(30:58):
internet looking for new content to kind of educate LLMs, and that's how, finally, they are getting into your output in ChatGPT. So you cannot be sure that next time, when you're trying to get some information from ChatGPT, it will not be Russian propaganda.

(31:18):
So they're kind of legitimizing their narratives through this very, I need to admit, creative way to do disinformation.

Takara Small (31:30):
So this phenomenon is called LLM grooming. Earlier this year, the Canadian government's public inquiry that studied foreign election meddling released its final report, and it said that misinformation poses, quote, an existential threat to democracy, which sounds pretty concerning. Can you explain this threat to us?

Halyna Padalko (31:58):
Yes, because it is actually very concerning. So it is existential simply because if people no longer believe in the processes, in institutions, and generally do not believe in the concept of truth itself, because literally

(32:21):
every piece of information can be forged, democracy itself becomes unsustainable. It's actually the end of the democracy, and that's what happened in Russia. Russians do not believe in any kind of truth, because they do understand that everything could be disinformation. So democracy doesn't die with a coup. It dies very slowly, by erosion and from the inside, as citizens

(32:44):
lose faith and their participation in democracy actually declines. So it becomes a very easy target for non-democratic regimes. And then you can actually find some parallels with real-time events right now with your neighbors, because that's actually what is happening, and it is existential.
So at its core, actually, democracy really requires, like, a

(33:09):
shared set of facts, so that people can debate solutions and not debate realities. So when misinformation spreads unchecked, it creates multiple and, like, incompatible realities, and people start living in those realities, and it becomes dangerous, because what is

(33:31):
true becomes contested, and it is actually eroding democracy itself.

Takara Small (33:40):
The report also said something quite interesting: that Canadian elections haven't been swayed so far. So, I mean, we must be getting something right.

Halyna Padalko (33:53):
Yes, that's a really good finding and, as I previously told you, Canada is doing very well with regard to specific tasks such as elections. But if we're talking about a general, comprehensive strategy, definitely like a long-term strategy, like building resilience, inoculating people against disinformation,

(34:14):
definitely more could be done, and it cannot be a case for complacency. Not yet swayed doesn't mean future immunity, because actors, as you just heard, are very creative in their approaches. They will be using newer and newer technologies, new tactics, and

(34:36):
new vulnerabilities will be emerging in this fight. So it's a very good finding that elections have not been swayed yet, but more could be done.

Takara Small (34:51):
Thanks to Halyna Padalko from CIGI for those insights. Now, Helen Hayes is a senior fellow at the Centre for Media, Technology and Democracy at McGill University, and she joined us to talk about the government's approach as well. Helen is an expert on government policy in the media ecosystem. So how has the federal government been responding?

(35:14):
With legislation.

Helen Hayes (35:17):
You know, it's a great question, because in the Canadian context it's kind of nuanced. We don't have any formal federal legislation dealing with the state and spread of mis- and disinformation. We had a proposed piece of legislation called Bill C-63,

(35:38):
the Online Harms Act. That was meant to regulate social media platforms and the types of content available on social media platforms, but we currently don't have any passed federal legislation that actually does that work.
This is in part because C-63 was tabled and then died on the

(35:59):
order paper after Parliament was prorogued, and so now we're kind of starting from scratch here. We don't have a regulatory framework for mis- and disinformation specifically, or even for social media and the internet generally. So we're kind of starting from ground zero with this new government that was just elected, and we'll have to see what they

(36:23):
do to try to address it, based in part on the Hogue Commission's most recent summary report about the importance of addressing mis- and disinformation in order to safeguard Canadian democracy.

Takara Small (36:38):
You mentioned a piece of legislation right there. There have been attempts to address just this very problem in the past, which include Bills C-26, C-27, and C-63, but they all stalled. Can you tell us what happened and why this keeps happening?

Helen Hayes (36:58):
Definitely. Basically, as you said, the past federal government wanted to prioritize the development of digital legislation, but so many competing interests, coupled with the rapid pace of development and deployment of technology, coupled with

(37:21):
lobbying efforts from big tech, meant that none of them succeeded, and none of them were moving fast enough to get passed. We also saw, in the case of C-63 particularly, that a lot of the discourse related to its tabling became highly partisan.

(37:43):
So we saw conservatives saying that digital policy was an attempt to regulate people's speech and an infringement of people's free speech. We saw lobbying powers from big tech saying that this type of regulation would disincentivize investment in

(38:05):
Canada from big tech. And then we saw members of Canadian civil society coming out against various aspects of C-63, because they felt like they didn't have a good place in that piece of legislation.
So, for example, we saw with C-63 not only a set of core

(38:27):
provisions, like the duty for platforms to act responsibly, the duty to protect children and the duty to make certain content inaccessible, but we also saw an attempt to amend parts of the Criminal Code and parts of the Human Rights Act to address hate speech online, and those two parts of the bill

(38:51):
became highly politicized, highly contentious, and in my view, and I think in a lot of other folks' views, that made C-63 and its core provisions, that duty to act responsibly, that duty to protect children, that duty to make certain content inaccessible, really difficult to

(39:12):
pass, even though in my view doing so would have definitely safeguarded Canadians and Canadian children online more effectively, because right now obviously there isn't any regulatory

(39:32):
mechanism to do so. But Canadian institutions are also limited in their power, right, because so many of the problems we're seeing are spread on platforms that are based outside of the country.

Takara Small (39:39):
Is it even possible for social media
companies to be regulated in aneffective way?

Helen Hayes (39:46):
Certainly, and this is a question I get a lot, actually, because what does it mean to actually regulate foreign companies, which is what a majority of our tech platforms are, right? We think about the concentration of big tech in the United States, in China with TikTok, and how can we

(40:08):
effectively regulate these companies?
Well, what's interesting about the approach that Canada has taken previously is its alignment with other places around the world. So we think about alignment with the UK, a big powerhouse, a first mover on online safety. Alignment with Australia and its eSafety Commissioner.

(40:29):
Alignment with the EU broadly, and this makes up kind of this global nexus of power that incentivizes big tech to pay attention to regulation and to run their companies based on a more aligned regulatory model that makes sense for their

(40:51):
business. It also means, from a Canadian perspective, that the government can say we need to protect the Canadian internet infrastructure, we need to protect Canadians and what they see online, and to do so we're going to incentivize big tech. Companies have to be held accountable for ensuring that

(41:34):
young people aren't subject to self-harm content that's algorithmically fed to them on their feeds. We need to make sure that we're setting up those guardrails, and I think we could do that predominantly with regulatory

(41:57):
alignment in the global context, so thinking internationally rather than siloing Canada as its own market, looking to regulate as part of this broader regulatory alignment structure. That makes sense and makes us more powerful because we're working in numbers against big tech.

Takara Small (42:13):
You mentioned places like Australia and the EU, and I'm really curious to know who you think the leaders are in this space and why.

Helen Hayes (42:25):
Definitely. I often look at the UK as a first mover and a leader in this space, particularly when it comes to protecting young people online. So you may have heard of a thing called the Age Appropriate Design Code, which originated in the UK, and it

(42:47):
calls for a systems design approach, which I think is a really effective approach to digital regulation. It basically calls on platforms to ensure that their systems are designed in a way that upholds rights and digital safety for the youngest users on their platforms.

(43:08):
So this doesn't mean stifling people's speech. It doesn't mean censoring content. It means setting up the system itself to work for the benefit of young users. I think that's really effective. This can happen through things like turning off location

(43:29):
services for young people, allowing young people to have more control over the content that they see, allowing young people not to be in contact with older users on these

(43:49):
platforms. So I think the UK, in terms of child safety, would definitely be the leader of the pack.
Australia is right up there, though, when we think about the eSafety Commissioner and the way that they are viewing child safety online, which I think is a really key component to this. They actually just announced a device ban, which obviously is

(44:12):
contentious. It does, in some respects, align with what we're seeing in Canada with device bans in schools, but we see kind of these regulatory mechanisms that are aimed at supporting children's well-being, and I think that when we're talking about, you know, broader concerns about the information ecosystem, we really need to be talking about how kids are

(44:34):
affected by the spread of misinformation, for example, or by the content that they're algorithmically fed, because they're our next generation, they're the ones making moves in the political arena, they're the ones who can influence policy, and so I think those issues are really inextricably

(44:57):
linked and we need to pay attention to that when we talk about digital regulation generally.

Takara Small (45:03):
Why do you think the UK and Australia have formulated and created a culture where these types of designs, regulations and laws have been put in place? What is it that makes them different from Canada?

Helen Hayes (45:19):
This is a great question. I mean, in the UK, a lot of the advocacy related to this type of legislation came from parents who saw their young children being drastically affected by their use of online platforms and, in particular, there were a series of incredibly tragic

(45:45):
stories with young people taking their lives as a result of the content that was fed to them on their social media feeds, and this led to a groundswell in advocacy and activism related to digital governance. So how can we actually hold big tech to account?

(46:05):
Because we know that big tech can have the ability to distort discourse or to change our views or to shape our society. How do we hold them to account in a way that protects the young people that we're losing? And their activism and advocacy was incredibly strong on this

(46:27):
file, both in Australia and in the UK, but I would venture to say the UK's was the strongest, and that was led by a series of civil society organizations, academic researchers and, like I said, parents' organizations and children's rights groups who connected these issues, the issue of internet use and social media

(46:50):
accountability, with the health and well-being of their children, and that made a huge difference to the passing of the legislation.

Takara Small (47:02):
If we look outside of government, what other institutions should be thinking about how they combat misinformation?

Helen Hayes (47:11):
Definitely schools, I think, from kindergarten to grade 12, and not just civil society media literacy

(47:35):
programs. I think it should be mandated in curriculums to have some form of media or digital literacy component to education in the country.
I think we should be looking at academic institutions to research and report on issues of mis and disinformation. We see that with the Media Ecosystem Observatory, for

(47:58):
example. We see that with DisinfoWatch. We're seeing more of a critical approach to discourse and narratives that circulate online, and that, I think, is really important for a healthy democracy. That, of course, means that academic researchers need access to data, which is a huge hurdle in Canada, and we, I think,

(48:22):
need to work on those transparency provisions in order to allow researchers to do that important work. I think they're critical.
And then obviously, advocacy organizations, and we see that with parents' groups in Canada particularly. We need to have kind of a whole-of-Canada approach to this,

(48:44):
where we view digital literacy, digital policy and tech policy as an important, integral part of all the work that we do, because really it affects the state of our democracy, it shapes our society, and so I think we can look at all sectors and all

(49:08):
places, not just with young people but also community organizations that can help create a more digitally literate and engaged citizenry, so that they know, when they're encountering information online, how to engage with it, how to understand whether it's factual or not, and what organizations

(49:33):
to trust, what to be more skeptical of and how to form your own opinion based on the information that you take in from online spaces.

Takara Small (49:43):
One of the Canadian government's key strategies, and I don't think it's well known, is the Digital Citizen Initiative. Can you break down what that is and how it works?

Helen Hayes (49:53):
For sure. So the DCI is an initiative of the government that is basically founded on these ideas that Canadians need to and should have access to diverse but also reliable sources of news, and that's because, obviously, Canadians

(50:16):
should be able to form their own opinions, they should hold governments and individuals accountable, and they need to know how to do so through reliable sources of information and news. So the DCI basically supports these ideas about citizen

(50:37):
resilience against disinfo or misinfo by empowering organizations, civil society organizations and academic organizations, to either do research or produce programming that can support Canadians in achieving a healthy relationship

(50:59):
with their online information ecosystem and also, in the long run, kind of protect democracy against ills related to mis and disinfo, foreign interference and other dubious content online.

Takara Small (51:15):
Thank you to Helen Hayes from McGill University in
Montreal.
Next week is the final episode of our series and we're going to look at the path forward from here.

Sue Gardner (51:26):
Over the years, we've had health and safety training. Well, now we need digital literacy training, and that starts with the youngest people.

Takara Small (51:35):
Please join us again for that. You can also reach me online at Takara Small on Bluesky and Instagram, or you can email us at podcast at siraca. Thanks for listening and we'll see you again next week.