
June 18, 2025 · 36 mins

Feeling overwhelmed by the flood of misinformation and disinformation online? You're not alone—and more importantly, you're not powerless. In our season three finale, we discuss best practices for fighting back against misinformation online and the best approaches to navigating our complex information environment.

Joining the conversation are returning guests Canadian author and law professor Timothy Caulfield and Matthew Johnson, Director of Education at MediaSmarts. We close the season with insights from CIRA President and CEO Byron Holland, who shares practical tips that anyone can use when browsing the internet.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Takara Small (00:03):
If you've been listening all the way through this series, you're probably feeling a bit anxious by now, and who could blame you? Mis- and disinformation is one of the biggest problems facing the modern world, and it's hard to see a way out of the current crisis.

Tim Caulfield (00:20):
We need good science, and we're seeing it being eroded in the United States in a really horrendous way. That's going to take maybe generations to correct.

Takara Small (00:29):
It's a stressful time, but there's no need to abandon all hope. Academics, innovators and governments all over the world are fighting back.

Byron Holland (00:39):
We have better tools. We have smarter strategies. There are strong communities in this space that are making a big difference.

Takara Small (00:49):
This week on What's Up With the Internet, we're going to talk about our way out of this mess, the path forward from here. I'm your host, Takara Small, and the podcast is brought to you by CIRA, the Canadian Internet Registration Authority, the nonprofit building a trusted internet for Canadians. So to start us off, we go back to our old friend from the

(01:12):
University of Alberta, Professor Tim Caulfield. As you've heard in our earlier episodes, Tim has been on the front lines of the battle against misinformation for many years and recently published a book called The Certainty Illusion. We asked him to hold our hand and tell us how this gets better.

Tim Caulfield (01:34):
Oh gosh, you know, I used to be more optimistic. Okay, let me start with the darkness first. Okay, why? I do think, even since I finished the book, the situation has gotten, you know, obviously so much worse. Right, and in the United States we're seeing, you know,

(01:54):
research institutions, and keep in mind, you know, good science, good, trustworthy science, you know, that's absolutely, fundamentally important to, you know, pulling the universe out of the rabbit hole. Like, we need good science, and we're seeing it being eroded in the United States in a really horrendous way that's going to take maybe generations to correct.

(02:16):
So, you know, that's bad news. The other bit of bad news we've touched on a couple times is the degree to which this has all become political. This has all become about political identity, and research tells us that once it becomes about politics, it becomes much more difficult to change people's minds. You know, whether you're left or right, it just, once it becomes about politics, about your tribe, you

(02:36):
know, about your community, it can become much more difficult to change people's minds, and there have been really interesting studies to back that up. You know, one that came out, I'm going to say, just like two weeks ago, talked about health misinformation. And once it becomes about politics, you know, the rhetoric around that topic becomes more sticky, in other words, hard to change people's minds, and it becomes nastier and nastier, and

(02:57):
I've certainly seen that play out. So that's the bad news. That's a little bit of the bad news. The good news is, despite all of those hurdles, we are learning more and more about this phenomenon. There's so much great research going on, big, well-done studies. I've been doing this for a long time, and it seemed like forever

(03:17):
the studies were small and speculative. It's hard to study this well. Now we're getting all this research that's coming at it using different methods, and I think that's fantastic. You know, this is one of those areas where it's going to take a lot of research, I think, because it's hard to study it well. There are so many variables, and that's starting to happen. We're starting to see more and more voices, diverse voices.

(03:38):
You know, young, creative brains are getting involved in this fight, and that's such good news. We have an initiative called #ScienceUpFirst that I started with Senator Stan Kutcher, and now I'm just an advisor on it. It's being run by these, as I said, young, creative, diverse minds from across Canada, and they're really making a

(03:59):
difference. They're creating positive content that is completely independent, that speaks to people's concerns in a creative way, right, in a creative, positive way, and that's just one initiative. There are so many great voices out there, so I think that's something that we can be really excited about. I can't believe the difference between even just

(04:20):
five or six years ago and now in the number of great, passionate, creative science communicators out there, so that's really, really good news. And the other good news is, at every level of government, there is a growing recognition of the importance of this topic. Yes, we've seen what's happening in the United States, but if you look at the UN, if you look at the EU, if you look

(04:43):
at the World Health Organization, this has become a paramount topic that is just completely viewed as critically important, and that's good news too. It wasn't always like that. Fighting misinformation was kind of a niche issue. Not anymore. It's center stage.

Takara Small (05:06):
And I think that's good news too. What role should regulation or platform accountability play in addressing the spread of misinformation?

Tim Caulfield (05:12):
Well, I'm going to start my answer with this debunk, because I think this drives me nuts. Those who are trying to stop misinformation research or

(05:35):
trying to delegitimize misinformation research really try to portray it as a force of censorship, that those who do misinformation research or who are concerned about misinformation are against freedom of expression. And, on the contrary, I'm like a strong, strong supporter of the marketplace of ideas, of freedom of expression, and I think it's really important to highlight that most of the tools

(05:56):
that we use in this space to counter misinformation, they're utilized in the marketplace of ideas. Critical thinking skills, media literacy, debunks, prebunks, even nudges, I think, are part of the marketplace of ideas. It's not about censorship, it's about getting good information out there in a way that can make a difference.

(06:19):
But, yes, regulation, I think, is relevant, and even when you're talking about regulation, you're not necessarily talking about, you know, deplatforming individuals or silencing individuals. We could be talking about, and we should be talking about, transparency, right, and that brings us to these platforms. We need more transparency about the regulations, about the

(06:41):
algorithms. We need more transparency about what incentives are sort of baked into those algorithms, and I think that talking about transparency is a more politically palatable idea. In fact, there's been some research done that found that there's strong support for the regulation of AI,

(07:05):
especially in the context of elections, and it's bipartisan support. So if you frame it as regulating AI and deepfakes, instead of regulating misinformation, which is often viewed as censorship and cancel culture, you get support, because people recognize that AI and deepfakes

(07:26):
can do real harm to our democracy, especially when it's framed in a way where you're talking about transparency, right, just trying to make sure that we are all aware of what's going on. So I think that's, you know, the kind of regulatory step that we should be taking, and I think, and maybe I'm being naive, I think it's a step that we can

(07:48):
make.

Takara Small (07:53):
Thanks again to Professor Tim Caulfield for all his insights. Now, if you're listening to this, the chances are you have reasonable digital literacy skills. You've probably spent quite a bit of time online, and maybe you've acquired some of the intuition that can spot misinformation quickly. But you've also probably had the same experience as the

(08:16):
comedian Ronny Chieng, trying to explain all this to someone who is more digitally naive.

Ronny Chieng (08:23):
How do you know that it's not real? Because the font, the font is off, and the resolution of the image is blurry, and that lighting state doesn't even make sense for that image, and that's not standard dimensions for a news article. Okay, I can't give you this knowledge. It's like Malcolm Gladwell's Blink. I've just seen so much shit on the internet, my brain instantly

(08:43):
filters it. You'll never have this skill set, so just stay off the internet. This world is not for you anymore. Stop making decisions.

Takara Small (08:51):
Unfortunately, telling people to just stay off the internet isn't a realistic option, so we need better education. To talk about that, we're going back to Matthew Johnson from MediaSmarts. Matthew is the director of education for the charity, and we spoke to him about a national digital media literacy strategy.

(09:12):
Other countries are doing it, so what's needed to make it possible in Canada?

Matthew Johnson (09:18):
Well, it takes a commitment, ideally a commitment from both the federal government and the provinces and territories, but at the very least at the federal level, and it needs to be a commitment to promoting the full spectrum of digital media

(09:39):
literacy, not just access, although access is an essential precondition, and there certainly are still parts of the country, there are still communities, where access is inadequate. But it needs to go beyond that. It needs to go to covering all of the aspects, the core

(09:59):
competencies that we've identified of digital media literacy, and one of those is access. Not just having access, but knowing how to use it. So we know, for instance, there are a lot of supports that are available for low-income people or other people who may have difficulty accessing online content, but those don't help if

(10:25):
you don't know how to use them. Similarly, knowing about assistive technology that might be used by people with disabilities, by seniors, by people for whom English is a second or third or whatever language. These are some of the things that we consider to be access skills.
But beyond that, we need to be teaching those basic skills,

(10:49):
what we call the use skills: how you navigate, how you use digital tools, how you communicate. The understand skills: how we critically engage, that idea of critical thinking and

(11:13):
intellectual humility, and all of the more concrete skills we've been talking about, like reverse image search, finding and verifying sources, things like that. And the last are what we call the engage skills, which are about being an engaged member of your online communities, helping to shape the norms and values of them in a positive way,

(11:36):
but also knowing how to use online tools and media tools more generally to be an engaged citizen in offline politics. So really, it is vital that any strategy is going to cover that

(11:57):
whole spectrum, and it has to also cover the entire life course, because as much as K to 12 is the heart of what we do, we also know that that's not enough, that today's adult generations did not learn these skills.

(12:20):
Even if, as I did, you did media literacy in school, that won't have prepared you for the information ecosystem we're in now, and, similarly, the information ecosystem that today's students will find themselves in in 10 or 20 or 30 years will likely be very different. And so it has to have

(12:44):
a commitment to lifelong learning, and it has to be equitable. It has to be reaching all of the communities, all of the people in Canada, and recognizing their diverse needs.

Takara Small (13:00):
What are these other countries doing, the ones you've mentioned? What are they doing so well? What has caught your eye?

Matthew Johnson (13:09):
Probably the best known of them is Finland. Now, Finland is a bit of a special case because, of course, they have been subject to direct disinformation attacks for decades, but that's really helped them see digital media literacy as everyone's problem.

(13:31):
So, to begin with, they do have it integrated in the K-12 system, and it's not siloed, it's not a single subject. It is treated as a subject, but it's also integrated across the curriculum, and they don't focus just on what might be considered digital literacy.

(13:54):
They recognize that digital and media literacy are part of the same discipline, that they reinforce one another. So they didn't respond just to online disinformation; instead, their program also addresses things like advertising, other ways that we need to engage with media. But

(14:20):
they also see it as a whole-of-society problem. It's not something that is just addressed in schools. It is something that is seen as every citizen's responsibility. One of the things that other countries are doing, they've done this in the UK, they've done it in Australia, is just getting

(14:43):
benchmarks of current digital media literacy knowledge and practice, and that's something that we don't have in Canada. We don't really have any sense of the average digital media literacy skills. And so, going back to the very beginning of our conversation, when you asked me to grade us as a country,

(15:04):
I'm really just guessing, because we have so little data, and it tends to be not comprehensive. It may focus on a very narrow component, and you have to piece together data from a lot of different places. We don't even really know what is being taught in schools.

(15:24):
We know what's in the curriculum, but as a former teacher, I can tell you there's a big gap between what's in the curriculum and what actually happens in the classroom. So, at a very minimum, we need to be doing what they're doing in places like the UK and Australia and find out what our current baseline levels are because, of course, for any strategy to be effective, we'll have to have those baselines so we can

(15:49):
measure its success or lack of success and make changes where we need to.

Takara Small (15:55):
And that was Matthew Johnson, the director of education at MediaSmarts in Ottawa. Now, research by the team here at CIRA has shown that one of the big fears Canadians hold is about artificial intelligence and mis- and disinformation. To talk more about this and discuss some solutions moving

(16:16):
forward, we caught up with Byron Holland. Byron is the president and CEO here at CIRA, and he began by talking us through some of the potential dangers of AI.

Byron Holland (16:27):
Yeah, according to our research, literally just over half, 51%, of Canadians already see deepfakes in particular as a threat to elections, and that's just one specific example. And one of the real challenges here is, as the AI tools become easier to access, they can become literally weaponized by

(16:48):
individuals, organizations or, you know, in the disinformation space, hostile states, you know, literally Canada's adversaries, and they do it to undermine trust, polarize the debate, confuse the public and maybe even impact things like our own elections.

Takara Small (17:09):
I mean, I have to follow up then and ask: are there any opportunities for solutions that AI presents?

Byron Holland (17:16):
Absolutely. I mean, don't get me wrong, AI is an amazing innovation. It has so much potential to do great things. But, like any tool created by humans, we can use that tool for good or bad. You know, on the bad side, we're already seeing it in elections in particular, as I mentioned, manipulated images or

(17:37):
fake news articles. But the upside is we can also use AI as a potential solution for this. So sure, AI is part of the problem, but it's also part of the solution, and we're already seeing some great AI-based tools for fact-checking, content moderation, even starting

(17:59):
with real-time misinformation detection. And if you can get an AI tool to help flag false information literally in real time, faster than a human could, that's going to be a huge asset to us in this, you know, challenging information ecosystem of mis- and disinformation.

Takara Small (18:22):
Is there a roadmap that exists that we can learn from when it comes to AI and misinformation?

Byron Holland (18:29):
I certainly think there is. You know, the world and humanity have gone through massive innovation phases over the course of history, and we don't have to look back that far. We can certainly look back to the beginning of the industrial age, when all kinds of new and revolutionary technologies were

(18:50):
happening, whether it was the steam engine in particular, or other innovations like electricity, where those innovations radically changed society as they knew it then.

(19:11):
And for the good, you know, steam engines were great, as was electricity, but of course, in the manufacturing element, you know, you had challenges like, in that era, child labor, pollution. Then, along with innovation has to follow some rules and regulation. If we bring it a little forward, even to our, you know, more recent past, you think of the automobile, an amazing innovation.

(19:32):
But there was a time when we didn't have seatbelts or airbags or ABS brakes or even good roads. So that incredible innovation of the automobile happened, the combustion engine, and then we needed to follow it on with some rules of the road so that the cars could be safer, so they could be much more

(19:56):
environmentally friendly than they were originally. And then even think about it from an externality point of view. You know, unfortunately in Canada we've recently seen vehicles used literally as weapons. So the broader society has to think about how do we protect

(20:16):
our ecosystem from that tool in the wrong hands. And, you know, that's kind of the landscape that we're in. But I think there are some real key ways that, you know, we can learn from those issues and we can make positive change. You know, we can do things on the technology side, where there are lots of technologies we can implement. Think like two-factor authentication.

(20:38):
That's a small step but with abig impact on security.
Now we need similar simple yetpowerful interventions for
people to protect themselves interms of how they consume and
share information online.
Ai tools can do that.
We do need some smart regulation.

(20:59):
You know I'm a big believer inthe free and open internet, but
we do need some rules of theroad, and sometimes that means a
higher level regulation.
But what I would say isprobably the most important
thing is education, and we needto start that.
You know, in grade school orbefore you know.

(21:21):
Think of it like we.
Over the years we've had healthand safety training.
Well, now we need digitalliteracy training, and that
starts with the youngest peoplein our society.
But you know, I think we've allseen some older folks do some
dumb things along the way too.
They need it as well, and it'sthat digital literacy, combined

(21:42):
with good critical thinkingskills that we can teach to
young people at the earliestages which I think will make the
biggest difference over thelong haul.

Takara Small (21:58):
When it comes to addressing the issue of misinformation and disinformation, what approaches does CIRA advocate for and, more importantly, why?

Byron Holland (22:04):
Well, at CIRA we operate a lot of the core internet infrastructure in Canada, and we do a lot around the world as well. So we focus on strengthening the foundation of the internet and making it more safe and secure. We as an organization don't moderate content, we're not in that space at all, but we operate some of the cleanest internet

(22:28):
infrastructure in the world. You don't have to take my word for that; it's independently verified. But we do other stuff too, like we sponsor original research, like Canada's Internet Factbook, where we provide Canadians, both individuals and policymakers, with reliable,

(22:49):
verifiable, data-driven insights to help them better understand and navigate, in particular, the mis- and disinformation ecosystem today. We also do things like offer CIRA's Canadian Shield, which is a free, privacy-focused DNS service that helps block access

(23:10):
to malicious domains to, you know, prevent things like phishing, malware, ransomware, other cyber threats. So that's a very concrete thing that we offer free to any Canadian. And we certainly also share lots of information from, you know, practitioners in the space of keeping the internet

(23:31):
clean, making it run well and keeping it safe. So there's a lot of stuff, even just on our website, that we're sharing through social media and other channels to help individuals and policymakers make good decisions when they're online.

Takara Small (23:48):
There are so many approaches to misinformation and disinformation online. I'm curious what your thoughts are on the idea of inoculation and pre-bunking.

Byron Holland (23:58):
Yeah, that's certainly an interesting topic. For sure, it's a smart, science-based approach, and it's one of the tools that I think works. It's really about giving people a kind of mental vaccine, if you will, to prevent, you know, onboarding falsehoods. And essentially what it does is show the individual

(24:21):
how misinformation is actually made and spread before they encounter it in the wild, and the idea is, of course, that if they know and understand how it's done, it's going to make them less susceptible to it. So I think this aligns with how we think about building digital resilience: fundamentally, prepare people before the damage

(24:44):
is done. So it's definitely one of the important tools in the toolbox of a resilient digital society.

Takara Small (24:53):
And what about our listeners? We have a lot of people who are tuning in and probably wondering: what can the average person do?

Byron Holland (25:01):
Yeah, sometimes it can feel a little overwhelming, and as somebody in the middle of the internet ecosystem, you know, sometimes I feel that way too. But you know what? The good news is there is actually a lot we can do. Certainly, you know, one thing I've already mentioned is digital literacy skills for kids. Absolutely critical.

(25:23):
You know, in school they're getting reading, writing and arithmetic. We need to add a fourth to that, and that is digital literacy. And if we do that right from the earliest days of a child's education, we can make a huge dent in this problem. And we've already seen examples of that, particularly in the

(25:45):
Baltic states and the Nordic states that share borders with Russia. You can imagine the kind of disinformation ecosystem happening up there, and they've shown great results. Teach the kids young; they learn it for life, right? So in the K-12 space, regularly, and it's not a one-and-done.

(26:06):
Just like math, you know, it's going to be every year there's an update, you learn a little more, get a little more sophisticated. And this isn't going to be a nice-to-have, right? This is going to be a core skill for participating in modern life, you know. And as technology evolves, our education system must evolve with it, right? Ensuring students are not just consuming

(26:29):
information online, but really questioning it and understanding it, and then dismissing it when it's obviously fake or misleading. So it's absolutely critical for it to be taught in schools.

Takara Small:
Do you think that Canada needs a national strategy to address these very issues?

Byron Holland (26:50):
I absolutely do.
You know, when a new innovation happens, especially a general technology innovation that's going to get used in multiple different ways, like AI will be and already is, quite frankly, let alone social media or even the internet itself, the underlying technology, it's been a pretty open,

(27:13):
regulation-free environment, and that was really helpful with getting amazing innovations online. But we're very much seeing some of the downsides of it, and that's where, you know, like I said in the automotive example, at some point you need to put some rules of the road in, and that's where I believe we're at right now, because the harms

(27:35):
associated with this amazing technology are just too great to be left completely unchecked. So definitely we need some regulations. But I also believe that individual Canadians can do a few things. Take the basic attention business model.

(27:55):
So, think, social media is about amplification and outrage, and that's what many of the social media platforms are really good at. They grab our attention, they make us share it, and that's where the business model lies, in monetizing all of that activity. So one thing individual Canadians can definitely do is

(28:19):
just take a breath before you share something. Pause before sharing. If the content that you're seeing is triggering a very strong emotional reaction, take a moment. Check the source. Who created it? Who benefits from it? There are, you know, some easy red flags, like dramatic headlines, otherwise known as clickbait.

(28:41):
If it feels like clickbait, it probably is, which means that's a flag for false or misleading information. And one of the easiest things to do: just cross-check the facts with credible news sources. You know, one thing I would say is social media is not news. Go to a real source.

(29:04):
You know, most people have never looked at the editorial policy of a major newspaper or broadcast outlet. But things like verified sources, multiple independent sources, fact-checking, those are all part of what a lot of the main traditional media have as part of what they need to do

(29:30):
in order to publish something. So just because some guy or girl, you know, has a YouTube channel and shares their opinion, that doesn't make it real. Go to the source, see if it's valid or not, and I think that's, you know, a pretty straightforward thing that the average Canadian can do.

Takara Small (29:48):
When talking with Canadians.
You know some of the confusionlies in how we navigate the line
that exists between free speechand misinformation.
It's so incredibly tricky.
What are your thoughts on that?

Byron Holland (30:03):
Yeah, that's a really tough one. You know, if it was easy, this problem would be solved. It's not easy because, of course, we want free speech and, as an open democracy, that, you know, that is absolutely critical. However, mis- and disinformation is also now very real.

(30:24):
Now, it always has been, let's face it. Disinformation, you know, we just used to call it propaganda in an analog world. But in an analog world you didn't have the amplification and the pace with which false information can be spread, and that's what really makes it different now, and

(30:45):
that's why some of the rules that we have had around free speech aren't necessarily strong enough to stand up to the speed and amplification of the modern era. So we have to be very careful about it, for sure. But I think there are some obvious places where regulation

(31:07):
could help, particularly around transparency and accountability with some of the major information distribution platforms, because in today's polarized media landscape, you know, we see misinformation, disinformation, often weaponized to shape narratives. And, as I said a moment ago,

(31:30):
that's part of why media literacy is so important. Like, know who's giving you your information, and, let's face it, everybody makes mistakes, but by and large, traditional media are bound by certain standards where they can't do it intentionally. So that's a good place to start for relatively clean information. But regulation, understanding the business model of the major

(31:56):
social media platforms, will be very helpful, because we're at a point right now where most of those platforms are externalizing the costs of their business. So they get all the revenue, but the harms that are happening on those platforms, the cost of those harms, is being externalized to people,

(32:18):
individuals and society as a whole. So I think there are some regulatory options there that could be very helpful. You know, the interesting thing too is that there's a clear disconnect. 75% of Canadians, roughly, get their news online, yet half of those say they don't trust any of these online platforms to

(32:40):
provide accurate information. So, you know, I think that's an interesting dissonance. Many Canadians are ready to have the information landscape improved and would probably be open to some regulation. And certainly we all have an active role to play in

(33:01):
sharpening our media literacy skills, which, as I said, starts with the youngest among us but continues on.

Takara Small (33:10):
Do you feel that if the government was to get involved and legislate strongly around this issue, it would then become heavily politicized and divisive? I just wonder: is it better if it's tackled through non-political organizations rather than the government?

Byron Holland (33:28):
That's a valid concern. Of course, over-regulation can backfire and fuel distrust, and, you know, people can always, and sometimes legitimately, be skeptical of government interventions in the private sector and in industry. And that's why it's, you know, absolutely critical for governments to have a light touch and strike the

(33:49):
right balance. Easy to say, hard to do, but governments definitely have a role to play, especially around the accountability and transparency part. But I think often some of the most effective efforts are going to come from trusted, independent, ideally nonpartisan organizations, particularly when it comes to education and

(34:13):
digital literacy. This really is a technology and a time that impacts all of society, so it needs to be a whole-of-society effort as well.

Takara Small (34:27):
I'm curious: are you hopeful about the chances of tackling this issue?

Byron Holland (34:34):
I am. I work in the heart of the internet business every day, and I do it because I truly believe in the power of innovation and the internet and all the good it can do, and it has done incredible good since its kind of public inception. So I am positive,

(34:55):
but I also recognize that it is a tool. Humans created it. Humans can use tools poorly, and some of us do, unfortunately. It is a complex challenge, but we are not powerless. You know, more people are aware of some of these challenges, some of these nuanced challenges, than ever before.

(35:16):
We have better tools. We have smarter strategies. There are strong communities in this space that are making a big difference. And I think, you know, if we invest in the right education throughout young people's lives, build trust

(35:36):
through some smart and light-touch regulation, and stay ahead of the technology curve from an awareness and critical thinking perspective, we can absolutely make progress. The internet and the innovative tools that ride on it are a powerful force for good. We just need to make sure that it stays that way.

Takara Small (36:00):
And that was Byron Holland from the team here at CIRA, with some great ideas about the way forward from here. And that brings season three to a close. Thank you so much for listening and staying with us for the last six episodes. This series has been written and produced by Kevin McAnenna, and thanks also to Spencer Callahan, Shanila Saeed and

(36:22):
Glenna Tapper from CIRA. If you still want to reach out, you can email us at podcast@cira.ca and we'll get back to you. We'll see you again next time. Bye.