August 11, 2025 18 mins

Video-sharing website YouTube began 20 years ago with a 19-second video of one of its founders at the San Diego Zoo.

Now, hundreds of hours of content are uploaded to the platform every single minute.

There has been a huge shift globally to rein in the social media giants: the UK has introduced age verification requirements, with Australia to follow suit by the end of the year. Other countries like India, Germany, Spain, Italy and Norway are also investigating exactly how to better protect kids online.

So, should algorithms like YouTube's be regulated? And how would we even do it?

Today on The Front Page, Victoria University of Wellington Associate Professor Dr Peter Thompson is with us to discuss what New Zealand should do – and whether we're already fighting a losing battle against harmful online content.

Follow The Front Page on iHeartRadio, Apple Podcasts, Spotify or wherever you get your podcasts.

You can read more about this and other stories in the New Zealand Herald, online at nzherald.co.nz, or tune in to news bulletins across the NZME network.

Host: Chelsea Daniels
Editor/Producer: Richard Martin
Producer: Jane Yee

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Kia ora.

Speaker 2 (00:05):
I'm Chelsea Daniels and this is the Front Page, a
daily podcast presented by the New Zealand Herald. Video sharing
website YouTube began twenty years ago with a nineteen second
video of one of its founders at the San Diego Zoo. Now,

(00:26):
hundreds of hours of content are uploaded to the platform
every single minute. There has been a huge shift globally
to rein in the social media giants. The UK has
introduced age verification requirements, with Australia to follow suit
by the end of the year. Other countries like India, Germany, Spain,

(00:47):
Italy and Norway are also investigating exactly how to better
protect kids online. So should algorithms like YouTube's be regulated
and how would we even do it?

Speaker 2 (01:01):
Today on the Front Page, Victoria University of Wellington Associate Professor
Doctor Peter Thompson is with us to discuss what New
Zealand should do and whether we're already fighting a losing
battle against harmful online content. First off, Peter, let's start
with a relatively easy one, I suppose. Can you explain

(01:25):
what an algorithm is?

Speaker 4 (01:27):
Okay? Well, an algorithm is a computer program applied to
the servers of online content in this context, and so
it basically determines the priority in which you will actually
come across and discover content on the platform you're using.
So on social media, they access our data. They look

(01:48):
at what we've liked, what we've looked for, what
we've searched for, what we've engaged with, and that informs
the algorithm what kinds of content we're likely to want
in the future, and so it prioritizes things that
the algorithm tells the system is the sort of content
that we're going to spend our time with, because that's
how they make their money.
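
To make Thompson's description concrete, here is a minimal Python sketch of engagement-based ranking. The signals, weights and scoring formula are illustrative assumptions only; YouTube's actual recommendation system is proprietary and far more complex.

from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    topic: str

# Hypothetical engagement signals collected from one user's history.
# Real platforms use far richer (and proprietary) features.
user_history = {
    "liked_topics": {"gel blasters": 3, "gaming": 1},
    "watch_seconds_by_topic": {"gel blasters": 1200, "cooking": 90},
    "searches": {"gel blasters review"},
}

def predicted_engagement(video: Video, history: dict) -> float:
    # Score a candidate video by how likely this user is to keep watching:
    # a weighted sum of past likes, watch time and search matches for its topic.
    likes = history["liked_topics"].get(video.topic, 0)
    watch = history["watch_seconds_by_topic"].get(video.topic, 0)
    searched = any(video.topic in query for query in history["searches"])
    return 2.0 * likes + 0.01 * watch + (1.5 if searched else 0.0)

def rank_feed(candidates: list[Video], history: dict) -> list[Video]:
    # Highest predicted engagement first: what keeps eyeballs on the screen.
    return sorted(candidates, key=lambda v: predicted_engagement(v, history), reverse=True)

feed = rank_feed(
    [Video("a1", "cooking"), Video("b2", "gel blasters"), Video("c3", "gaming")],
    user_history,
)
print([v.video_id for v in feed])  # the gel blaster content floats to the top

Everything a user likes, watches or searches pushes similar content further up the ranking, which is the feedback loop discussed next.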

Speaker 2 (02:09):
How good do you think the algorithm is, like does
it really know what we want? Or does it know
what it wants us to want? If that makes sense.

Speaker 4 (02:20):
They're getting more sophisticated, that's for sure, certainly with AI,
but they don't really know who we are. I mean,
all they have is a data set based on our
online behavior that they can access. And that's why data
is so incredibly important, because of course it not only informs
the sort of content that we find on online platforms,

(02:42):
it also informs advertising and the way that advertisers might
pitch their particular products to us.

Speaker 2 (02:49):
So you know.

Speaker 4 (02:50):
Is it very effective? Well, to some degree, yes. I
think most of us have had an experience where, for example,
we've gone online, searched for something, and the next thing
we know, we're on social media and the thing that
we were just looking for is being advertised to us,
and that's a little bit disconcerting. On the other hand,
Facebook is still convinced I have a dog, which I don't.

(03:13):
It keeps advertising, and maybe I'll watch dog videos, but
it clearly doesn't really know who I am or what I want.
So it's still hit and miss to a large degree.

Speaker 2 (03:22):
When we talk about YouTube, for example, there have been
some controversies about how it can take kids down
these types of rabbit holes.

(03:31):
Hey, I read some comments from parents talking about it
in The Guardian. One said he actively fights against the algorithm.
He watches one video of a gel blaster, for instance,
then gets fed about one hundred videos of Americans firing guns.
Is there a fear that this kind of thing is
happening to our kids.

Speaker 4 (03:50):
I think there's a legitimate concern there. There have been
a number of studies that have shown the tendency of algorithms,
and particularly YouTube algorithms, to take people further and further
down what we might call an extremist pathway. There was
a study by the Federal University of Minas Gerais in Brazil.
It was a big international study. They looked at hundreds

(04:11):
of thousands of accounts and they found clear evidence of
a shift of viewers accessing, you know, what they might
call the relatively light but right wing leaning kinds of content,
So maybe Fox News and the Daily Mail. Before very
long you're into InfoWars with Alex Jones and

(04:32):
then going into some even more dangerous and toxic content.
Now it doesn't happen all the time, but the tendency
is there, and that's been measured. There was another study
by Bellingcat that found similar things. They looked at a
number of self identified alt right users and the way
that they'd become red pilled, that they'd woken up to

(04:53):
their right wing reality, and a number of them explicitly
cited YouTube as a key motivator for their journey to
the far right. Now, of course, if you're on the
far right, you might think this is all well and good,
but, you know, it possibly works in the other
direction if you're looking for extreme left or extreme religious

(05:14):
types of content. But the key thing is that the
algorithm is there. You know, it's proprietary, it's owned by YouTube,
it's there for YouTube's use, and it's there to keep
our eyeballs on the screen because that's how they make
their money. They want us online because that's how we
get exposed to advertising and marketing opportunities. So the algorithm

(05:34):
from a commercial point of view, isn't political. It's economic,
and so it's there to try and keep our eyeballs
focused on whatever we seem to be wanting to see
more of. And does that have dangers? I would say
in some cases yes.

Speaker 1 (05:52):
Remember the algorithm is a black box. No one knows
what it's doing. All YouTube can do is change the
feedback it's getting, change the signs that say this is
good or this is bad. If YouTube wants a human
to watch and categorize every video being uploaded as safe
or unsafe in real time, they would need about one
hundred thousand employees working shifts around the clock. Plus that

(06:14):
would expose them to legal issues in most of the
countries where YouTube has an office. If you let an
algorithm do the filtering and then manually step in when
you get a complaint, you're legally fine. But if you
approve everything with a human in the loop, you are
a publisher and you're opening yourself up to some very
expensive lawsuits.
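
The staffing figure in that clip can be sanity-checked with a rough back-of-envelope calculation. The upload rate and shift assumptions below are illustrative guesses, not official YouTube numbers.

# Rough sanity check of the "about one hundred thousand employees" claim.
hours_uploaded_per_minute = 500        # "hundreds of hours ... every single minute"
review_hours_per_shift = 8             # one reviewer watches 8 hours of video per day
working_days_per_year = 230            # allowing for weekends, leave and sickness

hours_uploaded_per_day = hours_uploaded_per_minute * 60 * 24            # 720,000 hours/day
reviewers_working_today = hours_uploaded_per_day / review_hours_per_shift   # 90,000
reviewers_on_payroll = reviewers_working_today * 365 / working_days_per_year

print(round(reviewers_on_payroll))     # roughly 140,000 - the same order of magnitude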

Speaker 2 (06:33):
So we're talking about this now, hey, because worldwide there
have been moves to limit Internet access to under sixteen
year olds, things like social media in Australia, for example.
Is YouTube a social media site? It's not exactly what
one thinks of first, straight off the bat.

Speaker 4 (06:52):
It's a very curious environment because the key thing with
social media is that they're primarily platforms for user-generated content.
So third parties provide the content on these platforms, and
that's very different from say a linear television broadcaster, where
you know somebody makes the programs or licenses the programs,

(07:13):
aggregates them and puts them out as a channel where
there's always editorial oversight, or indeed from a subscription video
on demand channel, where you know the library of content
or the catalogue of content has been brought in and
provided on the server and they know exactly what's there.
With online social media platforms, you don't know what that
content is. Now, YouTube's curious because it's not the sort

(07:38):
of social media site where people just go to share
the family photographs, you know, or exchange political views.
It's very, very video-oriented, and it always has been,
so music videos are a big driver of content. Increasingly,
television channels are setting up online, so it's a bit
of a one stop shop. You can find all kinds

(07:59):
of content there. YouTube doesn't have complete oversight, you know,
over the material that's been put out there. I mean, regrettably,
it was quite a significant host of the Christchurch
terrorist video. Even months after, there were versions of
that popping up on YouTube. Now, its algorithms do, generally speaking,
pick that up and get rid of it. And there's

(08:20):
other forms of content that get policed. But there are so,
so many people putting video material up on YouTube that it's
impossible to police everything. And therefore there is this potential,
you know, for children to access content that isn't appropriate
for them. Now, are there ways to police that, well,

(08:40):
I think there are, But does it require a total ban?
I think that would be perhaps a step too far
because there's also very useful educational content on YouTube, and
some of that material, you know, if children really wanted
to discover it, well, they're pretty smart and tech savvy,
you know, they can probably find ways around it.

Speaker 2 (09:09):
What about YouTube Kids? So I've got friends with children.
They watch a lot of kind of sensory videos on
YouTube Kids. But those accounts, I understand, are for kids
aged up to thirteen, right, and that's in accordance with
the Children's Online Privacy Protection Act. But doesn't that skew

(09:30):
towards a much younger audience? Should we be looking at,
say, a YouTube Teens or something?

Speaker 4 (09:37):
I think that would be a great idea. Now, does
it mean that, you know, your fourteen-year-old will be
accessing sixteen-year-old content? Well, very likely. But I
think labeling is an underrated exercise in media
regulation, because we know from studies by the BSA and
the Classification Office that people really do use those labels.

(10:02):
So if something is labeled R eighteen, well, okay, we
know that maybe sixteen, maybe even fourteen year olds will
probably try and access it. But if you have a
sense of responsibility, and most people do, you're not going
to let a ten year old or a five year
old access that kind of content. So if you have
built-in systems, you know, where someone can only

(10:23):
access the adult content if they have an account that
signals they're an adult, well that will cut out a
very very large range of potentially harmful exposures. It's not perfect.
If you're a terrorist and you're trying to live stream
your act of terrorism, you're very unlikely to give advance
notice that you've got an R eighteen video coming up,

(10:43):
or even an objectionable video coming up. So of course
it's not perfect. But would we all rest a little
more peacefully knowing that our children are more likely to
be playing in a safe sandbox, you know, with those protections.
I think yes, and it's doable.
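
A label-plus-account scheme like the one Thompson sketches reduces to a very small check. This is an illustrative stand-in only: the label names echo the NZ-style R13/R16/R18 classifications mentioned above, and a real system would also need a verified birth date on the account.

from datetime import date

# Minimum age required by each classification label (illustrative mapping).
LABEL_MIN_AGE = {"G": 0, "PG": 0, "R13": 13, "R16": 16, "R18": 18}

def age_on(birth_date: date, today: date) -> int:
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1          # birthday hasn't happened yet this year
    return years

def may_view(label: str, birth_date: date, today: date) -> bool:
    # Access is granted only when the account's age meets the label's minimum.
    return age_on(birth_date, today) >= LABEL_MIN_AGE[label]

today = date(2025, 8, 11)
print(may_view("R18", date(2012, 5, 1), today))   # False: a thirteen-year-old
print(may_view("R13", date(2012, 5, 1), today))   # True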

Speaker 1 (10:58):
Yeah.

Speaker 2 (10:58):
I quite liked this comment from Australia's federal Communications Minister.
Her name's Anika Wells, and she described protecting children from
internet harm as like trying to teach her kids how
to swim in the open ocean with rips and sharks
compared to a local pool. And she said, we can't
control the ocean, but we can police the sharks. But Peter,

(11:21):
can we police the sharks?

Speaker 4 (11:23):
Well, I think we can. I don't think we've tried
hard enough, to be perfectly honest. But if we take
the Christchurch Call, for example, and GIFCT, the
Global Internet Forum to Counter Terrorism, I mean they've put
in some incredibly sophisticated software, you know, for picking up
on problematic content. It can't stop someone posting, you know,

(11:47):
terrible material like the terrorist video in the first place,
but it picks it up incredibly quickly now, through hashing.
It's kind of a digital fingerprint of the way that
content is represented on the screen. There'll be a pixel,
a series of data points, that identify the problematic video,
and that goes out to all kinds of social media

(12:08):
and online operators and they all have the same code,
and they very very rapidly now have protocols for getting
rid of that content. So you know, there are things
we can do, and I think there are other options there.
We could look at, you know, closer identification of who's using
these accounts. We could look at age verification. So there's

(12:29):
a number of mechanisms that could be put in place.
And I have to say that none of this is
an affront to free speech, as some people on the
far right seem to imagine. And I do have to
question the motives of people that say that, because of
the threat to freedom of speech, we can't regulate any
media in any way. I think that's absolutely irresponsible, because

(12:51):
there are many ways that we can regulate the media
to protect the younger people in particular and to protect
the wider community. And it doesn't mean that we don't
get access to content. It just means that we're able
to make more careful and better-informed decisions.
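
The hash-sharing mechanism Thompson describes earlier in that answer (GIFCT's shared fingerprint database) can be sketched very simply. In practice those systems use perceptual hashes that survive re-encoding, cropping and re-filming; the exact-match SHA-256 below is only an illustrative stand-in that would catch byte-for-byte copies.

import hashlib

# A shared database of fingerprints for known objectionable videos.
shared_fingerprints: set[str] = set()

def fingerprint(video_bytes: bytes) -> str:
    # Here: an exact-content hash. Real systems use perceptual hashing.
    return hashlib.sha256(video_bytes).hexdigest()

def register_objectionable(video_bytes: bytes) -> None:
    # One platform flags a video; its fingerprint is shared with all members.
    shared_fingerprints.add(fingerprint(video_bytes))

def should_block(upload_bytes: bytes) -> bool:
    # Every member platform checks new uploads against the shared database.
    return fingerprint(upload_bytes) in shared_fingerprints

register_objectionable(b"...footage flagged by the first platform...")
print(should_block(b"...footage flagged by the first platform..."))  # True
print(should_block(b"...an unrelated home video..."))                # False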

Speaker 5 (13:09):
Parents are constantly telling us that they're really worried about
the impact that social media is having on their children,
and they say they're really struggling to manage access to
social media.

Speaker 3 (13:17):
This bill is about protecting children from online harm, including bullying, addiction,
and exposure to inappropriate content. By restricting social media access
for under sixteen year olds, it puts the onus on
social media companies to verify that someone is over the
age of sixteen before they access social media platforms.

Speaker 5 (13:35):
This is about protecting our children. It's about making sure
that social media companies play their role in keeping all
of our kids safe.

Speaker 2 (13:44):
Journalists, for example, we do stuff under the Broadcasting Standards Authority.
We've also got ethics and things like that. I mean,
a so-called journalist on YouTube just spouting off conspiracy theories,
do they have any regulations or anything at the
moment?

Speaker 4 (14:03):
The short answer is no. Whether or not they
would count as journalists, I think, is the question. Now again,
this is where I would have some sympathy for the
argument for free speech. I mean, people are allowed to
be mistaken. I think there's a case for looking very
closely at platforms or channels that simply perpetuate disinformation and

(14:29):
really they're just a troll farm pumping out malinformed
material to a wide population. I think there's
a very strong case for regulating those sources. But does
that mean that we should all be looking over our
shoulder because Big Brother's going to censor us? I don't
think that would be the case. And so one way

(14:49):
to deal with that would be to flag content. You know,
if a source is proven to be unreliable but
it's providing something that looks like news, you put a
label on saying, look, you know, experts consider this to
be an unreliable news website.

Speaker 2 (15:04):
Facebook did that for a minute, yeah. Facebook did
do that, where people were putting...

Speaker 4 (15:10):
vaccine-skeptic material up on the platform, and they said, oh, look,
you know that this information is disputed, it's not seen
as reliable. Here's a link to some reliable information. Now,
that's not censorship. That's enabling the user to make more
sensible choices about the sorts of content that they engage with.

(15:32):
So again, that's not censorship. It's not saying you can't
have access to this information. It's not saying that you
can't have an opinion that differs from the mainstream. But
it is saying that we can enable the audience to
make more informed and more judicious decisions. And I think
that's where we go back to the algorithms. I think
that's where maybe we need to regulate the algorithms as well,

(15:55):
to make sure that when we're looking for news and
information that we find the reliable sources at the top
of our search or the top of our news feed,
that those sources are going to be the reliable sources
and not ones that are of more dubious provenance. Right.

Speaker 2 (16:10):
So, lastly, Peter, what should the New Zealand government do
if they were to do something tomorrow? It kind of
seems like at the moment they're sitting back and just
kind of checking out what the rest of the world
does and seeing what's working.

Speaker 4 (16:24):
I think that's often the default position. There's
a value in aligning whatever regulatory framework we, you know,
adopt here with the regulatory frameworks overseas, and I'd
be looking at Europe and the UK, perhaps, there. I mean,
Australia has gone quite a long way towards blocking

(16:45):
social media for younger people. I don't think
you can ban it outright. I think that's going to
be quite difficult. But there are steps, as I say
towards looking at ways of flagging content or introducing
age verification, and restricting the sorts of content available to
people in those age categories. Now they're not perfect, but

(17:07):
I think they would go a long way to protecting
younger people, and I think we should be looking at
those measures. I happen to think that the government made
a very poor decision to abandon the Safer Online Services
and Media Platforms proposals. They weren't fully formulated as a bill,
but there were some very sensible ideas there, none of
which included advancing censorship of free speech. They were all

(17:31):
about trying to make the public more able to control
the sorts of content that they're exposed to. So as
a first step, I'd go back to that and have
a look at what those proposals were because I thought
there were some very sensible ideas there.

Speaker 2 (17:45):
Thanks for joining us, Peter. Very welcome. That's it for
this episode of the Front Page. You can read more
about today's stories and extensive news coverage at nzherald dot
co dot nz. The Front Page is produced by Jane
Yee and Richard Martin, who is also our editor. I'm

(18:08):
Chelsea Daniels. Subscribe to the Front Page on iHeartRadio or
wherever you get your podcasts, and tune in tomorrow for
another look behind the headlines.