
December 8, 2025 22 mins

Australia’s social media ban comes into effect this week, when all under 16s there will be restricted from major platforms.

We’re talking TikTok, Snapchat, YouTube, Instagram, Facebook, Kick, Twitch, Threads and X.

The EU passed a similar resolution this month, and the UK has introduced age restrictions on certain content.

But, can you really outlaw part of the world wide web for a generation that has grown up online?

And, more importantly, should we?

Today on The Front Page, University of Canterbury senior law professor, Dr Cassandra Mudgway is with us to take us through what this means, and whether New Zealand should follow suit.

Follow The Front Page on iHeartRadio, Apple Podcasts, Spotify or wherever you get your podcasts.

You can read more about this and other stories in the New Zealand Herald, online at nzherald.co.nz, or tune in to news bulletins across the NZME network.

Host: Chelsea Daniels
Editor/Producer: Richard Martin
Producer: Jane Yee

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Kia ora.

Speaker 2 (00:06):
I'm Chelsea Daniels and this is the Front Page, a
daily podcast presented by the New Zealand Herald. Australia's social
media ban comes into effect this week when all under
sixteens there will be restricted from major platforms. We're talking TikTok, Snapchat, YouTube, Instagram, Facebook, Kick, Twitch,

(00:30):
Threads and X and more. The EU passed a similar
resolution this month, and the UK has introduced age restrictions
on certain content. But can you really outlaw part of
the World Wide Web for a generation that has grown
up online? And more importantly, should we? Today on The
Front Page, University of Canterbury senior law professor Dr Cassandra

(00:54):
Mudgway is with us to take us through what this
means and whether New Zealand should follow suit. First off,
Cassandra, tell me about Australia's social media ban. How will
it work? And I suppose the big question is, will
it work?

Speaker 1 (01:13):
So the law in Australia now requires platforms to take
reasonable steps to ensure that their users are over the
age of sixteen years. That's important. So it's not an
absolute guarantee that under sixteens won't be using platforms. It's
reasonable steps only, but that law will apply to social
media services and platforms that host user content, so think Facebook, Instagram,

(01:40):
X, TikTok, Snapchat and Reddit, and the streaming service Twitch
is also included in that law. The law does not
apply, at least not for the moment, to messaging services,
so think Facebook Messenger and WhatsApp, or some gaming services
with those communications capabilities, like Roblox. If you have a child,

(02:03):
you probably have heard of that before. So platforms are
required to have at least one way, or more than
one way, to check their users' age. The most obvious,
the most accurate way of doing that is age verification,
which is uploading government IDs, but the law in Australia
requires services to have at least one other way. So

(02:26):
the one that's most talked about, that you're going to
hear a lot about, is the facial age estimation tool.
So that's when they take a photo of your face
and an AI tool guesses how old you are.
The accuracy of those tools is questioned and critiqued,
so not as accurate. But the most common measure for
existing users, if you're online right now, will likely be passive,

(02:50):
so the platform will just guess your age based on how
long you've been on the app and what content you
engage with. The other part of the law, of course,
is we've got the eSafety Commissioner, that's Australia's online safety regulator,
so they're going to be in charge of compliance there,
and they can fine platforms millions of dollars, so very large,

(03:11):
significant monetary fines for non-compliance if they don't take
any steps, for example.

Speaker 2 (03:15):
So the ban stops under sixteens from making
their own account, but it still lets them scroll through
the likes of TikTok and YouTube Shorts and Instagram
Stories and things like that without making an account. Does
this protect them from seeing harmful content?

Speaker 1 (03:34):
If they can still access this content, then no.
I mean, because they've made that differentiation between the platforms
versus messaging services. It means that children won't be exposed
to recommender systems, so systems that respond to their engagement
and feeds, and so they won't be funneled down feedback

(03:58):
loops or pipelines that we hear about in relation
to harmful content, so more extreme content, misogynistic content or
content that has, like, eating disorder type messaging. So it
will protect them from that sort of stuff, but it
won't protect them from other kinds of harm that are
more typical through messaging services in particular, so cyberbullying, deepfakes,

(04:24):
image-based sexual abuse, that kind of thing.

Speaker 2 (04:27):
So when we think of teenagers, these guys aren't your
typical teenagers. So like I, for example, got social media
when I was in late high school, early uni, right? These
kids are digital natives, they've grown up on social media,
they've grown up on the Internet, and presumably they know
how to get around these safeguards. There's a lot of

(04:50):
skepticism out there about this not actually working.

Speaker 1 (04:53):
Yeah, and that's the widespread concern there. So teenagers could,
for example, use their parents' accounts, they could borrow IDs,
they could use age-spoofing tools. They can register on
platforms that don't comply, that aren't under the ban, and

(05:15):
the Australian government has been pretty upfront about this. They're
aware of this, which is why the duty on platforms
is to take reasonable steps. It can't guarantee that some
under sixteens aren't using those platforms. So I think that
we should expect a mix of non-compliance, workaround behavior
and platforms chasing those edge cases where we've got teens

(05:38):
sort of circumventing. And the other realistic outcome we are
likely to see is migration of youth to other smaller,
more fringe platforms or apps, which won't come under the
ban but are also probably less moderated, sort of riskier
apps where we can't protect them, or it's more difficult

(05:59):
to protect them, so essentially it might push young people
towards those less safe spaces. We actually saw this in
the UK, although a very extreme example involving adult users,
but the UK has similar age verification requirements for any
websites that host pornographic content, and once that came in

(06:20):
Pornhub, one of the big pornography websites, lost
a massive amount of traffic within a day. But then
all of these more fringe, riskier pornography websites had a
huge increase in traffic. So we can see it happening
in real time.

Speaker 3 (06:43):
It's just the act of banning. Like I'm going to
put Snapchat in the spotlight for this segment because that's
mainly a messaging app. There is a scrolling feature on Snapchat,
but it's very minuscule in like what people use it for,
and, sure, we could move to WhatsApp, and
we could move to text messages, which many people still
already do. But the reason why this ban of Snapchat

(07:06):
matters is it's the way we've kind of grown up with communication.
It's that sort of informal, casual way of a quick
visual update with a photo or posting a story on
your Snapchat. It's just the way that my generation has,
like, adapted.

Speaker 2 (07:21):
If you've adapted to find new ways around social media,
you could adapt to WhatsApp, right?

Speaker 3 (07:25):
There's definitely ways to adapt. But if there's not a
problem right now, why change it?

Speaker 2 (07:32):
Yeah, I've seen that Australian teens are already flocking to platforms.
And forgive me if I misinterpret this, because my
producer has written this down. She has teenage children,
so she's obviously gone on the hunt for these. They're
called Coverstar, Lemon8 and Yope. I mean, these kids

(07:55):
are going to go, and there's the opportunity for other
businesses, away from Meta and away from the big
players like Google, et cetera, to really start up these
fringe kinds of apps.

Speaker 1 (08:08):
And that's the thing, isn't it?

Speaker 2 (08:09):
Because there's going to be fewer safeguards. You're not going
to see the CEO of Lemon8 at a
governmental hearing in Washington, but you're going to see Mark
Zuckerberg sit there and have to answer questions and stuff.

Speaker 1 (08:20):
Yeah, absolutely. And that is the big concern that
we have. We've got that chase into different parts of
the Internet in a way that current regulations cannot reach,
because the regulations that Australia does have, and they have
wider regulations around online safety, often target large platforms or

(08:42):
services that have millions of users, and they're less likely
to target smaller and smaller apps because of those competing
concerns around capitalism and the market, and making sure that
smaller sort of companies have room to grow. And they
just might not have the resources in place to ensure the

(09:04):
moderation that you would need to keep young people, a
particularly vulnerable group, safe.

Speaker 2 (09:09):
In terms of how this has worked in the UK,
obviously you've got the age verification there for certain websites.
They haven't gone the whole hog though with social media.
And then I think the EU has also said that
it will ban social media for under sixteens. Do
you reckon, what with Australia, in New Zealand politicians are

(09:30):
looking overseas and thinking, let's see who does it right?
Because obviously the days of checking the 'I'm eighteen'
box are over. And you know, I used to always,
and this is putting myself out there, I'm born in
the nineties, I would always go nineteen eighty-eight, so
I'm a few years older. So those days are obviously gone.

(09:52):
Do you reckon we're just sitting back and seeing what
actually works overseas? And has there been anything in the
UK that perhaps hasn't worked?

Speaker 1 (10:00):
So that's interesting, because there's a lot going on all
at once. I think Australia was the first country
to announce that they wanted to age-gate social media,
at least lift that age gate up to sixteen, and
other countries have looked at that and are following suit.
In New Zealand we have a member's bill from a
National MP, pulled out of the ballot box, that might

(10:22):
be coming in front of Parliament before the election, which
seeks to do the same thing following Australia's example. So
Australia has really kicked off a kind of movement here.
The interesting thing about the EU is that they have
voted for a minimum age sixteen sort of standard in

(10:42):
a resolution, but they are still working out other things.
So member states are still debating whether they allow for
consent of parents, so parental consent around their thirteen to
fifteen-year-olds using social media, so it's likely to
look a little bit different in those countries as well.
The big difference with the UK and the EU, even

(11:04):
compared to Australia, is that those regions have adopted
sophisticated digital safety regulations. Both have taken a safety-by-design
approach, creating sort of legal duties of care on
those companies to undergo, for example, risk assessments across their

(11:26):
services regarding online harm and actively working to mitigate those risks.
So there's more architecture there that the ban or a
ban would go on top of. It's not just a
standalone ban. And even in Australia that is the same.
Australia has an eSafety Commissioner, for example, so that's
the regulator that has enforcement powers. So if you have

(11:49):
a deepfake that happens online, you want it removed,
eSafety can require those companies to remove it.
And if we compare that to New Zealand, we don't
have these structures in place, we don't have a
regulator like that. So there's no standalone ban happening in

(12:10):
those other countries. So yes, we are sitting back and watching,
but we, by the looks of things, might be
heading towards adopting a ban without having adopted all of
these other measures.

Speaker 2 (12:22):
Yeah, maybe we should, you know, learn how to walk
before we run.

Speaker 1 (12:26):
Well, that's essentially exactly what I have been arguing for
out of my research. It's very clear we should be
building safer digital environments and thinking about, you know, what
does a safer digital environment look like in Aotearoa,
in our context, and how do we use law and
regulation to make that a reality before thinking about age bans,

(12:50):
because if you have a standalone age ban without any
of these other things, there's that real risk that
our kids will just end up in sort of these
riskier spaces. It's even worse because we don't have that
architecture in place to enforce wider safety standards.

Speaker 4 (13:16):
There is a very much undiscussed privilege with the idea
of banning social media, because when you've got a
country like ours, with a mental health system specifically that's
so broken, and even our education system not that great,
and you take away free access and free tools, that's

(13:37):
very detrimental.

Speaker 2 (13:40):
So it seems to me that putting these bans and
the legal frameworks in place, that's pretty time-consuming,
it's expensive, and it's a pretty serious measure. So why
are we doing all of that? I mean, it must
be harder to just regulate the big social media giants
if we're going down the legal route.

Speaker 1 (14:01):
Well, I think it's worth it, yeah. I mean, you're
going to have to put funding aside for regulation, especially
if you set up a regulator. You're going to have
to set up, like, committees around what you're going to
regulate and how, and then you've got compliance and all
of these things. And I think it's money worth spending.
And I mean the other side of that, of course,

(14:23):
is, and something that we should be watching Australia for
first before we jump into the mix, is what are the
companies going to do. We have seen, for example, X,
formerly Twitter, has been rather litigious in both Australia and
the UK, you know, taking the governments to court
or taking the regulators to court over certain regulations, and

(14:47):
so I would expect that to take place as well.
So I think it's a waiting game in relation to those.

Speaker 2 (14:56):
Yeah, it is concerning, though, the rabbit holes that
these apps do take you down. I was just telling
my colleagues the other day, actually, that I liked a
couple of healthy eating videos on TikTok,
and a couple of days later and a few scrolls later,
I was getting, I don't know what the term is
for it, but very clearly anorexia content. A lot of

(15:20):
body checking, a lot of, you know, showing off someone's
physique from the side, them looking very dangerously skinny, and
them being in a hospital room. And the comments on
those videos were actually really alarming. They said
things like 'body goals' and 'you're so lucky' and
things like that, and I kind of sat there.
I took screenshots of it because I was so shocked

(15:43):
that that just popped up on my feed after liking
a couple of healthy eating videos. And you just sit
there and think, those commenters, how old are they?
And what are these kids getting served up if they
do like a few of these videos? What kind of
routes do they go down? I mean, it's terrifying.

Speaker 1 (16:04):
Yeah, absolutely. And especially as
a parent, you would get a sense of hopelessness,
like, how do you control that? And the reality
is that we don't have in place, for example, transparency
requirements on these companies to be like, well, how are
your algorithms working? Are they safe for children? But also

(16:25):
other vulnerable groups like women and rainbow communities and Māori.
And we don't actually understand how harmful content spreads, because
we're not allowed behind that curtain, and so requiring companies
to actually reveal that would be a step in that
direction, rather than shielding. You know, if you put in

(16:47):
a ban, you shield, like, a group of people from that,
but everyone else is engaging in it, and those ideas
spread regardless. And so there must be something else that
we can do to look behind the curtain and then
see how the algorithm works, why is it pushing

(17:07):
this kind of content? And what kind of moderation can
we put in place to make sure that we're protected
from this kind of harmful material?

Speaker 2 (17:19):
You do wonder as well, if there was a legal
requirement for us to look behind the curtain, and for
these companies to be transparent about how their algorithms work,
whether the sheer fear from their side of us actually
discovering how they work changes things. Changing the law,
could that even be a possibility?

Speaker 1 (17:38):
Yeah, absolutely. I mean we can regulate whatever services operate
in New Zealand, so it is within our framework to
do that. And when you've got legal risks in place
and scrutiny, it changes companies' sort of priorities, right,

(17:58):
and especially if you attach to that non-compliance penalties
that are significant enough, that's going to bring them into,
like, oh okay, well now I have to talk about this.
And regulation can seem like a scary thing. We think
regulation will stop innovation, maybe stop competition in a market,

(18:20):
but what it can do is actually provide a lot
of certainty for your users who want to engage in
safer spaces. If it is safer in Meta apps, people
are going to go back to Facebook. So
there are incentives for companies to do that. But
those incentives need to be in place, and they clearly

(18:42):
need to be legal incentives, because we've had twenty years
of nothing and this is where we are. We're at
a space where, yeah, you scroll on those apps
and you come across harmful content really quickly, and we
don't want that as a society. I think we've decided that,
and so this is one way we can do that.

Speaker 2 (19:01):
Yeah, absolutely. And perhaps, I mean, I saw the other
day actually as well, I didn't know that the brand
Lush isn't on social media, and it's chosen to do that.
They have an ethics coordinator and they've
chosen not to be on social media. What if more
companies did that?

Speaker 1 (19:20):
Yeah, I mean, that is definitely one way you can
hold companies to account, just by not giving them your engagement.
And it is the attention economy. The more engagement they have,
the more money they generate. Moreover, if you
choose not to advertise through the apps, that is also
quite powerful, because that's obviously where they get their money from.

(19:41):
Connected to that, of course, is the political sort of
connection with social media. So, you know, if
governments around the world decide actually we're not
going to use Meta anymore, that would also be a
huge change. And of course that's difficult, because a lot

(20:03):
of our information around politics, I mean, the issues
around culture, society, politics, all come through those
social media apps, and I feel like even if you
have companies saying no, I don't want to engage with you,
there's still that foothold. So I think it comes back

(20:24):
to political will around what you want your people to
engage with.

Speaker 2 (20:31):
And it also comes full circle as well, saying, well,
why aren't kids outside playing with each other until the
street lights come on, kind of thing. But it's like,
we do have to move on as a society from
that thinking, because these kids, they've grown up in the
digital age. They've grown up with a smartphone in their
hand and a tablet and things like that. You can't
just, you know, what's been given can't be taken away

(20:53):
as easily. So we probably have to move on from
the grazed knees and the, I don't know, catching tadpoles
and things. I don't know what we did.

Speaker 1 (21:03):
Yeah, yeah, I mean, like it or not, our social
and cultural life happens online. It happens offline too, but
it happens online for our young people. And in terms
of their rights, they have a right to participate in public,
cultural and social life, and that happens online, and it
is particularly important for certain groups of young people, so

(21:25):
rainbow youth, for example, their sense of connection and community
might be only online, it might not be found locally.
So these are important threads in their lives that I
think we do need to accept and work with.

Speaker 2 (21:40):
Thanks for joining us, Cassandra.

Speaker 1 (21:43):
Thank you for having me.

Speaker 2 (21:46):
That's it for this episode of The Front Page. You
can read more about today's stories and extensive news coverage
at nzherald dot co dot nz. The Front Page is
produced by Jane Yee and Richard Martin, who is also
our editor. I'm Chelsea Daniels. Subscribe to The Front Page
on iHeartRadio or wherever you get your podcasts, and tune

(22:09):
in tomorrow for another look behind the headlines.