Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:07):
You're listening to the Saturday Morning with Jack Tame podcast
from Newstalk ZB.
Speaker 2 (00:11):
Dougal Sutherland, the clinical psychologist from Umbrella
Wellbeing, is with us this morning. Kia ora, Dougal. Kia ora, Jack.
Speaker 3 (00:19):
How are you?
Speaker 2 (00:20):
I'm very well, thank you. Hey, a few weeks ago,
you and I were discussing Adolescence, that
amazing Netflix series. And then this week, of course, a
National MP has put forward a member's bill
that would effectively ban social media for under-sixteen-year-olds.
But there's a new report from the
Classification Office that actually looks at what young people are
(00:42):
being exposed to online, which is fascinating.
Speaker 3 (00:45):
Yeah, fascinating and, to be frank, quite frightening
as well. They spoke to about ten groups of young people,
aged between twelve and twenty-five, and look, it
was incredibly common, almost universal, that they had encountered
what the Classification Office called extremely harmful content, and that's
(01:09):
things like graphic real-world depictions of violence
and cruelty to animals, amongst other things. I won't go
through the whole list because it's a little bit stomach-turning.
But I guess the message was
that young people are seeing really distressing content online. Now,
I do want to say it's not just social media.
(01:32):
There was a variety of ways that they'd seen it.
But it was a real eye-opener, I think, and
would probably be a surprise to many, particularly parents,
about the content that young people are actually seeing, not
always intentionally.
Speaker 2 (01:46):
Well, and that's a key point, right? A lot of
the time it was actually unintentional that they came across
this stuff.
Speaker 3 (01:51):
Yeah, it was. Sometimes people searched for it, you know,
they would hear about something and search it and then go,
oh my god, that's horrible. But sometimes it just sort of
turned up in their news feeds, or somebody messaged them with it,
or they saw it on a friend's device. So yes,
(02:12):
a lot of it was just unintentional; they hadn't
sought it out. It just appeared and they were
suddenly exposed to these really distressing images online.
Speaker 2 (02:23):
Yeah, and you can't unsee it. It must
be pretty affecting, especially for young people.
Speaker 3 (02:28):
Yeah, and some of them used that term, you know,
they said, look, some things you just can't unsee, and
they used words like horrified and terrified. And,
depending on what they've seen, there's a real
risk of some of them developing things like PTSD
as a result of seeing something, particularly
if it's real, you know, real life. So it's,
(02:49):
you know, it's quite sobering, I think,
what's out there and what our young people have
often been inadvertently exposed to.
Speaker 2 (02:57):
Do young people know what to do if they come
across this stuff? Like, do they feel like they can
tell their parents? Do they feel like they can, you know,
report it, like they can try and tell the platform?
Speaker 3 (03:10):
Well, they're often unsure. That
was one of the findings, that they were often
unsure about what to do. They were worried that if
they told a parent or a teacher, the
immediate result would be, right, give us your
phone or your device, you're not having that anymore. And,
you know, the message from them was that
(03:30):
they didn't want that. And they weren't
often confident about reporting it, you know, to a platform,
because it seemed for many of them quite a laborious
process and it wasn't really clear that it would work.
But they did say they wanted
to be able to have open communication with trusted people
(03:53):
in their lives when they had come across it. And,
I guess respecting the fact that they're developing into adults,
to be supported in figuring out what to do. So
rather than having parents or adults jump in and remove
things immediately, it was like, actually, let's try and nut
this out together, and let's give you the skills that
(04:15):
you can use next time you come across it,
because it does sound like there will be a next time.
So it was a sort of more
nuanced thing that they were really asking for.
And I think the danger is, of course, if things
are removed immediately, I'm not
sure that will necessarily stop it, but it's likely
(04:38):
to push the behavior underground. Yeah, so they might see it,
but then they won't tell anybody about it.
Speaker 2 (04:43):
Yeah, that's the thing, because a natural impulse from a
parenting perspective will be like, whoa, all right, well, we're
just going to have to keep
you off the internet for the rest of your life,
and that's that.
Speaker 3 (04:54):
Yeah, I agree. I think it is a really
natural impulse as parents that if our
kids have seen something that's really upsetting and distressing, you
naturally want to protect them from that, and
I can see the argument there.
But firstly, if you've banned them from
(05:15):
social media, you're probably not actually going to
fix the whole problem, because it wasn't always on social media.
And then what are you going to do? Are
you going to ban the internet? Are you
going to ban it for a sixteen-year-old?
It's literally in the air around us. So
I understand that desire, but I'm not one
(05:35):
hundred percent sure that it would actually
be the best thing. And we do want
to keep those communication channels open with young people,
so if they do see it, they can come and
tell someone, because it's super important, you know, not
to hold that in. If you've
seen something really distressing, to be able to have somebody safe
that you can go and sort of de-stress and
(05:56):
unwind with, and let them know and just, you know,
share that with somebody. It perhaps takes some of
the sting out of it.
Speaker 2 (06:02):
Yeah, I think that's the thing. Nuance is
the key thing when it comes to the parents' response;
you just want to try and have some nuance there.
What do you think of the idea of banning
under-sixteens from social media?
Speaker 3 (06:15):
I don't know, I'm mixed about it. I mean,
I understand it, especially with kids being exposed to
difficult stuff. On the other hand, we know that
it's a big source of social contact
for a lot of young people, particularly those in minority groups
or who are perhaps geographically remote, that it's a real
(06:37):
source of social connection. And I know the argument will be, well,
we grew up without it, you know, we didn't have
the internet when we grew up. I know that,
and I grew up in that
era too, and it's like, well, we still made friends.
But we're not in that era anymore; that's history. And
I kind of get the sentiment, but I just worry
(06:58):
that it won't actually be as effective as people would
like it to be. It'll be interesting to
watch Australia and see what happens there, because
they have effectively done that, so
it'd be interesting to see how they go. But I
worry about the policing of it. What I would say
is I'd definitely support, you know, social media
(07:20):
companies continuing to improve the content
that's on their platforms, and kind of put it back
on them and say, hey, look, if you
are going to make money from this, can
you at least make sure that it's not harmful to people?
And maybe there's a separate platform for
under-sixteens or something, I don't know. But you know,
(07:41):
I'm certainly in favor of putting the emphasis on those
social media companies to continue to police things even
more strongly than they are now.
Speaker 2 (07:51):
Such a tricky one. Hey, thanks so much, Dougal. We
really appreciate your insight as always. That is, of course,
Dougal Sutherland from Umbrella Wellbeing.
Speaker 1 (07:59):
For more from Saturday Morning with Jack Tame, listen live
to Newstalk ZB from 9am Saturday, or follow
the podcast on iHeartRadio.