Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:09):
Hi, I'm Francesca Rudkin, and I'm Louise Aria, and this
is season three of our New Zealand Herald podcast, The
Little Things.
Speaker 2 (00:15):
A podcast where we talk to experts to find out
all the little things you need to know to improve
all areas of your life and cut through all the
confusion and overload of information out there.
Speaker 1 (00:24):
Today we're talking about something pretty bloody awful, and it's
one of those issues not many of us lean into,
but we should, not just if we have our own kids,
but grandkids, nephews, nieces, or any young people we love
in our lives.
Speaker 2 (00:36):
And actually, what we're about to learn through this podcast,
Lou, where we talk about sextortion and various other
online scams, is that it's not just about young people. It's
really important that we all understand what it's
all about. I think that ever since we started
sharing our lives online, all we've tried to do is
work out how to protect ourselves from people who want
to take advantage of it.
Speaker 1 (00:57):
Yeah, and they're working just as hard to be sure we can't.
Speaker 2 (01:00):
The threats that come at us just constantly change; the
technology is constantly changing. Sometimes I feel like I'm
losing the battle, and other times I feel like
we can be empowered to make some really good decisions.
You know, I hope we have some really
good conversations, and that's what I'm hoping is going to
come out of today's.
Speaker 1 (01:19):
Yeah, I mean, you probably remember the ongoing battle we
had with my oldest kid keeping the phone out of
his room at night. That's right, you know. I went
in there with all the best intentions of being a
good parent, of protecting him from the known dangers at
the time, and he thwarted every effort we made.
Speaker 2 (01:38):
Yeah, and in our house we sort of followed all
the guidelines, right? We said no Instagram or social media
till you're thirteen, and we did these things. But immediately
these kids jump on, and even if you follow them,
well, within a day they've set up the other account,
the private one that Mum and Dad don't follow. It didn't matter
what you did; you thought that you
were doing all the right things, and
(02:01):
you didn't.
Speaker 1 (02:01):
But even then, it's not just one platform, it's two
or three platforms that they're operating on at any one time.
I mean, Snapchat, for example, is just so fast. You
snap one person, you can snap everybody on your list.
I don't understand Snapchat, never have. I do have Instagram,
and I've tried the Snapchat thing. Even
my seventeen-year-old daughter can't do Snapchat. She's just like,
(02:24):
I can't do it. I'm never going to meet anyone,
because that's how everyone meets everyone, but I can't. She
can't do it. I think also, I mean, if your
son was thirteen when he started on Instagram and he's
eighteen now, in those five years things have exploded, and
as we're going to hear today, the people out to
get our kids are five steps ahead of us.
Speaker 2 (02:43):
And don't get me started on some of the content
on TikTok. Like all technology, it can be brilliant and
it can be used to add value to our lives,
but unfortunately there are also risks involved, aren't there. And
I think it's really scary how quickly digital abuse changes.
So now we're seeing this thing called sextortion,
and we did an interview on this on the Sunday
(03:05):
Session recently. We had an overwhelming response from parents, which
has kind of led to this podcast today. And if
you think it couldn't happen to your family or one
of your kids, then you just have to hear the
stories from parents who have experienced this to go, okay,
I actually can't think like that. This is
(03:25):
not the time to go, oh, this won't happen to me,
or this won't happen to my kids.
Speaker 3 (03:29):
Yeah.
Speaker 1 (03:30):
I mean, even Harry and Meghan are elevating this cause
with their Parents' Network, and Harry recently said that,
you know, a child could be in the next-door room,
on a tablet or phone, going down rabbit holes, and before
you know it, within twenty-four hours, you may have
lost them.
Speaker 2 (03:48):
I don't think we needed Harry or Meghan to tell
us that, did we? I think you knew
that already, Louise. But I love the way
you've just got, you know, a royal touch to the
podcast topic.
Speaker 1 (03:57):
And it's not often that we've had.
Speaker 2 (03:58):
A royal touch to the podcast before, have we? So,
look, to give us some insight and hopefully some strategies
for this quite terrifying digital threat to any child with
a smartphone or tablet are a couple of experts
in this area. Olivia Carville is an investigative reporter at
Bloomberg News. She's been awarded several major national media awards,
(04:19):
and her work has influenced policy. She has been investigating child
safety in the digital world for the past two years.
We're going to talk to her first. Later, we're going
to hear from Sean Lyons, chief online safety officer with
Netsafe. He's going to give us some practical advice
on how to deal with sextortion. Please note this
podcast contains conversations about suicide; if you need any information,
(04:43):
support, or advice, please check our podcast notes. We
will have contact details in there. I'd like to welcome
now to the podcast, Olivia Carville.
Speaker 4 (04:54):
Welcome, Olivia. Thank you so much for having me.
Speaker 2 (04:57):
Talk us through sextortion: what it is and how it works.
Speaker 3 (05:03):
So sextortion, really the definition of it is sexual extortion
and it's a form of blackmail.
Speaker 4 (05:11):
It is where predators target.
Speaker 3 (05:13):
Individuals using social media to coerce explicit photos from them
and then use those images to blackmail.
Speaker 4 (05:21):
Them in order to get something that they want.
Speaker 3 (05:24):
Now, traditionally we've seen sextortion being sexually motivated. What I
mean by that is a predator or an individual is
trying to get more explicit content out of a target;
they will coerce them into sending a nude photo
and then use it to demand more images. But what
(05:47):
we've seen more recently is financially motivated sextortion, and that's
completely changed the game. And here in the US, the
FBI have said sextortion is now a hidden pandemic.
Speaker 4 (06:00):
This is affecting.
Speaker 3 (06:01):
Tens of thousands of children in America and around the
world because this new form of sextortion is just spreading
like wildfire.
Speaker 1 (06:11):
Who are their targets? Is it a particular gender,
or a particular vulnerable age? Who are they going for?
Speaker 4 (06:18):
Yeah, the targets are actually quite specific.
Speaker 3 (06:20):
I was really surprised when I started reporting on this
issue that what the predators are looking for are teenage boys.
And again, traditionally, you know, when we've done research
into sextortion and online forms of sextortion, it's mainly gone
after young girls. But financially motivated sextortion is targeting teenage boys.
(06:44):
These are boys aged between twelve and seventeen or eighteen,
and particularly young boys who are quite popular in their
high schools, who have a lot of followers online, who
are homecoming Kings or football players, who have a presence online,
who have a reputation and have a big future ahead
(07:05):
of them, and they use that reputation to blackmail them.
Speaker 1 (07:10):
That's terrifying.
Speaker 2 (07:11):
It is terrifying. And Olivia, it's just terrifying how quickly
this has kind of taken hold, isn't it.
Speaker 4 (07:17):
Yeah.
Speaker 3 (07:17):
The FBI actually says this is the fastest growing crime
targeting children in the US right now. We've not only
seen suicides right across America, we've seen suicides occurring in Canada,
in the UK, and other countries around the world.
This really is a hidden pandemic, as the FBI describes it,
(07:39):
and it's grown so big so fast it has become
very hard to control.
Speaker 2 (07:45):
So they're using social media to target them. Do we
know, are there any particular platforms that are
being used to do this?
Speaker 4 (07:55):
Yeah.
Speaker 3 (07:56):
I mean when you think about your own use of
social media, like I have Instagram, I have Facebook. We
all have a friends list and it's publicly available, so
you could go onto my Instagram account and see who
I follow, who I'm friends with, who likes my posts.
You could go onto my Facebook profile and see who
(08:16):
my friends are. And the predators in these cases are
using social media. They're using that Rolodex of connections
that teenagers have against them, and that's how they're blackmailing them.
So once they coerce that explicit image from a teen,
they then threaten to send that photo out to everyone
(08:39):
in their friends list. So they'll say things like I
know who your parents are, I know who your teachers
are at school, I know who you play with on
the football team or the ice hockey team, I know
who you go to school with.
Speaker 4 (08:53):
I know who you.
Speaker 3 (08:54):
Play in band with, and then they threaten to send
that image out to those individuals. And in some cases
we've heard reports of these blackmailers actually sending screenshots of
those images about to be sent to girlfriends, friends, family members,
and they say, I'm going to hit send in ten
(09:16):
seconds unless you give me money. In terms of what
platforms are really being used and abused in this particular crime,
we're seeing Snapchat and Instagram are the platforms of choice
for predators or blackmailers who are looking to sextort teenagers,
and that's because they have access to their friend networks
(09:37):
and who follows them on those social media sites. But
we're also seeing predators use other platforms like YouTube and
TikTok to actually share how-to guides on how to
blackmail or how to coerce a teenager to send a
naked photo. And you know, that's what the FBI has
(09:58):
been focusing on.
Speaker 4 (09:59):
recently. You know, who are the people who
Speaker 3 (10:02):
Are behind this crime, where are they getting the information from?
Speaker 4 (10:06):
And how do we stop them?
Speaker 2 (10:07):
Oh my gosh, we'll come to that in a moment.
You should have seen our faces, all three
of us in the room here, our producer,
and Lou and I, just sitting with our
mouths sort of hanging open.
Speaker 1 (10:19):
So blatant, it just seems so blatant.
Speaker 2 (10:21):
And I can understand the horror that a kid must
feel when they know so much about you, when they've been
able to glean so much about your life,
that you just immediately come under that pressure, Olivia,
don't you, to submit to their demands. So tell
me what is happening to
these kids. How much are they being extorted
(10:43):
for? What happens if they pay, and what happens if they
don't pay?
Speaker 3 (10:47):
There is a ruthlessness and a brazenness to this crime
that is unlike anything I've ever reported on, and unlike
anything that the police officers and FBI agents I talked
to during my research into the story have ever seen either.
Speaker 4 (11:03):
I've done a lot of reporting on the dark.
Speaker 3 (11:06):
Sides of social media, and I've talked to a lot
of law enforcement officials throughout the course of my career
as I've reported these stories out.
Speaker 4 (11:15):
But for this one and this.
Speaker 3 (11:17):
Particular issue, multiple interviews resulted in FBI agents and police
officers and detectives in tears because there's just not a
lot they can do to help these children in that
moment of crisis. And unfortunately, in many of these cases,
the blackmail starts late at night when the teenagers are
(11:40):
in their bedrooms alone on their cell phones and they
make a mistake. They send a photograph that they shouldn't
have sent because they've been catfished by someone online who
isn't who they say they are, And once they realize that,
they feel like their whole life is over. Once they
realize that they're being blackmailed, they don't know where
(12:01):
to turn. They see that that photograph is about to
be shared to their friends, to their family members, it
will spread right across their school. They've probably seen it
happen to other friends or other students, or heard of
it happening before, and feel like there's no way out.
And that's what we're trying to raise awareness of, not
(12:21):
only through the reporting, but through this conversation as well.
It's trying to educate parents to talk to their children,
to try and connect with teenagers and help them understand
that if you find yourself in this situation, there is
a way out, and this has.
Speaker 4 (12:36):
Happened to other people. You're not alone.
Speaker 1 (12:39):
So you talked about the ruthlessness and relentlessness. If you
give them some money, it just never ends. They
come back for more money, and more money, and more money.
Speaker 4 (12:48):
That's what we're seeing.
Speaker 3 (12:49):
Yeah, Initially they'll offer a lower figure, you know, maybe
two hundred, three hundred dollars, and then once you've paid,
they don't go away; they just keep asking
for more money and more money. And
in some cases we've actually seen teenagers being forced into
monthly ransom payments where they have to give, you know,
(13:12):
one thousand dollars a month, and if they don't give
that or transfer that money through, the photo will be sent.
Speaker 4 (13:20):
I don't know.
Speaker 2 (13:20):
I just wish, if that happened to me,
I'd have the balls to say, go on, send it. I mean,
you don't, because you've talked about the psychology of what
they're going through, so you don't. But goodness me. Hey,
I have also heard about gaming platforms like Roblox being
used for issues like this. Is that the case?
Are you seeing a bit of that?
Speaker 3 (13:41):
Well, I think all Internet platforms face, you know, child
exploitation issues, and the issues that Roblox faces are similar
in some respects, but it's not really used for sextortion.
Predators are reaching out to children through that gaming platform.
When it comes to sextortion, they're mainly using those social
(14:03):
media platforms like Instagram and Snapchat where you can see
public friend groups and you can see who people are
friends with. They really need the connection to that target's life.
They want to see who your best friends are, they
want to see who your girlfriend is, They want to
see who your parents are and use that against you.
So you don't really get that connection as easily
(14:26):
through video game platforms. But I do just want to
say to that point of you know, you'd hope that
maybe you'd have the confidence to say, okay, just send it.
But let's think about the psyche of a fifteen-year-old
boy who's looking at a photograph of himself, you know,
pulling his pants down in the mirror. His face is
(14:49):
included in that image. He knows that it's going to
be sent to everyone in his high school, and that
everyone's then going to forward it on. You know, now,
in this era that we live in, these teenagers are
connected to their devices twenty four to seven. If that
photograph gets sent to one person, it will spread so
quickly that everybody he knows might have seen it before
(15:10):
the day is out. And how do you psychologically protect a child
against the kind of bullying or long term consequences that
could come from that photograph being sent out. I don't
know how to do that, and I think that's something
that we should be teaching kids in school, as well
as building that resilience around online cyber bullying.
Speaker 1 (15:32):
What strikes me is my children don't know their friends'
telephone numbers, They don't text them, they don't phone them.
They do every bit of their communication via social media
Snapchat or Instagram, and I mean that's leaving them vulnerable
in the first place, that every single connection that they
have is on a publicly accessible database.
Speaker 4 (15:52):
Effectively, Yeah, it totally is. And not only that, but
we're now entering this.
Speaker 3 (15:57):
World of deep fake technology generative AI, and people can
create images that look real but aren't real. And what
happens if you get sent a photograph that looks like
you've sent a nude selfie but you haven't, and then
that image gets shared. How do you prevent or protect
children against that?
Speaker 2 (16:16):
Olivia, I'm sure there are a lot of parents out there
who are doing their best
to keep their children safe and to have open communication with
them. And I think that we all occasionally
do think it won't happen to our kid, or won't
happen to somebody
Speaker 4 (16:29):
We know.
Speaker 2 (16:31):
You covered the tragic case of Jordan DeMay in the
US and he was sextorted and he took his own life.
I know that you've spoken to his family. What impact
has this had on them and what do they want
other families and parents to know.
Speaker 3 (16:49):
Yeah, I traveled up to Marquette, Michigan, to meet Jordan
DeMay's family, and I'm sure you can imagine, and
anyone listening to the segment can imagine the pain that
they went through when he took his own life.
Speaker 2 (17:01):
He was a.
Speaker 3 (17:02):
Seventeen year old boy with his whole future ahead of him.
He was a homecoming king, he was a football star,
He had plans to go.
Speaker 4 (17:09):
To university the next year.
Speaker 3 (17:12):
The night he killed himself, he had packed his bags
for Florida. He was ready to go away for spring
break with his dad and his siblings. And this was
so totally unexpected, and his family... you know, it's
hard to even find the words to describe the heartbreak.
Speaker 4 (17:30):
It was just palpable, you.
Speaker 3 (17:32):
Know, in the room when having that conversation with his
mum and his dad. In terms of what they want,
I would say that Jordan DeMay's parents want two things.
The first thing they want is to raise awareness about
what happened to Jordan, because they believe if it happened
to Jordan, it could happen to any teenage boy. So
(17:52):
both of his parents have been going into high schools
across Michigan talking to cyber safety advocates across the country
and around the world to try and raise awareness of
sextortion. The second thing Jordan DeMay's parents want
is accountability. They feel that Meta, which is the parent
company that owns Facebook, WhatsApp, and Instagram, is responsible and
(18:16):
they've filed a wrongful death lawsuit against Meta to try
and hold it accountable for Jordan's death, and that case
is currently proceeding through the US courts.
Speaker 1 (18:26):
Is there any precedent for that? Is that the first
of its kind?
Speaker 4 (18:29):
Sadly, no.
Speaker 3 (18:32):
There have been quite a number of lawsuits here in
the US that have been filed against Meta and other
social media platforms alleging wrongful death in sextortion cases.
None of these cases have actually gone to trial because
of Section two thirty of the Communications Decency Act. And
that's a law that's existed for decades now. It's existed
(18:53):
since the dawn of social media, or when Facebook was
first created, and it says that these companies cannot be
held liable for the content that people post on their sites.
So effectively, you know, anyone who messages a child or
a teenager and coerces them into sending a naked photo
and then blackmails them or sextorts them, the companies can't
(19:17):
be held liable for that because it's what the individuals
sent and said that caused the crime.
Speaker 2 (19:23):
Is there any progress on changing that? You know,
has anything been done to change it?
Speaker 3 (19:31):
I'm pausing because that is just such a complicated question.
I feel like we're opening a Pandora's box here. Yeah,
And I feel like there are so many researchers and
experts that I would love to have weigh in on this.
I mean, some people you talk to say the winds
of change are coming, and they're coming because they have
to, that the current state of the Internet cannot continue
(19:52):
because children are dying, and Jordan DeMay is one
example of that. We've seen legislation drafted in multiple states
across the US to try and amend Section two thirty
to allow families like Jordan DeMay's parents to have their
day in court to try and seek.
Speaker 4 (20:12):
Justice or accountability.
Speaker 3 (20:14):
And it's possible that we will see some change coming
in the near future, but it is so hard to tell.
This is a big question, and, you know, it's a
political question as well, because it's all about regulation
of the Internet. And the social media companies,
you know, they want Section two thirty to remain, because
(20:35):
without it, they could be held liable for anything that
anybody says on the Internet. So we need to
strike a balance and, you know,
understand what the right way forward is, because the current
status quo isn't acceptable. And, you know, what does it
look like if we overregulate or over-censor the internet?
(20:56):
So it's kind of an argument between safety advocates
who push for change and privacy advocates who push against it.
Speaker 1 (21:06):
Meanwhile, it feels like the children are just sitting
ducks for this. Just to explain, for whoever is listening,
about Jordan: you said that he had
his bags packed, he was going to go on holiday
the next day with his family. But just to be clear,
what happened to Jordan happened in a very short space
of time, didn't it?
Speaker 3 (21:26):
What happened to Jordan DeMay happened within six hours.
And I think that's one of the scariest parts of
this crime is just how fast it can occur. He
was first contacted at ten pm on a Thursday night,
and he took his own life by three am the
next morning. When you read through the message exchange that
(21:51):
occurred in the lead-up to his death, it is harrowing.
It is a really difficult conversation to even read in
black and white. I can't imagine what he was going through
in his own mind when those messages were
being sent. The blackmailer was saying things like, I will
watch you die a miserable death.
Speaker 4 (22:11):
And when Jordan responded and.
Speaker 3 (22:14):
Said, I'm going to kill myself now because of you,
because of what you're doing, the blackmailer replied, good, do
that fast, or I'll make you do it.
Speaker 4 (22:24):
I swear to God.
Speaker 3 (22:26):
And it was those messages that really led the FBI
here in the US to go on this wild goose
chase to find out who was sending them, and that
goose chase ended in Lagos, Nigeria.
Speaker 2 (22:37):
I was just going to say, how much do we know
about these scammers? Who are they and where are they
coming from?
Speaker 3 (22:43):
Well, in Jordan DeMay's case, the FBI
Speaker 4 (22:47):
Did find them.
Speaker 3 (22:49):
They found two brothers in Lagos, Nigeria, Samuel and Samson Ogoshi,
who had created the fake Instagram account. They had
actually hacked into a real account of a young woman
here in the US, so it looked very realistic,
and messaged him and befriended him
(23:09):
and a lot of his friends from his football team
in Marquette, so it looked like a legitimate account. And
then they had a script of how to blackmail him
and how to extort him.
and how to extort him. What the local law enforcement
did once they realized there had been that exchange of
messages in the early hours leading up to Jordan's death,
(23:30):
is they traced the IP addresses of
that Instagram account I'm talking about, and it led them
back to Lagos. And then the FBI got involved, and
they actually traced the account to a specific apartment complex
and to a specific unit, and that's where they found the
Ogoshi brothers. And last year they extradited them to the
(23:54):
US to face wrongful death charges.
Speaker 4 (23:56):
They were the first two men who were.
Speaker 3 (23:58):
Ever extradited out of Nigeria and into America to face justice. Effectively,
when we think about how explosive this crime is, the
FBI says more than twelve thousand teenagers have been targeted
in America and these.
Speaker 4 (24:13):
Are the first two who have been extradited.
Speaker 3 (24:16):
The FBI also says they're looking for more cases and
they're wanting to extradite more people to the US to
face charges. So it's likely we're going to see more
of those extraditions happen in the coming years or possibly
even in the coming months. Meta recently announced that it
had deleted more than sixty thousand Instagram accounts and profiles
(24:39):
linked to sextortion, based in Nigeria. So this is really,
you know, the hotbed for this crime.
Speaker 1 (24:47):
So sixty thousand, based in one nation?
Speaker 3 (24:51):
That's right. More than sixty-three thousand Instagram profiles have
been deleted, and in that mass purge, all of those profiles
were linked back to Nigeria. Nigeria is the home of
the Nigerian Princes, home of the Yahoo Boys. These are
digitally savvy con men who used the Internet to blackmail
(25:11):
Westerners for money, and they cottoned onto this particular form
of blackmail, sextortion, recently. It was really in twenty
twenty-two that this crime started exploding, because the Yahoo
Boys started sharing scripts with one another on how to
blackmail American teens. They'd suggest adding them on Instagram,
(25:35):
and they'd suggest things to say to sound convincing, like
a real teenage girl: you know, I'm from Michigan, I
like tennis, what do you do on the weekends? They
had certain questions that they'd ask the boys that they
were targeting that made them seem real, and they'd share
that with one another, and hey, this was successful. When
(25:56):
I told this teenage boy that I was going to
destroy his life, he paid me five hundred dollars. And
that kind of script was shared across YouTube and across TikTok,
and that's what we're now seeing the social media platforms
take action on, to try and remove those scripts
and curb the spread of sextortion.
Speaker 2 (26:17):
Olivia, you have done an incredible job of investigating this
and bringing it to our attention, and I think that
is so incredibly important because it feels at the moment
that us understanding how this works, having conversations with
our young people, and telling them, you know, it's
going to be okay, but if this happens to you, you
must talk to us, is probably the most important thing
(26:38):
that we can do right now.
Speaker 3 (26:39):
Yeah, I think that the only thing we can do
is talk about it, because unfortunately, for teenagers and children today,
telling them not to use social media and not to
use their cell phones,
Speaker 4 (26:53):
That's just not an option.
Speaker 3 (26:55):
So we know they're going to be on Instagram, we
know they're going to be on Facebook, we know they're
going to be on Snapchat. And these platforms and these
companies aren't evil. They don't want this kind of crime
to exist on their networks, but they're so big now
that it is really hard to monitor and moderate, and
that means that issues like this.
Speaker 4 (27:17):
Can fall through the cracks.
Speaker 3 (27:20):
You know, they don't really know what they're looking for
until it's exploded and become as big as what sextortion
has become today. And because of that, I think that
one of the most important things we need to do
is tell children what's happening, warn them, share with them
stories like what happened to Jordan de May, so they
understand the risks of using these products. And I think
(27:43):
that there should be a form of, you know, digital
literacy in schools as well, where we explain to children,
you know, what it means when you have your photos,
your family, your friends, your location, your life on the internet,
and what the consequences of that are.
(28:05):
And as long as you're aware of it and you
understand that you can come and talk to me, or
you know, your parents or your friends. Your children need
to feel like they can talk to you about anything.
Speaker 4 (28:18):
And after I published this story in.
Speaker 3 (28:21):
BusinessWeek magazine, I went and did a TV interview for
BusinessWeek television here in New York. And sitting
in the green room, which is the space that people
wait in before they go live, I spoke to a
CEO who was waiting to do the next segment, and
he asked if I'd just been on air talking about
the sextortion story, and I told him I had, and
(28:43):
he said that he had read it the previous day,
and he read it the whole way home on the train,
and he didn't stop. And he said, as soon as
he got home, he walked into his fifteen year old
son's room and he told him to read it. And
he sat and he watched as his son read that story,
and he said to him, I was your age once.
I made mistakes too, and I just want you to
(29:04):
know that you can talk to me if you do
anything dumb like this, like come and speak to me
about it. And that's the message that I'd want to
send to parents. I know this is an incredibly hard
era to raise children in, but you want your kids
to know that they can come to you if they
do something wrong.
Speaker 2 (29:23):
Thank you so much, Olivia. Really nice to catch up
with you again. I'm sorry that we're still talking about
such a horrific topic, but look, we really appreciate your time.
Speaker 4 (29:33):
Thank you so much for having me.
Speaker 2 (29:36):
You're listening to The Little Things, and that was investigative
journalist Olivia Carville, who works for Bloomberg in New York
and has extensively investigated the issue of sextortion in the
US and around the world. Up next, Sean Lyons, chief
online safety officer from Netsafe, is going to join us to
talk about how we protect children and young people online.
(29:57):
Will be back shortly after this break.
Speaker 1 (30:03):
Welcome back. You're listening to The Little Things, and today
we're talking about the latest online threats that could affect
your child or teenager. We've just heard some pretty concerning
information about a new trend called sextortion. And joining
us now to give us some practical advice on how
to keep our children safe is the chief online safety officer
from Netsafe, Sean Lyons.
Speaker 2 (30:22):
Thanks for being with us, Sean. My pleasure. Is sextortion
a problem in New Zealand? Are you concerned about
it at Netsafe?
Speaker 5 (30:30):
Yeah, absolutely. And look, just to give you some context,
I think we'd be concerned if there was a single
case of it, but unfortunately it's way bigger than that,
so we're concerned in terms of the numbers as well.
We've seen an increase in reports of sextortion to us,
an eighty-eight percent increase since twenty nineteen. And just to
try and put a sort of rounded idea on that,
it's something that used to be five or six
(30:52):
reports a week; it is now five or six reports a day.
So it is definitely something that's concerning, it's definitely on
the increase, and it affects lots of New Zealanders. So yeah,
we do have that concern.
Speaker 2 (31:04):
Sean, we were hearing from Olivia that
the main target is teenage boys. Is that the same
here in New Zealand?
Speaker 5 (31:12):
There's certainly been a rise in that group. You know,
there was a time when we would have said the
biggest single target was probably teenage girls; we're certainly
seeing a rise now, and those numbers tipping over. But I also
just want to make clear that we see a range
of people, and targeted is probably not the right word,
but a range of people affected by this. So we've
(31:33):
had reports from individuals as young as ten, and we
have reports from individuals over sixty five, So a lot
of people can become embroiled in these scenarios. This is
not a factor of teenage life or adolescent life online,
but it does seem to be one of the places
where the greatest challenges are experienced.
Speaker 2 (31:54):
What do you do if somebody contacts you in a
sextortion scam?
Speaker 5 (32:00):
The first instance is to try and make sure that somebody
is in the best possible situation for dealing with what's
going on. So looking at whether or not someone has
the right kind of psychological support around them, whether or
not somebody has friends and family to talk to,
because a big part of this, a big part of the
impact and the thing that the scammers trade on, is
(32:24):
the kind of psychological isolation, the feeling of embarrassment
and shame, and the feeling of fear about what's about
to happen. So that's the first priority.
But I guess really it's trying to assess just whether
or not this is real, whether or not there was
content made, because awfully, and so predictably with so many scams,
(32:45):
there are copycat, fake sextortions out there.
We've probably all seen emails from time to time that
say, I've got a bug in your computer, or I
switched on your webcam and I've recorded you at adult sites.
They might call you a pervert somewhere in that email
and demand money to be paid, even though there is no recording.
So we have to differentiate between those things,
(33:06):
which are desperately frightening for a lot of people,
and the actual content created, and then know that there
are steps that can help to minimize the potential spread
of that content. Legitimate platforms, the platforms that we all use,
don't want this content on their platform. Most of them
have tools to try and remove it. There are things
(33:29):
that we can do, there are other tools that can
actually go a step further, but the first thing is
to try and establish what it is that's happening
at an individual level, because one of the worst parts
of this is that psychological impact. It is people preying
on the isolation of often young people, and it's trying
to make them take action based on that isolation without them
(33:51):
expanding their support circle, because they feel ashamed
of what they've done.
Speaker 1 (33:56):
Are we looking mostly at money here? Olivia
was talking about cases where the exchange of money
was the main goal of the scam. Is it the
same here, or is it just for kicks?
Speaker 5 (34:08):
I mean, I think so. Increasingly, you know,
as we said at the top, with that change from
young girls to young men, we've seen an increase
in the financial demand. Obviously the fake ones, the
rogue kind of versions, that's all about money. That's
a straight-out, fear-based financial scam. What we saw
(34:30):
early on, where young girls were targeted, often
it was: create one image, and then the payoff,
if you like, was create more images, create video,
video doing certain things that were required. And we know that
that was about the commercial production of child sex abuse material.
So there was a ready market out there, people wanting
(34:52):
to purchase this content. The shift to boys seems to
have correlated with the shift to money as the demand,
and I think more with this idea of organized crime behind it,
because it is just a profitable enterprise, versus what we
were seeing before, where there might have been more
individuals involved in that collection of imagery or video. I
(35:14):
think it's definitely more about organized units that are looking
for money.
Speaker 2 (35:17):
Sean, what kind of scams are we seeing aimed at
children and young adults at the moment.
Speaker 5 (35:23):
One of the issues that we have, I guess, around
scams generally is this idea that there is a type
of person who gets scammed. People often, you know,
people of my generation, suggest that older New Zealanders are
those that are likely to be scammed, and traditionally they
were probably targeted more. But we see scams affecting all
sorts of individuals, and, you know, definitely now scams
(35:45):
whose content is aimed at much younger individuals. One example is that
the start of these sextortion scams is often
a romance scam. You know, I've seen your profile, let's talk,
let's hook up, whatever that might be. But it's not
only young people that get drawn into those. There are
older people that are out there looking for the same
thing or attracted by the same hook. But we've seen,
(36:08):
you know, investment scams, which we used to
think were traditionally about older New Zealanders. With the rise
of, you know, influencer culture and talking about bitcoin,
and, you know, "I'm a super rich influencer that's made
my money on bitcoin," we see scams that
target younger people around that kind of way to make
money quick. So it happens right across the board. And
(36:29):
just to make clear, I'm not a highly paid influencer.
I was just saying that.
Speaker 2 (36:34):
I just find it fascinating that they've worked out
the psychology of the way a scam can unfold
and how it can affect people. As you say, age doesn't
matter anymore.
Speaker 5 (36:45):
No, look, I think part of it is
trial and error. They work with a
particular mechanism, and if it doesn't work, they're quick and
agile enough to ditch it and move on. Change the script,
change the brands involved, change the attack, change the platform,
whatever that might be, and they keep evolving, they keep
changing until they hit a rich seam. And at the
(37:06):
point that they do, they then exploit it, and that's
when they start to show up. That's when we start
to see them happening in numbers.
Speaker 1 (37:12):
It's all there for them to mine, as if it's
the reason it exists, as if it's the reason that
social media exists, for them anyway.
Speaker 5 (37:19):
Yeah, I think there's an awful lot of
mining that goes on in these spaces, on a range
of things, unfortunately. Yes, those that seek to defraud
individuals have found a rich seam in terms of the
way that they can approach people using social media, the
amount of information that they can then glean
once they've got somebody on the hook, their ability to
(37:42):
glean information about someone's social networks, someone's social circumstances,
and then weaponize that against the individual.
But I still think a lot of this
stuff, at the beginning, is not actually targeted. A lot
of this is scattergun. A lot of this is
the approach where you throw out a million of whatever,
now that might be adverts, that might be emails, that
(38:02):
might be WhatsApp messages, but you throw them out into
the ether and you wait to see what comes back.
And it's only then, I think, that the targeting really
goes on. And sometimes it's still very generic, in that
we see scams running through that are the same things
over and over again. We have a scam-baiting
chatbot called Re:scam that we opened to the public
(38:25):
a couple of weeks ago, where we just ask people
to send in their scam emails and allow the bot
to talk to the scammers. So we see the progression of
these scams across all sorts of different types. Sometimes the
responses are identical, sometimes they might actually be bots, but
often there's a very human element to it. They do craft,
they change, they mold what they're saying to fit who
they think they're talking to.
Speaker 1 (38:47):
Yeah, I got an email once and I just copied
and pasted it into Google, and it came back saying, yeah,
that's definitely a scam. So it's quite useful. The more
you gather that information and put it in a public space,
the more you can compare and say, you know, this looks a
little bit off, this doesn't. You're just providing
more data for the AIs. Yeah, I probably am.
Speaker 3 (39:07):
Okay.
Speaker 2 (39:08):
So what we're actually dealing with here is that, just as
much as we're trying to keep our children safe, we
have to keep ourselves safe as well. But how do
we as parents keep children safe online? I mean, it's
slightly unrealistic. We'd all love to just grab those devices
out of their hands and, you know, smash them to pieces,
but then that'd be very hypocritical of us. Kids are
(39:28):
going to be on devices, they're going to be online.
How do we as parents try to make sure that
they're as safe and aware as possible?
Speaker 5 (39:38):
Yeah, well, that really is the six-million-dollar question,
isn't it? And I think, you know, the
first thing I should say is that
parents have a choice here. If parents want to stop
their children using technology, that's an option they
can work out how to take. But I think you're quite
right that this is a big part of young people's
social interaction, being on these platforms, talking the language of
(40:00):
these spaces. It's very difficult to keep young people away,
and if we accept that they're going to be there, then
we absolutely have to try and prepare them for what
it is that they might be experiencing. It sounds a
little bit trite sometimes when we say this, but the
most important thing is the conversations that you have with
young people. And it's the conversations that you have with
them ahead of the traumatic times. It's not the, oh
(40:24):
my goodness, this has gone wrong, now let's talk about it,
because everybody's in such a heated state at that point.
It's about the conversations that you have before they get
the technology in their hand, or before they sign up
for a particular platform. It's about having some kind of rule,
some kind of understanding within your own house about what
(40:45):
is and isn't acceptable for your family in terms of
using technology. But a lot of it is about parents
jumping in beforehand and really understanding what this technology is.
Because when a young person comes to
a parent and says, all of my friends are using
[insert name of platform here], the pester power of
that is incredible. And often parents will just say, well,
(41:08):
who's using it? And they'll give them four names, and
you'll go, well, I think they're fine, so yeah, sure,
you can use it. But really, as parents, we shouldn't
be saying yes until we know exactly what that is.
If our kids are signing up for Clerk, and Clerk
isn't a real thing, I don't think, I'm just using that.
Speaker 2 (41:26):
We're both like...
Speaker 5 (41:28):
That's people's reaction when that happens. But no, it's before they do.
We've got to be saying to ourselves, what is Clerk?
And if I don't know, if I can't
find suitable answers, in fact, even if I can, I
probably should be signing up for Clerk myself. I probably
should be working out exactly what it is. Can you
report harm? What do you do if something goes wrong?
(41:51):
And as parents, we need to know those things ahead
of time. I mean, yes, if something goes wrong, we can
find out afterwards. But it would be way
better if the conversations that we had about the places
and spaces that young people use were based on our
own knowledge as well as our own concern.
Speaker 2 (42:09):
Sawn, our conversation is definitely the place to start is
with conversations, but our conversations enough. Should we be restricting access?
Should we be monitoring them online?
Speaker 5 (42:19):
I think we should be restricting access, but I don't
know if restricting access alone is enough. I mean, I think
when people hear the phrase restricting access,
they immediately think of some kind of technological solution, right?
They think of filters and blocks and those kinds of things.
And yes, they are part of it. Yes, we should
be restricting access, but some of that restriction of access
(42:40):
is in our own activity. It is in the rules and
the understanding that we have with young people. There isn't
a panacea in terms of restricting access that we can install.
I wish there was, but as somebody once pointed out
to me, the fastest way around an age-restriction filter
at home is to go next door.
Speaker 2 (42:58):
I know, I feel like I'm asking you the same
question in different ways, hoping that you'll get to
Speaker 3 (43:04):
It.
Speaker 1 (43:05):
It's the way they think, you know. Teenagers,
at a certain point of their lives, particularly, I
don't know if it's particularly boys, but they feel about
bulletproof, right, and you can't articulate the danger enough to them.
So I guess it's another one that just doesn't have
a panacea, right? We just have to keep banging on.
Speaker 5 (43:22):
Yeah, I think when we start that way, and you
know, I'm guilty, I put my hand up, I've done it,
the "if you do this, the dangers are as follows" approach,
I think those conversations often go over the tops of
young people's heads, especially at a certain age, like you say,
when they feel that they are in that bulletproof stage.
We know that there's an age that they get to
where the most important people in terms of influencing these
(43:46):
spaces are their peers, and that's a really difficult thing
for parents to deal with, that at
some stage their children are no longer listening to them first.
They're now listening to their friends, because that's where they're
getting what they see as reasonable, rational advice, that's where they're
getting the practical solutions from. But I think conversations that
we have with our young people that are about, what
(44:07):
would you do, what are the things that you see,
what are the issues that your friends have had, and
how would you deal with those, are a much better
starting point for that conversation than if we go, here,
you will see these things. Because what happens is we
as adults end up having conversations about our own fears
and the issues that we see, and, you know,
invariably with young people and technology, we're going to start talking
(44:30):
about predators and we're going to start talking about extreme
pornography, because that's what worries adults. But we know that
young people worry about slightly different things, so those conversations
end up being at cross purposes. We need to work
out what it is that young people are concerned
about, not assume that it's the same things we worry about,
and start to drill into those things. What will they do if they find themselves
(44:52):
in a situation where they don't know what to do?
Where would they turn? Who would they go to? What
advice do they need in those situations?
Speaker 2 (44:58):
If our children do get scammed, if they're a victim,
what should we do?
Speaker 1 (45:03):
How should we deal with them?
Speaker 5 (45:05):
Well, the hope is that we do know, because
the worst-case scenarios are the situations where young people
experience these kinds of harmful online situations and don't talk
about it. And we know from our own research that
around thirty-five percent of young people do nothing
in these situations. So the most important thing is that
we encourage our young people to talk about the situations
(45:27):
that they're in, not to fear retribution, not to fear
that they've broken a rule. We can deal with that tomorrow. Today
we deal with the situation that you're in, and we
do that without judgment, and we do that without any
kind of repercussion, because it is the shame and
the fear that keeps young people from talking about these things.
And it's also that same shame and fear that creates
the greatest harm. So, you know, talking to our children
(45:50):
about sextortion, talking to our children about the
chances of this happening, and saying, if this happens, I
don't care what else goes on. We don't mind, we
don't care. You talk to us. Come and talk to us,
and we will deal with this together. Make sure they
understand that they are not in trouble. You know,
(46:11):
being a victim of this kind of scam, these are
sophisticated operations, people skilled and practiced in sucking people
into these situations. This is not about foolishness or stupidity
or gullibility or naivety. This is about getting caught up in
a criminal situation. And we need to make sure our
(46:31):
young people understand that this is not their fault, that
there are things that can be done, and we can
do it alongside them. Because one of the things that
scares young people most in this particular case is the
threat that this picture, this image, this video, whatever it is,
will be released to their parents. If they can talk
to their parents and say, this video is out there,
and their parents say, oh, well, cool, no problem, I'm
(46:55):
not bothered, then you've taken away a huge part of
that fear.
Speaker 1 (46:58):
Yep, no, we did it with stranger danger. Yeah, I'm
sure we can do it again.
Speaker 5 (47:01):
It's a tough conversation, it really is, but it's one
that we need to have, we absolutely need to have.
Speaker 1 (47:05):
I agree. For us as parents and as families, where
can we go for more information?
Speaker 5 (47:12):
Netsafe has specific resources around sextortion. If you
go to the Netsafe website and just type in sextortion,
there's a whole lot of information about what you can
do as parents and about what individuals can do. There are
tools out there that you can use when
these images have been created that will actually block the
sharing and the spread of these pictures
(47:33):
and videos. They will hash the images and
then transfer that signature, if you like, to platforms around
the internet, which will mean that nobody can share or
distribute those images. There are things that parents can do,
support we can give parents on
how to talk about this
with young people. A whole lot of resources. But being
(47:54):
forewarned is the thing that we need as parents.
We need to make sure we understand what the potential
is here and then work out the right way
to deal with this for our young people.
Speaker 2 (48:04):
Sean, thank you so much for your time today and
all your advice. Has been really good to talk to you.
Speaker 5 (48:09):
It's my pleasure.
Speaker 1 (48:10):
Thanks, Sean.
Speaker 2 (48:17):
So, Lou, I think it's probably fair to say this isn't
one of our cheeriest podcasts that we've done to date.
Speaker 1 (48:23):
Since researching for this, I have been having the conversations
and checking in and making sure I'm going in there
last thing at night and saying I love you no
matter what.
Speaker 3 (48:32):
Yep.
Speaker 5 (48:32):
No.
Speaker 2 (48:33):
It's interesting, because when we had Olivia on The Sunday Session,
I went home that night, and that Sunday night at dinner
I said, now, there's something I'd like to talk to
you all about. I was actually interested to know how
much my children knew about it and understood about it.
They didn't feel the need that we had to go
into it at great length, and in a way I didn't either.
I just kept repeating what both Olivia and Sean said,
(48:57):
which was, you guys, you must understand, I don't care
what you do. I don't care if there's a photo
or a video, I don't care. But if somebody contacts
you and blackmails you about it, you absolutely come to
us. It is not a problem and we
will deal with it, and you do not engage with
them and get caught up in that horrible, horrible scenario
(49:21):
that that poor kid in the States did. So I
just kept, you know, I just kept reiterating that.
Speaker 1 (49:28):
You know how everything's worse at night? Everything
feels just a bit harder and a bit more
overwhelming at night. It's like, just come and wake us up.
Just come and wake us up and tell us what's happened.
Everything will be fine. Yeah, absolutely. And that goes for everything.
But weren't you terrified at how quickly they escalate? If
you respond and you get caught up in this,
(49:49):
how quickly, and the language they used to this kid
to drive him to suicide, that was just terrifying. Oh,
absolutely merciless. And yeah, you can kind of even
put yourself in that space, can't you? I mean, you know,
I've told you, when I was a kid, I got
a prank phone call at home, you know,
just with the family, on the landline,
(50:09):
parents there, and he managed to convince me he could see me.
I was in a windowless room, but he still managed
to convince me. So, you know, you can only imagine the
vulnerability of being alone in your room with your phone.
We just have to have these conversations. As we've said,
as both our experts today have said, it's about talking
to them, talking to them, talking to them, talking to them.
(50:30):
You can't get the message through often enough.
Speaker 2 (50:33):
And do all those other things that you want to do,
put restrictions on, limit time, do everything else,
understand the platform, but just remember: conversations.
Speaker 4 (50:44):
Key.
Speaker 2 (50:45):
Thank you for joining us on our New Zealand Herald
podcast series, The Little Things. We hope you share this
podcast with the age-appropriate young people in your life so
that they know a scam or sextortion is not
the end of the world, and to ask for help.
Speaker 1 (50:58):
You can follow this podcast on iHeartRadio or wherever you
get your podcasts, and for more on this and other topics,
head to nzherald dot co dot nz, and we'll catch
you next time on The Little Things.