
December 10, 2020 56 mins

Anna and Audrey, two of Ali Wentworth's friends, share personal stories and advice from their experiences around the dark web as teens.


Ali then speaks to Julie Cordua, the CEO of Thorn, an international anti-human trafficking organization (co-founded by Ashton Kutcher and Demi Moore). Julie digs into who is most vulnerable to sex trafficking, what can be found on the deep web, and more.


Beware of where you end up on the internet.


www.thorn.org 


stopsextortion.com


www.missingkids.org



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome to Go Ask Ali, a production of Shondaland Audio in partnership with iHeartRadio. Hi, I'm Ali Wentworth, and you're listening to Go Ask Ali, where this season I'm asking the question: how do you grow a teenager in a pandemic? More than ever, teenagers are

(00:26):
spending an enormous amount of time online, and one of the scariest manifestations of that is the fear of them being lured into an unsafe situation. The Internet enabled a new era of child sexual abuse, and today we're discussing grooming, sextortion, and how to best protect our kids from falling victim to predators in a digital world. Here to

(00:47):
talk to me today is Julie Cordua, the CEO of Thorn, an international anti-human trafficking organization co-founded by Ashton Kutcher and Demi Moore and dedicated to building technology to defend children from sexual abuse. But before I speak with Julie, I've invited my friends Audrey McDermott and Anna Pei, two twenty-three-year-olds who have graciously

(01:10):
agreed to have a candid conversation with me about some of the things going on in the teen world that parents may not know about. Recently, Anna brought something up that I found particularly alarming, and it actually prompted this episode. As a teen, she received three hundred dollars from a stranger in exchange for a photograph of her feet. Do I have that right, Anna? I wish it was that much.

(01:33):
It was less. Unfortunately, it was five dollars. Alright. Yes, but yes, I think if I had found potentially more impressive clientele, I could have received in the three-hundred range. I know my friends have sold their shoes for thousands

(01:57):
of dollars. Okay, okay. Before we get ahead of ourselves and make every teenager want to do this: how were you even educated about the possibility of doing this? How did somebody reach you? And how did you know to photograph your feet and get money for them? Because, you know, this is gonna take us to a dark place, so I'm starting with the PG version. So, um,

(02:22):
I guess my story begins when I was sixteen years old or so, when I wanted to sell some shoes that I had from middle school online. I went and posted, like, my flats and things like that that I was no longer really wearing anymore. So I posted them on, like, Craigslist and Facebook Marketplace and

(02:49):
some weird niche apps for buying and selling things. Um, because I was a broke teenager, I really wanted to buy more bundle packs or something like that so I could play my video games. Um, so I finally got some responses for my shoes, and at first it was like, great, fantastic,

(03:13):
these are clearly worthwhile for other people. But then the messages got a little weird. I was now suddenly requested to send photos of my feet along with the shoes. And would you say those were fetishes at work? I think that that is what we were dealing with, indeed. Um,

(03:35):
I told Audrey as a joke, who was very alarmed and told her mother, who was even more alarmed, who called me and said, how much money do you need? You cannot sell your shoes online. Um, which honestly made me want to sell them even more. Yes, because you're sixteen

(03:57):
and you're rebellious and a risk-taker. Well, no, no, no. First, to fact-check: what happened was he said, can you send pictures with them? And I said no, very intelligently, because they wanted them for free, and she knows what her feet are worth when they're photographed. And then they said, would you mind wearing the shoes a couple more times before I pick them up, and

(04:19):
she said, yeah, that's fine, assuming that he wanted them worn in for his girlfriend or something. Like, you're very trusting. And then finally, the final straw was he said, when I pick them up, can I take them off your feet? And then, such a good memory. Yeah, these are crazy stories. And Anna, people would remember the story; we would remember.

(04:41):
That has completely slipped my mind. But honestly, that's why the sale fell through, because you said, no, you can't touch my feet. So even though you were a rebellious teenager, you did have some boundaries. Yeah, you didn't do the sale. First of all, there's a huge difference between, like, feet pictures and selling your nudes. That's the first thing I would say, because we walk around in

(05:02):
sandals all summer and no one seems to care, um, slash, like, I don't find that a sexualized part of the body. I've never engaged in any of it, and as my bridge to that side of the world, I'm kind of over on the more conservative side of things.
But the other thing I would say too, is with
stuff like OnlyFans. There are girls from my high school that are on OnlyFans, and I don't

(05:24):
know how much money they're making, but I think what's tough is it does glamorize it, when you read about people like Bella Thorne, who's, like, you know, a former Disney star turned Netflix movie star. She made like two million dollars the first day she opened up her OnlyFans, where she was selling pictures of her boobs. And can you just, wait, I'm Grandma Moses. What is Only

(05:46):
Fans.com? I don't even know what that is. It's a crazy account. It's basically, like, a form of social media where people subscribe to see you posting your nudes. And, not only nudes, it's just posting, like, a personal arsenal of content, videos, photos, but usually nudes, because that's what people pay for, typically.

(06:09):
Is it celebrities doing this? It's celebrities, it's real-life people, you know, your neighbors. Neighbors, yes. Um, let me ask you guys a question. Have you ever looked at the dark web before? Have you ever gone to the dark web? I've watched YouTube videos of people going onto the dark web. You know that you can

(06:32):
buy and sell anything: heroin, guns, child porn, credit cards, artwork, um, fake IDs. So, you know, my concern is that, almost like the feet picture, they can dangle the carrot of, hey, how would you like a really real-looking fake ID? And the teenager goes, oh yeah,

(06:55):
because then I can, you know, go to clubs. And then you go down that path, and before you know it, you know, you're in some weird sex trafficking thing, unbeknownst to you. I mean, it's terrifying. I love the reactions. It's sort of, I feel like you're my own soundtrack to that. But you guys

(07:15):
haven't been on the dark web. I haven't either, but it's always every parent's nightmare. But you do have cautionary tales of people that have, um, been used and ended up on the dark web. And Audrey, I know you have a story, which I think is the most terrifying story I've ever heard. Will you tell it? Um, yes,

(07:36):
I would love to. I heard a really cautionary tale about a girl my freshman year of college, who one of my friends was at school with; this girl lived in her dorm. She ran into her in the hallway one day. She was hysterically sobbing, and she pulled her aside and said, you know, what's going on? This was maybe November of her freshman year, um, and the girl's the same year. And she said, well,

(07:58):
I'm leaving campus and I don't know when I'll be back, because someone basically put out a hit on me on the dark web, and it had tiered financial, like, responses. So it was, like, dollars for murdering her, twenty dollars if you assault her first, thirty if you film it. Like horrible, horrible things that were literally every

(08:21):
person's nightmare. And the girl had to get the FBI involved. She wasn't even a US citizen. She had to be evacuated from her US campus. I mean, I would say that is definitely not a normal story. It ended up being some crazy ex-boyfriend, and thankfully she's safe and nothing ever happened to her. How did she even know that she was on the dark web being advertised like that?

(08:43):
So someone stumbled, this is the thing that's so surprising to me: she had a friend who was coincidentally on the dark web, just surfing around seeing what was for sale. I don't think participating in buying, you know, tigers or heroin, hopefully, but who knows. And they saw her picture, because it had her photo, her dorm address, all of her information posted. They literally, miraculously stumbled across it and immediately

(09:08):
alerted her, and she in turn alerted the school, who alerted the FBI, who put her into protection until she was safe. But the Internet is, like, the biggest platform in the world, and the dark web, like anything, can be abused. I mean, it can't really be used for good, um, but it can definitely be abused. I call the dark web the evil twin of the Internet. Yeah,

(09:30):
I think that's probably pretty accurate. Revenge porn is another thing that keeps popping up. How does a girl guard herself from anything that results in that, either a male or female ex? Well, I think a couple of things. One, it's kind of like safe sex: the best thing you could do is abstinence, and just not take

(09:51):
photos of yourself or videos of yourself being a good
because if it doesn't exist, they can't post it when
you don't, you know whatever. And the final thing that
you just shouldn't send picture is two boys because they're assholes.
But I think that there are like tears of justice
that can be enacted. But I mean, unfortunately, like you know,
when you're young and I love, you trust people and

(10:13):
they can exploit that. Have you guys sent nude pictures, even faceless ones? I'm not gonna lie, I've only done it once, which I feel like is kind of shocking for my personality. I'm really sex-positive and I love, um, you know, just embracing my body and

(10:34):
you know, promoting female empowerment through, uh, embracing our bodies. But when it comes to nudes, I've always been really kind of freaked out. You can be sex-positive and, yeah, my body, myself, but, you know, like we were just talking about, you just don't know what people will do with it. So exactly.

(10:55):
But is that a norm? Is that something that, you know, a lot of people do? Yeah, I mean, I definitely think that a lot of my friends do it much more liberally than I do. I mean, I would say we are probably in the minority of girls who have only done it one or two times. But I think a lot of guys, if a girl they're hooking up with sends a nude and they mention it to their friend, their friend will often say, oh really,

(11:18):
and then it'll get texted around. I don't think it's often posted in an online setting that a random person could stumble across. But I've seen, in high school and in college, like, boys have these locked apps where, you know, there's the secret photos. And I've seen girls who I know never slept with that guy or sent that guy a photo, on a kid's phone. Do people

(11:39):
do people do video sex? I mean, yes. And, wait, what? You meet somebody online and then you have video sex? Oh god, people do that? Yeah, people do that, and I did not do that. I want that clear. I have not done that.

(12:01):
That's problematic to me. That, to me, is something that could somehow get you into trouble. No, because I'm not advocating for that for any teenagers at all. As a nanny, this should never happen. Do not let your children on the internet. Which brings me to this question for both of you: how do you think your generation will

(12:23):
parent their kids, their teens, when it comes to social media and the Internet? I've thought a lot about this. I really, I'm so torn. Um, I discovered porn when I was in the fourth grade, um, because I tried to spell Jesse McCartney Radio Disney. I butchered it somehow. I

(12:43):
asked my nanny, who did not speak English, to help me out, and she, um, somehow spelled it in a way that came up Jesse McCartney nude photos, in some way, shape or form. Um, and so going on there, I was like, whoa, what is this? Were you, in retrospect, very curious? Oh yeah, I was. Well, I mean, your kid, kids

(13:05):
are exploring their bodies, especially during these ages, and so I had no idea what a penis was. What is this appendage on this male who I admire's body? It taught me a lot about how I'm going to teach my child to use the internet, and when the child is going to be using the internet. I think, realistically, kids are going to be using computers much earlier

(13:26):
than I did, to be honest, especially my own children. As a parent, I'm going to just talk to my child frankly. I'm just gonna say, you know, these are things that are on the Internet. I'm not going to hide it from you. There are things like this. I don't think you should watch it. It's not appropriate for you. You can watch it another time when you're much older. But it exists.

(13:48):
But I don't want my child to stumble upon that, be confused, and then go behind my back to talk to other kids or other parents about it, when it is a conversation that's obviously not going to be easy, but is necessary to have. So I agree with Anna. I mean, kids and the internet, I don't, like, even know how to have a kid, like,

(14:10):
I'm still in the birds-and-the-bees confusion. Like, I have no idea how I'm gonna raise them. I don't have a parenting philosophy yet. But I mean, I was introduced to porn at, like, a co-ed hangout when I was thirteen, where a boy opened up a computer and was like, this is porn. And it was terrifying, and it was a horrifying experience. I do think that

(14:30):
you want to have, like, open communication on that stuff. My parents had a thing with us when we were growing up where, if you did something wrong or you made a mistake, if you came to them, you could have infinite things called amnesty. So you just go to them, you go, like, amnesty, and it was, like, the equivalent of, you know, protection from the UN, that you weren't tried as a war criminal in exchange for your compliance. And

(14:53):
I broke the massive, you know, thirteenth-century pot in the hallway, I'm so sorry, knowing that they were going to find out eventually anyways. But it developed this habit of coming to them when you're like, I'm scared, I screwed up, I'm confused, I sent a nude, now I'm hearing this rumor about this boy sending it, what do I do? And, you know, I think that

(15:15):
was instrumental in me figuring out how to deal with the internet. Make it comfortable and free for your kids to come to you, right, and make it like there's no consequences. I mean, obviously, the thing is, kids know more about, like, what's safe and what's right and what's, like, the conservative, mature choice. We're not going to make that choice most of the time, but we

(15:37):
know what the right one is. So the lecture is usually skippable. I mean, have a couple of conversations around it, but I do think that there's a way to, like, be a parent that knows what your kid's online presence looks like without suffocating them. And that's what I would encourage people to strive for. You just have to kind of prepare, so that if something, God forbid, does happen

(15:57):
like that, your child trusts you and knows that they can come to you, and knows that if something really scary happens to them, you will be there for them and you'll protect them as much as you can. Thank you, Audrey McDermott and Anna Pei. I hope my daughters

(16:17):
grow up to be like you. You guys are strong, amazing women. And thank you so much for being on my podcast, and thank you for helping me grow my teenage girls. I adore you. Thank you, mama. We're going to take a short break, and we'll be right back.

(16:42):
Welcome back with more Go Ask Ali, and now on to today's expert, Julie Cordua. Julie is the CEO of Thorn, that's T-H-O-R-N, a nonprofit that works to address the sexual exploitation of children. As digital defenders of children, Thorn has been doing an incredible job

(17:06):
equipping tech companies with the tools to eliminate child sexual abuse materials from the Internet. Julie, you guys are doing work that should be praised and admired for centuries to come. Thank you for joining me on Go Ask Ali. And right off the bat, can you start by sharing the initial genesis of the nonprofit Thorn? How

(17:29):
it started, why it started? Sure, yeah. Well, first of all, thank you, um, for having me. I'm always grateful for the opportunity to talk about this, because sometimes these are uncomfortable subjects, so making them comfortable is the first battle that we all need to fight, so we can talk about this more and get it out of the darkness. Um,

(17:49):
so yes, Thorn was created almost ten years ago now by Ashton Kutcher and Demi Moore. They were starting their philanthropic journey trying to better understand child sex trafficking. They had seen a documentary about it happening in Southeast Asia, and then started to do some research on the issue and realized it was happening here in the United States as well.

(18:12):
Um, and as they dug deeper, they really wanted to understand what were some of, kind of, the risk factors and trends, and what was making children more vulnerable, and really think about how they could use their unique talents and networks and assets to make a difference. And I joined them early in that journey. And one of the

(18:33):
things that we saw, when we spent about a year or more out talking to survivors, law enforcement, nonprofits, government, was this intense and emerging role that technology was playing, not just in child sex trafficking, but in the growth of, uh, technology-facilitated child sexual abuse. And while you

(18:58):
were seeing this growing role of technology in abusing kids, uh, there was no concentrated effort to actually use technology in a way to stop that abuse, to protect them, to prevent it. And so the genesis of Thorn was looking at, okay, how do we bring the best and brightest minds in technology to bear on behalf of
(19:21):
some of the world's most vulnerable children? Um, and that was the birth of Thorn. Yeah, it's so interesting, because I was talking on a panel in Silicon Valley to a lot of big tech company people about the exploitation of girls on Instagram, which to me is a big issue. I just, I fear for it. And afterwards, a

(19:45):
lot of the people that work at these tech companies came up to me and they said, you have no idea how bad it is, and we won't let our children do it. And I thought, oh, well, thanks for keeping it to yourself. And the other thing that I've noticed, when you talk about sex trafficking: my eighteen-year-old, um, she's the one that told me about Thorn, and she's a big

(20:06):
activist, and, you know, she said, the thing about sex trafficking is, when I discuss it in clubs and at school, everybody thinks it's an international problem. They don't think it's a national problem. You know, people always say, oh, sex trafficking, you know, oh, that's in the Middle East, and it's like, no, no, no, that's right next door. Thank you for Thorn. And here's, I know

(20:28):
this is going to be a scary answer, but what are the current statistics of sexual abuse online? Yeah, I mean, what we do know is that the trade of child sexual abuse material is skyrocketing. So in the United States last year, tech companies made over sixteen million reports of images and videos documenting the sexual abuse of a child.

(20:48):
Sixteen million. Mm-hmm. That's in just the United States. Um, and that's only what was found, so, right, it's the tip of an iceberg. Um, as far as how much grooming is happening or how much sextortion is happening, the numbers are really hard to come by, because most kids don't talk about it. So, how, you know,

(21:12):
how do you measure something that no one's going to bring to the surface? And so what we do know is that law enforcement is telling us that the number of cases they're working for grooming and sextortion is on the rise. So we're seeing a bigger trend. But, um, that's only what's getting reported, right. So, uh,

(21:34):
we know it's a growing problem. How big it is is really hard to measure. How has the presence and prevalence of child abuse online evolved since Thorn began? We work on child sex trafficking, and we also work on the issue of child sexual abuse material, which, um, is the same as child pornography. In the law, it's written

(21:54):
as child pornography, but we don't call it that because
it is the furthest thing from any type of pornography.
It is documentation of the abuse of a child. In
both of those areas, the market it has just skyrocketed.
And and really that's over the last two decades, not
just over the last nine years that we've been working
on this. And there's a couple of reasons why. I mean,

(22:18):
you take crimes that used to be very hidden, and it was very difficult to participate in them. So if you were someone who had, you know, a thought about abusing a child or getting your hands on images or videos of children being abused, it took a lot.

(22:40):
There were a lot of barriers to trying to go do that, and you probably felt a lot of shame, and there wasn't really anyone who was going to tell you that that was okay. Well, the Internet has really removed all those barriers, so it's really easy: within a few search clicks, you can find it. And not only can you find content, but you can usually find communities of people, if you try, that are gonna tell

(23:02):
you it's all okay, and they're gonna tell you how to get better at finding it. They're going to tell you how to groom kids for it. Um, and so, you know, those aspects of the Internet have done the same for lots of really good things, right? Like, we can all find, you know, recipes that we like and groups that like to grow the plants that we like. Well, it just does the same thing for bad things

(23:23):
in the world as well. Um, so we've just seen the rate of child sex trafficking increase, but also, uh, the volume of child sexual abuse material just skyrocket. How is it not against the law? Why aren't there laws against this? I mean, I know that the FBI does, you know, deep dives, and they do make arrests

(23:46):
and everything. But how is it possible that there isn't a full government agency dedicated to blocking, stopping, arresting, making this completely criminal all the time? There are. Um, so it's, yeah, no, no, it just, um,

(24:08):
it is an underground crime. So, two factors. One, you have to really go look for it, so, you know, it takes a lot of effort to go find the people who are engaging in it. And then second, it's so big. I mean, we just can't keep up. We have entire task forces across this country, both focused on child sex trafficking and focused on the spread of child sexual

(24:31):
abuse material, and they're busy all day, every day, and there's much more out there, uh, that needs to be tackled. But there are definitely things, you know, while the laws are in place, more resources could be directed to addressing these issues, for sure. So, first of all, what is the difference between the deep web versus the

(24:52):
dark web? Sometimes people use them interchangeably. So, for the purpose of our discussion, I talk about, uh, the open web and the dark web. So the open web is the web we use every day. The recipes? Yeah, it's the recipes. It can be indexed,

(25:13):
so if you Google something, it will show up in a Google search result. Um, the deep web, which is often conflated with the dark web, but they are different, is things that are not indexed. So think about, if you work at a corporation, like, your intranet. It's not necessarily bad, it's just not indexed on Google, right? So it can't be searched. The

(25:36):
dark web is something different. So not only is that not indexed, but it often is fully anonymized and encrypted, um. And the dark web often was built for good purposes, like national security, um, but has been hijacked, um, in many ways for illegal activity, this area being one of those. Well,

(26:01):
I've never been on the dark web, but I have heard lots of horror stories about, um, people that for revenge porn or whatever have put people on the dark web. Um, I'm afraid to even investigate it for my own curiosity. I'm just terrified of it, just even saying it. Um,

(26:25):
but it is where the majority of child sexual abuse material is circulated, am I right? So, we don't know. There is a lot there, definitely, um. But the scary thing is, there is a lot on the open web. So, going on these sites that

(26:48):
we see every single day. So an eight-year-old boy, if he googled boobies, he could pretty much get to child pornography pretty quickly, right? Um, if he googled that plus, you know, a few other terms that are related to children. Um, many of these companies

(27:08):
implement tools to make sure that that type of abuse material is removed, but, um, sometimes it slips through. And what is sextortion? Yeah, it's like extortion, but related to, uh, sex or sexual images. So this is where we see, when we talk about teens and growing teens,

(27:30):
really one of the kind of biggest watch-out areas right now. So it is essentially someone extorting sexual images from you, or holding something over your head and getting you to participate in this type of exploitative behavior, usually self-generated. So give me an example, give me a story that you've dealt with. Um, sure.
(27:52):
give me a story that you've dealt with. Um. Sure,
so someone meets a stranger on mine. So perhaps you
were talking about Instagram and you know, maybe the sexualization
of girls. Someone, Uh, they have a public account, anyone
could comment on their images. They start commenting on their images.

(28:12):
They then get into their DMs. These people build what they think is a relationship, but they really have never met each other. Um, she, or he, it happens to girls and boys, um, thinks that this person is their friend, so friends them on multiple other social media platforms. And then, all of a sudden, this person starts asking for sexual images. Show me this, show me that.

(28:34):
They send them an image, and then that person says, I need more. I need a video of you doing this. And the child says, you know what, I'm done, I don't want to. And this person says, well, you know what, you friended me on Facebook. I now have the DMs of all your family and everyone you go to school with. And if you don't do this, I'm going to take all those pictures you just sent me, and I'm going

(28:56):
to send them to everyone in your school, and I'm going to send them to your mom. So do it. And then, all of a sudden, this child is in a position where they are producing worse and worse documentation, um, of abuse. And we've seen cases where not only is it self-production, but where they are then asked to abuse a sibling or someone else, um,

(29:20):
in the home. And, you know, anyone is vulnerable, right? We look at child sex trafficking, and that affects, you know, we can talk about that at another point, predominantly, um, a more at-risk population of youth. This type of grooming and exploitation can really reach anyone

(29:43):
who's online. So let's talk about that. Who do you feel is the most vulnerable to be targeted for online grooming or sexual exploitation? Yeah, so, um, for online grooming and extortion, it is really any child that is
(30:03):
on social media or a connected device or in a community. It doesn't discriminate. So, uh, any child that can be connected to a stranger, or really anyone who might want to, um, extort them, is vulnerable. When we work on
child sex trafficking, we see different vulnerabilities. That is much

(30:27):
more likely to be someone who has a risk factor of being in child welfare, being from a home where there's violence or child sexual abuse, being a runaway or homeless. But those factors don't hold when we get to online abuse and grooming. It really is any child that's on a connected platform, and so they are highly vulnerable to someone

(30:52):
who comes along and says, hey, I'll take care of you, and then they start selling that child for sex. And the way child sex trafficking manifests in the United States is that, oftentimes, these children are sold on escort sites as adults, and so the person buying them doesn't really know that they're fourteen or fifteen years old, and that's not used as an enticer. I mean, do they use children?

(31:15):
I mean, there's obviously going to be people that prefer that sometimes, but not the majority. And so there's a lot of false narratives out there right now about child sex trafficking in the United States that are really trying to drive fear through communities. But the reality is that there are some high vulnerabilities that are associated with child sex trafficking. And, granted, it can happen to anyone,

(31:39):
but it predominantly happens to children who are already vulnerable because of another reason. When we talk about online grooming, we don't see that similarity. The main vulnerabilities are: are you online, and, um, are you putting yourself in a position where you're having conversations with strangers

(32:00):
or building those relationships? Do you feel like you don't have a safety net, an adult or caregiver to talk to, in case something bad happens online? So there are very different profiles of what could put a child at risk for those two different types. I mean, I'm like a hawk with this stuff, with my kids' stuff. And still, my daughter, who doesn't do the sexy, sexy

(32:24):
Instagram at all, she was on Skype or something, and this thing popped up and said, hey, how'd you like to have a sugar daddy? Which is terrifying. So besides a teenage girl who's doing sort of sexy Instagram, walk me through how a teenager can be groomed online. So, usually someone who's grooming a child online is not just

(32:44):
grooming one child. They usually have multiple children, and they're perfecting their technique. They're seeing what happens. They might be blindly reaching out, like you said, on Skype, or they're watching their Instagram or their social media posts and understanding really what maybe motivates that child. One of the best safeguards to, uh, ensuring that a child isn't susceptible is

(33:07):
having the conversation with your child before it happens. So if that happens first, before a child has any idea that that could happen, like, why did someone just ask me for three bucks for a foot photo? And they think, at that point, oh, that's funny, I can make three bucks, as opposed to: they already knew, the minute they were handed the device, that they might

(33:28):
get texts or messages from people who they don't know, and those people might try to convince them that they're friends. But here's what happens: we're always going to have an open dialogue. As the person who's handing you that device, I'm always gonna be here for you. I am not going to get upset with you. You can't do anything

(33:48):
wrong in this environment that would get me angry. I
want you to be safe. Have you found that
a lot of children that have been vulnerable to
this kind of stuff, have kept it to themselves, have
been fearful of communicating it with other people, and that's
why they become victims of it. Yeah. Well, in our research,

(34:10):
one of the main factors in an extortion or grooming
situation getting out of hand is that the child felt
like they couldn't talk to anyone. So when the
perpetrator has that much control over a child, where they've isolated
them from their family or their friends, um, then it

(34:35):
gets bad very quickly because it's when the kids feel
like they don't have someone to talk to that they
feel trapped, and then the person who is grooming and
extorting them is winning. They're going to keep them in that bind.
I feel like we do so much parenting with the
real life scenarios. You know, if this uncle or this person,

(34:59):
you know, you have to come tell me. But I
find less and less parents do this with devices. I
almost think it should be part of the ritual of
giving your child their first device, this whole piece
of education. The minute a device is put into the

(35:21):
hands of a child, you start this conversation depending how
old they are, um, right, and it has to be
age appropriate, and then you have it often so it's
not a one and done. You hand them the device,
you talk about it once a week, you sit down,
you say, show me what you're doing on those apps.
Would love to better understand what Among Us is, you know,

(35:41):
show me how to play Fortnite, and have those conversations,
and then at some point you really have the direct
conversation about what the threats are. And I think for
many of us who are parents today, we didn't grow
up with any of this. So it is, it's scary,
and it's unknown. We have to remove the fear from it.

(36:05):
We have to have these conversations as just normal conversations
about how we prepare our children to live in a
digital world. Think about it as empowering them, making them
more resilient, um helping them navigate it, versus coming from
a place of a lot of fear where we're gonna

(36:28):
shut it down or get really you know, angry, because
then they don't feel like they're going to have a
safety net, right, And whoever makes them feel comfortable online
will make them feel more comfortable than their caregiver, the
person who needs to be there to support them as
they navigate this world. Now, a quick word from our sponsors,

(36:57):
welcome back to Go Ask Ali. Let's get back to
the discussion. What are the suggestions, particularly from Thorn,
about making social media safer? Are there other tools
or techniques that we can use to kind of keep
our kids from this kind of harm. Much of this

(37:18):
is based on kind of age appropriateness. I think one
is have a conversation early, have it often, understand
what applications our kids are using, and understand the safety
features on it. So there are some applications where the
default setting is that kids can chat with anyone or

(37:40):
their accounts are public. So make sure you don't just
let them sign up. Go in and see: are
these accounts private? Can they not chat with anyone? Can
they chat with only someone who's in the house. Can
they chat with someone who's on their friends list? UM,
go through on a regular basis and have them talk
about who their friends are that are on their list, right? So,

(38:03):
if you've had an agreement that I'm okay with you
friending people in this environment, but I want it to be
people that you know in real life. So once a week,
let's talk about who's on that friend list and
explain to them why right. So these blanket rules without
an explanation of why, um, I think are hard for
kids to consume. But kids want to be safe too,

(38:25):
So if you can explain to them why, um, I
think that's important as you go along. So, for instance,
a twelve or thirteen year old, how much do you divulge
about a person that could be grooming them? Do you tell
them it's for sexual exploitation? Like, how far do you go?

(38:46):
Sometimes you have to give a little bit and then
see where they go. Like, every child is different, right,
It might be based on what they've heard
at school. Um. If they're in a group of friends
who maybe aren't online a lot or don't talk about
these things, they might not be ready for a more
in depth conversation. But if they've heard things at school

(39:06):
and they're curious, you start the conversation, give a
little, see what they ask. If they're asking, it's best
to answer it factually. And you're a mother as well.
You have three kids, so being immersed in this world,
what kind of conversations are you having with your children?
I know they're younger, but no, but still, I mean,

(39:30):
I think I have taken the position of trying
to be very factual, straightforward, non emotional, creating
a safe space so that they know the facts
going in and they feel like they can come to

(39:50):
me if anything goes sideways with their friends or themselves.
Sometimes I have to take like ten deep breaths before
I go into a conversation because I know a little
too much. But I don't want them to feel fear.
I want them to feel empowered, and so I really
just try to anchor all my conversations that way. So

(40:12):
I have one child who will ask all the questions
and so I give all the answers. I have another
who won't, and it's just not even on their radar.
So I just say, you know what, this person
may not have the best intentions for you,
or we just don't know, and so let's focus on

(40:32):
the people we do know and our friends here. I
have those same two children, and with my younger one,
the one that really doesn't want to hear and gets very
frightened by stuff, you know, we keep it in the
stranger danger thing. Whereas my older daughter, who wants
to know everything, she could rattle off every sex trafficking
statistic around the world, like she confronts fear by giving me,

(40:55):
give me all the information, that's my ammunition, tell me everything.
So I mean, I think you're right. I think you
have to kind of create a scenario that's age
appropriate, whether you're using these learning techniques or telling stories.
The other thing I wanted to talk about was, um again,
I call it real life. In real life, I've had

(41:17):
situations where, uh, like one of my daughter's friends came
over when she was fourteen, and she was asking me
all these questions about statutory rape, what does it mean?
And how old you have? You know, to the point
where I was like, this is a huge red flag.
This fourteen year old is asking me questions about statutory rape.
How do I look for those red flags online? Yeah? Uh,

(41:44):
it depends by age, right? So there's also something,
too, as kids get older, about demonstrating that you respect and
trust their privacy while balancing their safety. So when they're younger,
I go back to some of those maybe like weekly
conversations about looking at friends lists or understanding apps if

(42:05):
you've started to do that when they're younger. When
they get older, when you want to respect their privacy
and let them do their thing, then it comes
down to have you planted the seeds of an open
conversation so that instead of actually having their device in
their hands and looking at it, you can talk to
them and say, you know, have you experienced anything

(42:27):
online lately that makes you feel uncomfortable? Are there
any challenging texts or messages you've gotten that you
know I can help you with? Maybe red flags,
especially when we're in a pandemic. Is have they become
more withdrawn? Are they in their room alone with the
door closed for extended periods of time? UM? Do they

(42:50):
seem more stressed or irritable, or, you know, always
having their device to a different degree than they used to? Now,
this may mean something else. It may just
mean that we're in a pandemic and kids are on their devices all
the time, so you have to be careful,
but you have to partner that with trying to have
those open, non judgmental conversations with the kids, UM, so

(43:13):
that they feel safe raising concerns. It's interesting. Have you
guys discussed at all at Thorn about being in a
pandemic and how people are not as strict about device
rules right now? And how more and more kids are
on their devices, much more than they were pre pandemic,

(43:34):
right, right. We see all of the ingredients for
it to get worse. I think we'll have some
reflective studies, you know, a year from now, when
we get through this and see what it was. But
you know, you have more kids online, you have more
people online, and just as I said before, all the

(43:56):
good that has come with the Internet, all the bad.
So when there's a rise of net presence, you're going
to get a rise of bad things happening. But I
I also think in addition to the perpetrators, you know,
potential perpetrators online, we have to just look at the
behavior of sexting. I don't know if that's something your

(44:17):
kids have brought up, but just how common you know,
consensual sexting is, and that something that can go
from this is my boyfriend, I actually know this person
and I've chosen to send a naked image can turn
bad really quickly when that person decides to send that
image on to the rest of the school. Well, I'm

(44:38):
glad you brought this up because I don't understand sexting.
I don't understand it. I mean it's almost like when
you know, twenty years ago people would say I made
a sex tape. Why would you make a sex tape?
It's gonna end up in the wrong hands, like nothing
good can come from it. Um and again, this sexting
is something with my own children, I say, there's no

(44:59):
reason ever to do this. It will live forever.
You know, Harvard will see it. You're not going to
Harvard if they get a whiff of this. But also
you know it's going to end up in the dark web.
But it's it's become so commonplace. And you know, I
also think, I mean, that's child pornography if you're a
teenager and you're sending naked photos to your boyfriend. And

(45:21):
by the way, the boyfriend, just because his frontal lobe
hasn't developed, he's going to show it to all his friends.
And it's just terrifying to me, the sexting thing. But
when you have adults around you that do it because
it's you know, sexy and cute, somehow it takes the
stigma off of it, I think. Or we'll see it
in shows and you know, it all seems normal. I

(45:42):
said to my daughters, if you ever get sent a
dick pic, you bring it to me right away, I
will destroy that boy. Yeah, we're doing
a lot of research right now with kids. About two
years ago we started looking at this issue because again
we focus on the broad you know, group of child
sexual abuse material. What we were seeing was that a

(46:05):
big portion that was growing within that. A lot of
that is produced by perpetrators documenting hands on abuse, but
what we were seeing was this trend where a large
part of it was becoming self generated. And it's
not all generated when someone grooms you. A lot of
it was happening in this way through consensual sharing. And
we went to go look at the studies that were

(46:25):
out there, and we found that a lot of them
were kind of five years old, and things moved so quickly.
We thought, we've gotta understand right now what
kids are experiencing and trying to understand the conversation that
might change behaviors or make kids more safe in a
situation where this is happening. I'm not sure we're putting
the genie back in the bottle, right? So, in a

(46:45):
in a world where kids are going to do this,
how do we make them as safe as possible. Even
with teenagers I know or my own, you could easily
see the scenario of, you know, they have a boyfriend,
they trust him, he says show me some sexy pictures, they do that.
Later on they break up, there's revenge porn, meaning the

(47:06):
boy or the girl uses these pictures against them. It
ends up in the dark web, and then you've got
the sextortion right there, right? Yes, she can be
blackmailed for this these pictures she sent to her you know,
cute boyfriend in khakis and a tie, who seemed like
the nicest, most honorable boy. It can happen to

(47:30):
anyone. The survey we did of kids found
that about one in five teen girls had shared a nude, and one
in ten teen boys had shared a nude. But over
forty percent of kids thirteen to seventeen had seen nudes of someone
else without their consent. So that's nearly half of kids

(47:52):
have been shown, you know, a picture of
someone else on their device. And this
gets back to teaching our kids the idea of consent.
So how can parents define consensual sharing for their
kids when it comes to social media. I think we
should start with outlining if you're going to share images

(48:16):
of your body and, you know, what you consider private.
Everyone has that differently. Just understand the consequence and understand
what can happen, even if in that moment in time
you firmly believe the person you're sending it to will
never do anything. We've got to expand the realm
of possibilities for kids. So let's make sure a child

(48:38):
knows that. And then second is if you receive an image,
asking yourself was that image intended for me? Is it
being sent to me with the intent and consent of
the person whose body it is representing? And if not, um,
it needs to stop at your device, and you probably

(48:58):
need to alert someone, either a caregiver or
someone at your school, and really help stop that.
And then the third
would be to talk about how it's a criminal offense,
child pornography. I mean, do you go into the scare
tactics at all? I think that's part of the first conversation,

(49:20):
you know, just in a factual way, like let me
just talk to you about what you're gonna be coming
up against. In middle school or high school, you
might get asked to share a picture of your chest
or your private areas. I want to prepare you with all
the facts before you get that question. One, it's
illegal, you know. Two, you might think you're sending
it to the boy who says he loves you or

(49:40):
the girl who says she loves you,
but it could go elsewhere. And how are you
going to feel if the whole school has that picture? Right, Like,
let's just talk through this scenario so that you can
process this before the time comes. Um. So that's one,
and then the second is if you are a recipient
of that, what is the definition of consent and intent? Um?

(50:04):
And so these are just new conversations that we're not
used to having that we have to make normal and
have with our kids. All right. So, as
a parent and citizen, of course, I want to know
how we combat this epidemic. But also because I
want to circle back to Thorn, because you're addressing all

(50:24):
these issues, can you tell me a little bit about
Thorn's strategy in working with law enforcement and tech companies? Yeah, so,
you mentioned this at the beginning. We have a bold goal.
We want to eliminate the trade of child sexual abuse
material online in the next ten years, and uh, we
are building the technology systems globally to do that. And

(50:48):
we think that there's three kind of key things that
we have to do. One is we're building solutions for
law enforcement to help them find children faster. So every
image or video that we see online is a child. Um
some of those represent children who are being abused right now,
and some of it's just recirculated content. So you know,
once an image goes viral, it can be seen millions

(51:11):
of times. So you have to use data and technology
to sift through all those mounds of data, get down
to where there's a victim right now, and help law
enforcement have all the information they need to find that child.
So we build solutions for law enforcement to help them
do that and get to the kids who need their
help immediately. The second thing that we do is every

(51:34):
single company, and here we're talking about the open web,
that hosts user generated content, Instagram, VSCO, Facebook, Flickr, Shutterfly,
anywhere you can post an image or video, I believe,
needs to proactively be detecting child sexual abuse material
and ensuring it doesn't go viral on their platform. That

(51:55):
is a basic standard I think for any tech company,
and the reality is that there's probably fifteen to
twenty that actually have the systems in place to proactively
do this, and yet there's hundreds of companies out there.
And so that's an area of work where we've actually
built software for companies to make it really easy so
they can no longer say like I can't afford to

(52:17):
build this, or I don't know how to build it.
We've built it. They need to use this software, or
some other type of software, to detect this type of
abuse content, which is also illegal to have on your
platform, at scale. And so in those two areas we build technology.
We connect the data around the world so that it

(52:38):
moves really quickly, so law enforcement and tech companies have
the intelligence they need to find kids and take content down.
And then our third area of work is what we've
been talking about today. There is no technology
that you can put in place that can change
behaviors with kids and caregivers. These are conversations

(53:00):
that we need to be having. This is why we
need to be talking today because we need to reach
moms and parents and dads and caregivers and teachers who
are with children to really normalize this conversation and realize
that when we're talking about sex ed, when we're talking
about developing kids, we have to add in the fact
that they are developing in a digital world and they're

(53:22):
gonna do some crazy stuff and we need to arm
them with the information and the security and the
resources to reduce the potential harm. Absolutely, God, I agree.
Is there any kind of software that a parent can
use now so that we can see their devices? There are
software products out there that you can put on kids

(53:46):
devices or home devices to you know, detect potential threats. Um.
I don't know any off the top of my head,
and I think some may work well. My only caveat
to that is that I don't think that type of
software is a substitute for the good conversation and the
open door. Yeah, every time your child is given a

(54:09):
device, you go, this comes with instructions from me,
sit down, grab a ginger ale, and let's talk about this.
So um, yeah, you guys are just doing such great stuff.
Is there any government organization that we can write to?
Is there any push we can sort of accelerate from
our side? I think stopsextortion dot com

(54:33):
is a resource that, I think, parents can
pick up, one, to educate themselves, and two, to share with
their schools, because there's school resources on there of
how to start to have this conversation. There's a cute
little video on there that makes it really easy. It
has cats in it. Everyone loves a cat. Uh, some
people don't. Okay, some people don't, but it'll

(54:56):
make you laugh. Um. It really kind of breaks down
some of the barriers to this conversation in a really
really easy way. And then if you or someone you
know is being extorted or being groomed and you need
an outlet, the National Center for Missing and Exploited Children
has a hotline. You can call that hotline and report it.

(55:18):
I know I have been approached by many parents who
feel like we don't know where to go. If you
call your local police department and this is a digital
crime and they don't know where the perpetrator is, they
might not know how to even handle this. So reach
out to the National Center for Missing and Exploited Children if
you need help now. Uh. And then thorn dot org

(55:39):
which is our organization, has a ton of additional resources
and research if you want to learn more. Julie Cordua, thank
you so much. Well, thank you for being brave enough
to put this out there. It's not always comfortable
to talk about. Sometimes it can be scary for us parents.
I'm eternally grateful for that. Thank you. Well, as

(56:00):
Kara Sawa says, never look away, and I believe that,
so thank you so much. I agree. Thank you. The
power of informing, being honest, and being an advocate. Just
recapping some of the main recommendations from today's episode that
parents can use for the future. Thank you for listening
to Go Ask Ali. Remember to subscribe to Go Ask
Ali and follow me on my social media: Twitter, at

(56:24):
Ali E Wentworth, and Instagram, The Real Ali Wentworth. Go
ask Ali is a production of Shondaland Audio in
partnership with I Heart Radio. For more podcasts from Shondaland Audio,
visit the I Heart Radio app, Apple podcast, or wherever
you listen to your favorite shows.