Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Welcome to the Next Talk podcast. We are passionate about keeping kids safe in an overexposed world. Today we're talking about the Kids Online Safety Act, known as KOSA, and we have brought in a special guest. Her name is Tori Hirsch. She's the legal counsel at the National Center on Sexual Exploitation. Tori, we're glad you're here. Please tell us a little bit about yourself.
Speaker 2 (00:30):
Yeah, it's so good to be here. Thank you for having me. I am legal counsel with the National Center on Sexual Exploitation. We're a nonprofit, nonpartisan organization committed to using corporate advocacy, public policy and litigation to fight for a world free from sexual exploitation and abuse. I'm here to talk a bit about KOSA, which is a bill I think most people have heard of at this point, and about what it would mean for our kids and social media.
Speaker 1 (01:06):
Absolutely. I think we see it in the news, we hear it, and so I want to break it down real simple. What would it do if it passes? What would that mean for the protection of our kids?
Speaker 2 (01:19):
Yeah, so just generally, KOSA is a bill that targets the design features of social media companies and other online platforms, including some video game systems and messaging apps, and requires them to design with the safety of minor users in mind. That's really what the bill sets out to do. It is longer than that, but that's the summary. Right now KOSA is the most comprehensive child safety legislation pending in Congress, and it fundamentally shifts social media platforms' focus away from a duty to profit, make money and get more users onto their platforms, and makes them consider children's safety and design with their well-being in mind.
Speaker 1 (02:09):
So what are some practical things that it would do? Like, say, I have a kid on Instagram and their account is private. How would I see these changes on their account if KOSA passed?
Speaker 2 (02:22):
For one, it requires platforms to default to the highest level of safety for children. This includes disabling private messaging from unknown adults to minor accounts, which we know is often the mechanism by which sextortion schemes happen on Instagram. Adults are no longer going to have access to just reach out and message those accounts.

Additionally, it limits design features like infinite scrolling, auto-playing videos, rewards or incentives to keep users on the platform and keep them addicted, and other features that result in compulsive usage. Essentially, tech is going to be required to slow the addiction that children have to using their platforms. It also affects the algorithm: it's going to disable algorithms that connect predators to minor users.

And then, on the parental control side, like I said, safety settings are defaulted for minors. This means private accounts by default, which is huge, and these can be changed or enabled with parental consent. It puts that back into the hands of the parents, which is extremely important. It's also going to require easy access to parental tools: making it easy to see the account settings their children are using, allowing parents to restrict purchases on platforms, view their screen time and usage, and set limits on that, things like not being allowed to access the app during certain hours. A good thing about KOSA is that it applies to existing accounts as well, so it's not just accounts created after KOSA is passed. It's going to be all accounts for minors.
Speaker 1 (04:21):
And so they will use, I assume, the birth date that you entered when you started that Instagram account. Is that correct?
Speaker 2 (04:30):
That should be correct.
Speaker 1 (04:31):
So, parents, I want you to take note of that, because if your kid has an account that you don't know of and they entered an incorrect birth date, that's going to be an issue. That's why the conversations with your kid at home about keeping them safe online are so important. But I love all the defaults. As you were running through that list, one of the things you said was that it's going to limit contact between unknown adult predators and child accounts, and I just kept thinking: why is that not the case right now?
Speaker 2 (05:01):
That's the question we're asking too. I don't have a good answer. This is something that advocates in the space have been asking for, saying this simple fix is going to have meaningful ramifications. It's going to keep actual, preventable harms from occurring.
Speaker 1 (05:23):
Two things that I really love: the default to the private account, which is something we've always advocated for, that if you have a minor on social media, they need to have that private account. But I also love the restricted hours. Is it called sleep mode or something like that? They referred to it that way, basically meaning social media is off during these hours.
Speaker 2 (05:47):
Right, and I don't think KOSA specifically uses that language, but I know Instagram's new teen accounts, which you might be familiar with, were conveniently launched the day before KOSA was marked up in the House Energy and Commerce Committee. That's one of the things Instagram is now implementing. KOSA is getting at the design features, and that's a design feature. We know a lot of sextortion happens in the middle of the night, when kids are isolated and alone and on their phones, so if they're not on them during that time, hopefully things like that will cease to happen.
Speaker 1 (06:22):
Predators are coming into the bedrooms at night through the phones, getting access to our kids and manipulating them, so the more we can prevent that, the better. I know we advocate no phones in bedrooms at all at bedtime. That's one of our big Next Talk principles. Have you seen a lot of support? Tell us about that, because I think there's a lot of bipartisan support. Educate our audience on that.
Speaker 2 (06:45):
Absolutely. This is an extremely bipartisan bill. It passed the Senate with 91 votes in favor, so if that doesn't tell you something about the bipartisan support, I don't know what will. It has that support because I think people understand that this really is needed to protect children, and it's the only bill that would comprehensively regulate social media in this way. We haven't seen a comprehensive bill that addresses social media like this. Congress has seen the harms. They've heard about the harms from parents and from children who have been impacted, and we're at a tipping point where it's finally time to take action and get this passed.

Right now, where it's stuck is in the House. There's been a lot of misinformation spreading from big tech about how it would impact First Amendment rights, free speech, and conservative, pro-life views, and at the same time, they're spreading messages to Democratic House members saying it's going to prevent LGBT youth from having access to meaningful resources. Neither of those things is true. This bill gets at design features of platforms, not content. That's very explicit in the bill. So at this point, I think we can really point to big tech and say they don't want to be regulated in this way because they're not liable right now.
Speaker 1 (08:15):
Well, if they're regulated, then there's liability that comes with that, correct? And also, you're losing consumers because you're losing the time that you're pushing for those kids to stay on and stay scrolling.
Speaker 2 (08:28):
And they're losing ad revenue. With the algorithm, I think probably every third or fourth video is an ad. That's how they make money, and if they have fewer users on their platforms, they're not going to be able to sell that ad space.
Speaker 1 (08:44):
Well, I want to go back to how you said the sleep mode was Instagram. I'm glad you corrected me on that. When they rolled that out, I know some people were frustrated that they conveniently rolled out some new safeguards that were basically some of the things in KOSA. I actually thought it was almost a credit to all the advocacy work, because they know they're in trouble and they've got to do something, and that is all the hours of parents going and testifying about how their kids have been harmed. I know some of these families personally who have gone and testified and who have lost their children to sextortion schemes on Instagram, so I just think it's wonderful that we're able to use those voices for change, whether it's through Instagram rolling out new changes or through KOSA. Both would be great. How hopeful are you that it's going to pass? And what do you think about the timeframe? Do you have any kind of idea of what we could expect?
Speaker 2 (10:05):
So right now, the chance KOSA has to pass is during the lame duck session, which is later this year. If it passes, it'll pass during lame duck.
Speaker 1 (10:20):
So we should keep our eyes on that, and if it doesn't pass then, the next year, but hopefully it's going to pass right after the election. That's what we'd be hopeful for. I just want to thank you for all the work that you're doing to help kids and families. I know at Next Talk we're really focused on parents being involved in their kids' lives and online, and there are millions of kids out there who don't have involved parents. I feel like that's where KOSA and these new safeguards that Instagram is rolling out will really help protect these kids and put at least some safeguards in place for them.
Speaker 2 (10:54):
You hit the nail right on the head. Tech has consistently put the responsibility and the duty on the parents, but parents can't do everything. They can't be looking over their kid's shoulder on their phones. The safety settings that exist right now are sometimes really hard to find, sometimes they're buried, and it is time for tech to take some responsibility for making sure kids are safe on their platforms.
Speaker 1 (11:23):
It also goes to show that when kids open accounts, it's going to be really important, going forward, that they put their actual birth date in there so that the safeguards can be in place. So, parents out there listening, this matters. Your relationship with your kid matters. We don't want your kid lying to you. We want them learning the platform with you. That's how they can be healthy and have all these KOSA safeguards in place.
Speaker 2 (11:46):
You know, thank you to the parent advocates who have been really brave and willing to share their stories, or their family's stories. I can't even imagine the things they've been through, and I think it's absolutely meaningful to representatives and senators to hear those stories. Those are really what push change forward. So keep being brave, and thank you for using your voices to push for change.
Speaker 1 (12:15):
So well said. And we know of families who have been able to save their kids because of an awareness story, like, oh, we've heard this story. They're sharing their story through KOSA advocacy, putting a face to a tragedy, and people are actually understanding that this is happening. One of the things that has shocked me is that I just thought some of these extortion schemes claiming the lives of these kids were so isolated, and the more podcasts we've done with these parents, the more people have come out of the woodwork to say this happened to my kid as well. It's crazy to me that it's not so isolated anymore.
Speaker 2 (12:58):
Sextortion is certainly on the rise. We've seen that reflected in statistics from the National Center for Missing and Exploited Children. They publish annual statistics about their reports every year, and it's absolutely on the rise. We've actually seen really disturbing handbooks from the dark web about how to successfully run a sextortion scheme, with details down to how to ask certain questions or how to build a relationship. So I think it can't be overstated that these harms exist and are real and are out there. It's extremely important to be vigilant, and that's why we need these changes in place too, because sometimes, as parents, I'm sure it's hard to know what your kid is up against.
Speaker 1 (13:49):
Well, I think most parents would agree. It is our responsibility, but we need help. We need help, and I think that's where KOSA comes in, helping parents navigate all of these online challenges that we're faced with.
Speaker 2 (14:05):
Parents and tech for good is what we want to see.
Speaker 1 (14:09):
So there's a term floating around that I've been seeing, and I want you to explain it to our listeners and to me: image-based sexual abuse, IBSA. Tell us what that means.
Speaker 2 (14:23):
Yeah, so, like you said, IBSA is kind of how we refer to it in casual conversation, but it is essentially the sexual violation of a person committed through abuse, exploitation or weaponization of any image depicting that person. So it includes creation, distribution, theft, extortion or any use of an image for a sexual purpose without the meaningful consent of the person depicted.
Speaker 1 (14:56):
Could this include nude photos that were willfully exchanged but are then being used for revenge?
Speaker 2 (15:04):
Yes, yep, that's
included in the definition.
Speaker 1 (15:07):
And it would also include, obviously, any sextortion cases, any grooming cases where kids are manipulated into sending nudes to a predator online, that sort of thing.
Speaker 2 (15:19):
Yes, that's right.
Speaker 1 (15:21):
So tell me also, we just did a show on AI deepfakes and nudes and how AI is changing the conversation about this. Can you speak to that?
Speaker 2 (15:46):
Because it is so prevalent in creating these deepfakes, there's also the chance that victims just don't care about it, or are less likely to be affected in all the negative ways, because it's so prevalent. That was something interesting I hadn't thought of before. But AI has had a profound effect on IBSA. It means anyone's at risk. It can put anyone at risk of having fake, sexually explicit deepfakes created of them, and because these AI tools are so new and because tech in this space is moving so quickly, the laws are still catching up to be able to address this and ensure that victims have accessible routes to justice when they've been violated in this way.
Speaker 1 (16:29):
So what happens right now? Say somebody takes my image and creates a pornographic video of me using AI. What are the legal steps I could take right now? What would I do?
Speaker 2 (16:43):
Yeah. So I'll just say it's different if you're a minor versus if you're an adult. If you're an adult, it's currently not a federal crime to upload non-consensual, sexually explicit content to the internet, but there is a civil cause of action. So if you were to say this image or this video was taken of me and I didn't consent to this, or a deepfake was created of me and that's not actually me, most websites have a way you can report that as violating their terms of service. Then, really, the only remedy at that point is to follow up with the website and hope it's removed, or, in certain cases, file a civil lawsuit. But that's obviously not the best route for a lot of victims. It's not convenient, it's time consuming, it's expensive and it just isn't the most practical.
Speaker 1 (17:40):
And there's research that has to happen to figure out who you're suing in that civil lawsuit, so that's a whole challenge as well, correct?
Speaker 2 (17:49):
Right, absolutely. A lot of times images are taken and uploaded and explode and go everywhere, and you can't find, first, the person who created it, and then a lot of times the distribution is just so widespread it's hard to pinpoint anything.
Speaker 1 (18:05):
That's so scary for people to think about: I could have an AI-generated video uploaded to Pornhub, and I could try to contact them to tell them to take it down, that it's not me, and I can try to civilly sue them. But in the meantime my life is kind of ruined, tainted, because if people saw that, they would think it was possibly me, because AI technology is so realistic.
Speaker 2 (18:34):
Right, absolutely, and the harms are the same too. Victims of deepfakes experience mental health issues. They experience depression, PTSD, traumatization, high levels of anxiety, fear that people are going to see the video and recognize them. It's oftentimes the same as if they were the victim of an actual sexual crime. I mean, this is a sexual crime, I should say.
Speaker 1 (19:06):
Absolutely. Humiliating and traumatic for sure. I can totally see the PTSD happening when you're a victim of that. So what is being done at the federal level to pass a law federally for IBSA for adults? We'll get to the kid portion in a moment.
Speaker 2 (19:24):
Yeah, so there are actually a number of really great bills, and some of them have passed the Senate. One of them is called the Take It Down Act, and it criminalizes the uploading of IBSA. It makes that very explicit. It also requires platforms to have IBSA-specific reporting mechanisms in place, something narrower than just reporting that content is offensive; it's "I'm a victim of IBSA," and then it has requirements for the platform to take the content down swiftly. That's what's called the report-and-remove requirement of the Take It Down Act.

Then there's another bill called the DEFIANCE Act, and that creates a civil remedy for victims of a deepfake. That one explicitly applies to deepfakes, although the argument has been made that IBSA already includes deepfakes in its definition, but that one makes it really explicit that it applies to deepfakes. It also has privacy protections for victims in court during discovery, so their identities are more protected.

And the last one I'll mention is called the SHIELD Act. That also establishes federal criminal penalties for those who share explicit or private nude images without consent, so it fills in existing gaps, and it addresses sextortion scams as well. So there's kind of a patchwork of bills out there right now addressing deepfakes specifically, and this would apply across the board, to adults as well as children.
Speaker 1 (21:14):
Is there one of these... you mentioned the Take It Down Act first. Is that the one that's gaining the most traction? You said it had already passed the Senate, is that right? And how hopeful are we that it's going to pass the House?
Speaker 2 (21:27):
If I'm remembering correctly, I think the Take It Down Act hasn't garnered as much attention in the House. I can't speak to whether it's passed committee or not, but that one is the most prominent in the Senate, as it did pass there.
Speaker 1 (21:42):
I think this is just a reminder that these laws are so good, but it takes so much time to get everybody moving in the same direction. I assume there's bipartisan support for this?
Speaker 2 (21:58):
Yes, absolutely.
Speaker 1 (21:59):
And for anyone who is against this, are there reasons for that?
Speaker 2 (22:03):
Some offices just don't want to see anything that has to do with criminalizing. They don't want to support a bill that would put people in jail, regardless of the crime being committed. So it's more things like that. Also, in an election year, I think it's just very unpredictable with Congress to know what's going to happen, what's going to gain traction and what isn't. But one positive thing I will say is that 49 states have state laws that criminalize IBSA. Some even specifically mention deepfakes. So even though we don't have a specific federal bill addressing deepfakes, states have the ability to move very quickly to respond to these needs. You're not completely out of luck: if you're the victim of a deepfake, almost every single state has laws to address that and give a remedy of some sort.
Speaker 1 (23:04):
Okay, so it's really state law right now, and we're working on that federal law. So if you've been a victim, it's probably a call to your local non-emergency number, correct?
Speaker 2 (23:17):
Yeah, absolutely. I don't have the provisions of all the states in front of me, but I think the majority of them are criminal statutes, so it'd be criminalizing the person who created it, uploaded it, distributed it, that sort of thing.
Speaker 1 (23:34):
So criminally you could go the state law route, and civilly you could go the federal law route right now, and then hopefully we're going to have a federal law criminalizing it nationwide. Okay, great. Now, we talked about image-based sexual abuse, IBSA, and that was for anyone, adults really, but it's different when we're talking about child sexual abuse material. That is different because it involves a minor, so we're talking about anyone under 18 here. What is the federal law for that?
Speaker 2 (24:07):
The federal law is very clear, and federally it's still called child pornography. But we don't use that term, because pornography implies consent of some sort, and if you're a minor you legally can't consent. So we call it CSAM to more accurately describe it. Technically, in federal law it's still child pornography, and it's defined as any visual depiction of sexually explicit conduct involving a minor. Very simple. It includes photographs, videos, computer-generated images, deepfakes, and adapted or modified images, like taking someone's body and putting a different face on it, or taking someone's face and putting it on a different body. So that is the definition of CSAM. And what's criminalized is possession, production, distribution, receipt, and sharing across state lines, which includes posting on the internet, texting somebody a photo, things like that. So federal law for child sexual abuse material is quite black and white, I would say, and quite strong.
Speaker 1 (25:13):
So we've got a minor, say a 16-year-old kid, who takes their own nude photo and shares it with a boyfriend or girlfriend. What could they be charged with for sharing their own nude photo?
Speaker 2 (25:28):
Yeah, so legally that does still fall under the definition of child pornography, but you would have to have a prosecutor bringing that case against them, and that's not something I've seen in my work. If you want to be technical, that image does constitute child pornography, but as far as whether they're going to be criminally charged, I can't really speak to that with any certainty.
Speaker 1 (25:55):
Well, I think what we see is that law enforcement is overwhelmed with online crimes, so they have to take the most severe cases. When we've got a child missing or deceased because of an online crime, that's going to take precedence over this. And just like schools and parents, we're all overwhelmed with dealing with these digital crimes that we're seeing. Okay, so in this scenario, a kid shares a nude. Now the recipient of that AirDrops it or posts it. Now we've moved into a different conversation, correct?
Speaker 2 (26:40):
Yeah, so you've gone from just possession to distribution and receipt. That's also included under the statute.
Speaker 1 (26:51):
And I think we see those crimes get prosecuted more than a minor who shares their own, even though that could be the case. We see that less and less.
Speaker 2 (27:07):
Yeah, absolutely. Hopefully that wouldn't happen, but that draws more attention to it, and then more people can report it. Maybe an investigation gets opened, things like that, which are more likely to happen than if it's just sitting on someone's phone.
Speaker 1 (27:25):
Yeah, and then, as you even said, it's distribution across state lines if you post it on certain platforms.
Speaker 2 (27:32):
Yeah, that deals a little bit with what's called interstate commerce, but it's commonly recognized that posting on social media crosses state lines, because me in one state and a person in another state will both have access to that material.
Speaker 1 (27:51):
This has been so helpful. What else would you want to say to parents out there? We deal with a lot of nude photos over at Next Talk, a lot of them between peers who are figuring out that this is not okay to do, but sometimes it's a predator, it's a grooming situation. Are there any tips or just practical advice that you would like to pass on to parents?
Speaker 2 (28:20):
Obviously, just having open conversations with your kids about those harms. I think it never hurts to have the reminder that once you take a photo, it exists forever, even if you delete it, even if you send it on Snapchat to somebody. You don't know who else is in the room taking a photo of that phone or taking a video of it. So just be very mindful of the type of images you're sending. And not everyone who tries to friend you on social media is who they say they are, because a lot of the time what we see with sextortion schemes is, "Oh, I'm a friend from your friend's school, we've never met, but do you want to talk on Snapchat?" And then things escalate from there.
Speaker 1 (29:09):
I'm really glad you said that. We see that a lot with the sextortion cases we work, too, and predators have just gotten very smart about that. We talk a lot about online strangers with our kids, and so when these predators come in with "hey, I know so-and-so on your friends list," it automatically lowers their guard and makes it so easy for the predator to manipulate your kid, because the kid thinks, oh, this is not a stranger, this is somebody who knows so-and-so.
Speaker 2 (29:43):
Right, exactly, and they've unfortunately gotten really creative. They've learned the lingo that kids use, they've learned how to converse about the things kids talk about, things that are popular with them. So it's very easy for them to disguise themselves as a kid or a friend when they're actually not.
Speaker 1 (30:02):
You were saying that there's stuff on the dark web telling people how to do this, how to run a successful scheme.
Speaker 2 (30:09):
Yeah, it's disturbing, and I think we've seen too that sextortion is unique because a lot of times it happens very quickly. In some cases I've heard of, it's even overnight. They friend the person and then the extortion happens hours later, and I think that's one of the reasons it's so alarming and concerning: a lot of times it's not this slow grooming over time. Sometimes it is. But the sextortion handbook we saw on the dark web talked about that. It's like, what are you trying to get out of your victim? Are you trying to make them a long-term thing, or just get a couple hundred dollars really quickly if they have it? So, yeah, these predators are really smart, and they know how to use the platforms very well. And, circling back, I think that's why we really need KOSA. It's going to prevent adults from friending kids on platforms like that.
Speaker 1 (31:13):
That has been a shift I've seen in the last couple of years, actually the last three years: how fast the sextortion schemes are happening, just a couple of hours, and then we've got a deceased child, because there's been so much emotional blackmail and humiliation and these biological responses to what they're being shown on a screen. I can't imagine, as a young kid, being faced with that all alone in your room at night and feeling like your world is over, like everybody's going to see this video that was just recorded of me. So what I am super passionate about is making sure parents are aware that this is happening, because it is not just one case here and one case there.
Speaker 2 (32:02):
It's concerning and it's scary, and I think, too, kids need to know what to do in that situation. They need to know that their parent is a safe person they can go to, and not worry about what their peers are going to think if the extorter sends those photos out. It happens at a vulnerable age for a lot of kids, when they're already feeling insecure or self-conscious, or they care a lot about what their friends and peers think of them. I think that's natural, but they need to know that law enforcement can get involved and that there are ways to prevent that from happening.
Speaker 1 (32:46):
Yeah, and I think it's just so important, parents: tell your kids, even if you make a mistake on your phone, I'm here for you, I'm always your safe place. Don't keep sending, don't keep engaging. One of the things I always tell parents: tell your kids, don't ever send money. If somebody online is demanding money from you, red flag alert, you've got to tell your parents right away, because it's only going to get worse. Anything else that you would say? So to wrap this up: IBSA really concerns adults. There's no federal criminalization law in place. We're working on it with the Take It Down Act and some other acts, but there is civil recourse, and there are state laws in place in most states, is that right? And then CSAM, which is child sexual abuse material, that's anyone under 18, and there are good federal laws in place for that, for the possession and distribution of it. So our kids need to be aware that there are legal consequences for sharing somebody's nude photo.
Speaker 2 (34:01):
Yeah, a few things I'll just add. The IBSA laws, or just IBSA generally, I should say, are a bit broader, so IBSA encompasses CSAM. I think that's a good way to look at it. And then, like you said, CSAM applies to minors under 18. I would just add that there are civil remedies for CSAM. So even though it's criminalized, there are federal civil remedies for victims of CSAM as well.
Speaker 1 (34:32):
Okay, so criminalization and civil remedies for CSAM victims. Okay, this is great. Tori, I am so thankful for your wisdom and for breaking it down in real simple terms for us and answering our questions. You're welcome back at Next Talk anytime. It's been a pleasure to talk to you.

Speaker 2 (34:58):
Well, thank you so much for having me.