Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
The subject matter of this podcast will address difficult topics, including multiple forms of violence and identity-based discrimination and harassment.
We acknowledge that this content may be difficult and have listed specific content warnings in each episode description to help create a positive, safe experience for all listeners.
Speaker 2 (00:22):
In this country, 31 million crimes are reported every year.
That is one every second.
Out of that, every 24 minutes there is a murder.
Every five minutes there is a rape.
Every two to five minutes there is a sexual assault.
Every nine seconds in this country, a woman is assaulted by someone who told her that he loved her, by someone who told
(00:43):
her it was her fault, by someone who tries to tell the rest of us it's none of our business, and I am proud to stand here today with each of you to call that perpetrator a liar.
Speaker 1 (00:53):
Welcome to the Podcast on Crimes Against Women.
I'm Maria McMullin.
The National Center on Sexual Exploitation is the leading organization preventing sexual abuse and exploitation at mass scale by eliminating institutional practices and societal norms that perpetuate these harms.
According to a 2017 US study, one in eight participants had
(01:15):
been targets of the distribution, or threat of distribution, of sexually graphic images without their consent, with women significantly more likely to have been targets of this abuse compared to men.
Moreover, studies reveal that approximately 51% of perpetrators demand sexual photographs.
42% of them demand that the victim return to the relationship, and 26% demand that he and the victim meet in
(01:36):
person, usually for sex.
Additionally, the National Center on Sexual Exploitation learned that 73% of victims of IBSA (image-based sexual abuse) did not turn to anyone for help when they discovered that sexual images of themselves had been shared without their consent.
(01:59):
In another study, Northwestern University analyzed 462 accounts from teenage girls pertaining to their exchange of nude or semi-nude photographs.
Two-thirds reported struggling to decide whether, when, and to whom they should send photographs.
This episode will take a deep dive into sexual exploitation
(02:19):
through digital means and discuss what can be done to minimize and ultimately eradicate this rapidly expanding form of abuse.
Our guest for this discussion is Dani Pinter, who serves as Senior Legal Counsel for the National Center on Sexual Exploitation and its Law Center.
In this role, Dani serves as a voice for human dignity in precedent-setting legal cases on behalf of victims of sexual
(02:42):
abuse and exploitation.
She also drafts and consults on state and federal legislation to support victims of sexual exploitation and hold exploiters accountable.
Dani speaks regularly on a variety of exploitation topics, with a special focus on protecting youth in a digital age and legal remedies for survivors of exploitation and
(03:03):
abuse.
Dani Pinter originally joined the National Center on Sexual Exploitation Law Center at its inception in August of 2015.
She was instrumental in reinvigorating the Law Center and traveled the country, building relationships and raising awareness.
Notably, she drafted the first piece of legislation recognizing
(03:23):
the public health impacts of pornography.
Dani, welcome to the podcast.
Hi, thanks so much for having me.
It's good to be with you.
You've been working with the National Center on Sexual Exploitation Law Center since it began in 2015. Is that right?
That's right.
Can you tell us a little bit about the National Center on Sexual Exploitation, its history and the founding of the Law
(03:44):
Center?
Speaker 3 (03:45):
Sure, so our organization is actually over 60 years old.
It started, like I said, 60 years ago as Morality in Media.
It was focused on the harms of pornography, and it did that work for a while, then went kind of dormant for several years, and
(04:07):
under new leadership around 10 years ago it was picked up and kind of dusted off and changed a little bit.
The name was changed to the National Center and the focus shifted from pornography to the full spectrum of sexual exploitation, to include sex trafficking and all of those harms.
And the Law Center was founded in 2015 because the nonprofit was largely a public advocacy organization, an education
(04:30):
organization, and we wanted to add that legal aspect to the advocacy.
At the NCOSE Law Center specifically, where I'm senior legal counsel, we represent victims and survivors of sexual exploitation in civil cases to hold accountable those people that facilitated their abuse, or those corporations that facilitated their abuse.
So you know, our mission and what we believe in is that there
(04:54):
are major corporate offenders that are exacerbating harm, and when you can hold them accountable, that's how you get top-down societal change.
So if you think about Big Tobacco, the way we really helped people, and helped them even know the truth that it was unhealthy, was to go after Big Tobacco, because Big Tobacco was wreaking havoc.
So similarly, we go after major websites like Backpage, which was
(05:17):
facilitating sex trafficking.
We have a case against Pornhub, which was massively distributing all kinds of sexual abuse, including child sexual abuse, child pornography. We sue them.
So if you're a survivor who's experienced image-based sexual abuse and you want to hold accountable, you know, maybe there's a website that you really feel played an
(05:39):
unbelievable role and really exacerbated the harm.
You know, without that website facilitating it, you wouldn't have been harmed the way that you were.
Let us know, because we want to help you.
We want to help you hold those corporations accountable and make that change so this doesn't happen to anyone else.
But the Law Center also works on policy; it advises on and drafts public policy.
So those are our main focuses.
Speaker 1 (06:01):
That's outstanding.
Now, this is in DC, correct?
Correct.
Okay, and a special focus of your work is image-based sexual abuse, sometimes called IBSA, which is how we may refer to it in this show.
Help us define that.
Speaker 3 (06:16):
So image-based sexual abuse is exploitation involving images or pictures, videos, and, fundamentally, like all sexual abuse, it's a violation of a person's privacy and personal autonomy.
Speaker 1 (06:30):
So can you give us some examples of what abuses are image-based?
Speaker 3 (06:35):
Yeah, sure.
So that could be anything from, you know, a sexual assault that was filmed, or a secret recording, which could be one person recording the other during a sex act when that person doesn't know they're being recorded, or both people being recorded without realizing there's a hidden camera.
Or it could be a hidden camera placed in a private place like a bathroom.
(06:55):
It could be an image that someone consensually shares with another person but did not consent to having distributed to third parties or on the internet.
So all of those things can be a part of IBSA.
Speaker 1 (07:09):
How common is this type of offense?
Speaker 3 (07:12):
Well, it's extremely common and has exploded massively, even since the pandemic, when our lives became even more online.
But it's been around ever since you could take a picture of someone; there have been non-consensual images taken and shared.
But of course, the internet has exacerbated that, and
(07:32):
particularly because it's very difficult to hold online websites accountable the way you could maybe hold other distribution companies or places in the real, physical world accountable, the internet is harder.
It's harder to get things taken down, it's harder to hold perpetrators accountable.
So it's really exploded, and the more that our lives become
(07:54):
virtual, the more that this abuse is growing.
Speaker 1 (07:57):
So are there any types of image-based abuses that are more common?
I mean, you mentioned videos. There are photographs that can be put on websites. It can be used to blackmail people.
What are the most common tactics?
Speaker 3 (08:14):
You know, I want to maybe make a distinction between image-based sexual abuse that involves children and image-based sexual abuse that involves adults, because of course it does affect children and adults.
But when I talk about IBSA, I am usually talking about adults because, in my opinion and based on our laws, image-based sexual
(08:35):
abuse that involves a child is child pornography, so that's a much more severe crime, it's contraband, and so I want to keep that as the most severe form, and so I separate it.
So, setting aside the ways that children are exploited that way, which is growing and exploding as children are completely online these days, a common way is
(09:01):
people consensually sharing images that someone else, you know, their partner, then distributes to other people, or those images are taken because there was a hack or some other kind of privacy violation.
Somehow a consensual image is distributed.
That's really, really common.
But the use of hidden cameras and secret
(09:25):
recording is also unbelievably on the rise, especially because these hidden cameras can be almost undetectable now.
You know, I think we've seen the news stories about hotel rooms, Airbnbs, all this kind of stuff, but also just bad actors filming women, you know, without their knowledge, and
(09:50):
the truth is, unfortunately, there's a high demand for consuming this kind of voyeuristic pornography online, this whole concept of a woman or a man being filmed without knowing about it, so because of that, there's an incentive for people to do it.
Speaker 1 (10:04):
Unfortunately.
How is AI, artificial intelligence, really compounding this issue?
Speaker 3 (10:12):
Well, because most of these images and their distribution involve the internet, it's going to touch AI, right?
AI is going to be involved with any content that's on the internet today, and the truth is, AI is progressing so rapidly.
Anyone with a photo on the internet, so you have social media, or even there's a picture of you on someone else's social
(10:35):
media, you don't even have social media but, like, your mom has a picture of you and puts it on social media, somebody could take that image and create a sexual photo of you, a nude photo, even a video.
They could take your face and put it into a pornographic video.
It would look so real that no one would be able to detect that it's fake or that it's AI.
Speaker 1 (10:54):
That is so unbelievable to me.
I mean, we hear about it a lot.
It's happened to many celebrities; those are usually the ones that we hear about on the news, and that's a deepfake, right?
Right.
There are deepfake images and it's everywhere, and a lot of times I can't even trust what I'm seeing on the internet.
(11:14):
You know, just related to, it could be looking at houses or looking at, you know, interior design or something like that.
Like nothing looks real anymore.
Everything looks too perfect, or it can't possibly be a real house or a real room.
You know, it's just all so perfectly laid out, or even
(11:36):
architecturally impossible sometimes.
Like some of the things I see, I'm like, that's just not possible.
So it's making it much more challenging, right, too, because of AI.
So AI can do, I think, many wonderful and amazing things for us, but it can do so much harm as well.
(11:56):
What are the implications that would be associated with the criminalization of IBSA where AI is concerned?
Speaker 3 (12:05):
Well, I think we need to address it.
With AI, like the internet writ large, there are no current strong regulations or even remedies for victims.
There's no accountability and there's no responsibility.
So someone designing an AI model is not making it safe.
(12:25):
They're putting it out as open source code so any bad actor could use it to create bad material.
I think that is the wrong attitude.
I think it's really great that we have all the technology we have today, that we have the internet we have today.
Part of that is because of this sort of hands-off regulatory strategy we've had.
But I think it's time to re-examine that, because the
(12:47):
real-world consequences and harms of basically having a lawless internet are real and they're severe.
So I think it's time to put some balance in there.
You know, we can still have privacy, we can have innovation.
We may have to give some of it up or slow it down so that we can keep people safe.
So I think we need to do that.
Our laws are woefully behind when it comes to the
(13:09):
non-consensual sharing of images, even without AI.
But AI really isn't accounted for.
The good thing is, there are actually three bills before Congress right now that would address AI and non-consensual images.
Oh, can you tell us about those?
Yes, so the SHIELD Act, the DEFIANCE Act and the Take It Down Act all address non-consensual images, and I
(13:33):
think two of the three specifically address AI.
So the Take It Down Act is my favorite one because, as a lawyer, what I have always found frustrating about these IBSA issues is, you know, copyright is handled, okay?
There is no copyright issue on the internet.
LimeWire is gone.
(13:53):
People, we don't have a problem with, you know, videos or images that are copyrighted; they come down immediately, right? Like if you go on YouTube, you are not watching a Disney movie.
So that proves to me that, while it may be difficult, it is possible to control for some of these things.
Yet when we talk about someone's, you know, you have a
(14:25):
period where you're notified through this laid-out, already built-out process, same as you would with a DMCA.
That's copyright.
If there's a copyrighted image, you send a DMCA takedown.
They have, you know, a certain amount of time to take that down, and if they don't, there are going to be big penalties.
If they think it's allowed to be there and not a copyright violation, they can do a counter notice and then you can
(14:47):
litigate that.
So the Take It Down Act would provide a process for victims to get their content taken down, hopefully quickly, because the penalties would kick in if the websites don't comply quickly.
And it would also address AI non-consensual images.
So as long as the image is of an identifiable person, so it's AI but you can
(15:08):
tell it's me, or you can tell it's you, you would be able to take action.
You'd be able to say, hey, this is a non-consensual AI image of me, take this down.
It would also provide penalties against the websites, but also the perpetrators.
So I really like that bill, but the SHIELD Act also would.
We don't even have a federal law that makes non-consensual sharing of images illegal, so the SHIELD Act would make it
(15:29):
illegal.
So that's a big fundamental that we just need for sure.
Speaker 1 (15:33):
So how would a federal law like that work on the state level?
Speaker 3 (15:40):
Well, it's about prosecution.
So right now, not all states even have a law that makes non-consensual sharing of images illegal.
A lot of states, most states really, have something, but some don't, and it's a patchwork of standards, you know.
Some of them cover something, some are really, really narrow and not very workable.
So this would put prosecutorial authority in the Feds' hands.
(16:03):
So if you had an image that was not consensually shared, you could report that to the FBI, and, like, an assistant United States attorney would prosecute that case.
Honestly, in some ways you want this handled locally, and for some things having locals handle it would be better.
It's usually much faster.
But the problem with the internet is it usually involves a lot of
(16:24):
jurisdictions.
So I think having federal prosecutorial authority actually makes sense, because the federal prosecutors can cover all the jurisdictions where this image appears, whereas otherwise you'd have to go state by state, where maybe this image is being uploaded from or the perpetrators are residing, that kind of thing.
Speaker 1 (16:44):
So let's talk about some of those battles that you've had to fight with companies and others over taking down images.
Give us an idea of how the Center has, you know, approached companies, businesses or entities that perpetuate or exacerbate IBSA, and some of the outcomes.
Speaker 3 (17:04):
So we often help victims of sex trafficking or sexual exploitation try and get some of their images taken down.
We just do that because it's the right thing to do.
I wouldn't even say we're particularly adept at it, but we're at least trying, and our experience is, even when I send a legal letter to a website, most of the time it's ignored, even
(17:25):
if it is a big website.
So, for example, we've sent hundreds of letters and requests to Reddit for child exploitation material, and they just ignored us and never took any of it down.
And then a lot of times, for adults, we may get it off some of the bigger sites, but it's been shared on tons of smaller
(17:48):
foreign sites.
They won't respond.
It's hard to even figure out who really owns them.
It's very difficult to get content taken down.
I mean, the truth is, if you have abuse images online right now, hopefully things will change, but with the current legal landscape it's almost impossible, really, to get them all removed.
It's like a game of whack-a-mole.
Speaker 1 (18:09):
That just seems absurd because, to the point you made earlier, if it was a Disney movie that's copyrighted and somebody put it up on YouTube, you're going to get sued.
You know you can't do that, it's just an infringement of the copyright.
So I'm not sure how that's really any different from someone
(18:29):
putting up a photo of you or of me against our will, and we can't say that we want it taken down.
It just seems ludicrous.
Speaker 3 (18:41):
It is, because the violation of rights is the same.
A violation of our legal rights has occurred, at least as much as a copyright infringement is a violation of legal rights.
But the problem is there's a federal law called Section 230 of the Communications Decency Act.
It was passed in 1996, at the beginning of the internet, and
(19:01):
essentially it was actually a bill that was supposed to help protect kids online, but it was poorly written.
A lot of it got struck down, and all that is left of that bill is what was supposed to be the concession to tech companies so they would agree to the bill, which said, you know, a website can't be held liable for what third parties put on their site.
(19:24):
Now that sounds kind of common sense, but it's been interpreted by courts to be almost blanket immunity.
So anytime you try to sue a website like you would for copyright infringement, because copyright infringement is an exception, the court will just toss it out and say they're completely immune.
It does not matter if they knew about it, it doesn't matter if
(19:44):
they profited from it.
They are not responsible for third-party content.
So, for example, we had two 13-year-old boys who were extorted on Snapchat into providing sexually explicit images of themselves.
Those images later ended up on Twitter and were massively distributed.
The whole high school knew. One of the boys was suicidal, and he
(20:06):
had reached out to the platform and pleaded for them to take it down.
They actually finally responded to him and asked for his ID, meanwhile keeping this content live. They checked his ID, which showed he was a child, reviewed the material, which depicted a 13-year-old engaged in a sexual act, and said, we're not taking it down.
And so far the court has said, sorry, they're immune.
(20:29):
They're immune for that because that was a third party that uploaded it, even though he's a child, even though it was a child, and even though it's contraband.
So if the child possessed that, if he had it on his phone, he could go to jail.
But Twitter, which acknowledged possession of this child pornography, claims they are immune.
We're appealing that decision.
We'll be having oral argument in February at the Ninth Circuit.
(20:52):
But that's the current state of the law: these websites are immune for the most heinous activity.
And that law is what's responsible for sort of the landscape we have today, this landscape where we can't do anything about things online, because of that law and how it's been interpreted.
So you know, that's what we do at our law center, at the NCOSE Law Center. That's our goal, that's our
(21:13):
job, to get some legal reform in this area, because otherwise victims are powerless.
Speaker 1 (21:21):
For sure.
So let's go back just a little bit, because I want to try to understand from a legal perspective: when did IBSA become defined and codified to where it became a pursuable offense in the courts?
Speaker 3 (21:35):
So it isn't, really.
There was a movement about 10 or 15 years ago by the Cyber Civil Rights Initiative to pass what they called, then, revenge porn laws, because they were lawyers who had women having this issue and there was nothing on the books they could do about it.
It wasn't even a crime, and so these lawyers
(21:56):
got together pro bono with their clients and lobbied to have these state laws passed.
So that's why we have some laws.
It's usually called different things: revenge porn, or non-consensual sharing of images.
There are those state laws, but, like I said, it's really not a clear federal offense.
(22:16):
And for civil remedies, meaning you could sue someone for this, there are not clear civil remedies.
I think it was 2019 or 2020 that the Violence Against Women Act was amended to provide some civil remedy, so that was the first time.
So we're talking 2020-ish was the first time there was any kind of potentially federal civil remedy for women or men or
(22:39):
anybody who had, you know, this kind of abuse happen to them, but it's very, very limited in who you can bring this against, and it excludes websites.
You can't hold a website accountable.
So, this isn't really, this is an emerging legal issue, and that's why we have sort of a suite of bills before Congress right now to address and fill
(23:00):
these gaps.
And the Take It Down Act just passed unanimously through the Senate.
So you know, for all the listeners here, let your Congress people know that this is something you care about.
Let them know this is a priority, that you're dumbfounded and upset that you would have no remedy, that your children would have no remedy, your loved ones would have no remedy, if this
(23:20):
kind of abuse happened to you, and the truth is, it could happen to anyone, because we're all online.
Speaker 1 (23:25):
Yeah, absolutely.
And now let's back up just one more time, because you used the term revenge porn, so I want to try to get a definition for that, and what it means and what it does not mean.
Speaker 3 (23:36):
Sure.
So it's just a colloquial term that arose because there was a trend of boyfriends or husbands or male partners, you know, if they were angry with their spouse or girlfriend or they broke up with them, they would take explicit images and share them online as, like, a way to get revenge, or they would use it to blackmail them.
(23:59):
There was a rise of actual revenge porn websites, where they called it that. They called it revenge porn, and so it was an entire community of men sharing and consuming this content.
There were men who liked the fact that this was, you know, the girl next door whose husband's mad and is sharing this image against her will, and then there were men doing it, right.
(24:20):
So that's why that colloquial term rose to prominence, and in our sort of collective consciousness, we got to know what that is.
If I say revenge porn, most people could guess what I mean, but the limitation of that term is that then people don't understand image-based sexual abuse outside of that, and the truth is, image-based sexual abuse can
(24:43):
happen for a lot of reasons, and male or female, but mostly male, partners can share private photos for motivations that are not revenge-based, right.
They could do it not because they want the revenge.
They could do it because they have a fetish where they like sharing images of their sexual partner, or they're just a nasty person, or they want to make money.
There are lots of motivations outside revenge.
(25:04):
So that's why we would advise not using that term and moving to image-based sexual abuse, because it just covers more. It's more accurate.
Speaker 1 (25:12):
Yeah, I mean, that makes sense.
Thank you for giving us the background and context on that.
Now, the Center recently released a thorough guide on IBSA for practitioners and service providers in particular to use to identify, classify and define IBSA so that they can adequately address it in their respective disciplines.
(25:32):
Tell us about that guide and where it can be found.
Speaker 3 (25:35):
Sure, so you can find that on our website, which is endsexualexploitation.org, and the specific web address would be endsexualexploitation.org/issues/image-based-sexual-abuse, with a hyphen between each word of image-based sexual abuse.
But if you go to our website, image-based sexual abuse is a
(25:57):
main topic, so you can navigate yourself there.
What we do there is really try to compile the research and the testimony of individuals who've experienced this, to define terms, to discuss the context in which this happens, and to help people understand.
You know, because a lot of people will experience this and they're confused as to whether this is even wrong. Is this even
(26:18):
illegal?
Is this abusive?
They don't know how to feel.
So it helps everyone understand, you know, what is right and what is wrong in this context.
It's okay to draw a line and say, yeah, okay, I did share that image with you, but that doesn't mean you should post it online.
Because the problem is, I think a lot of people start blaming the victim and say, well, you shouldn't have shared that image with your partner and so you can't really complain now
(26:41):
that it's on Pornhub, right?
Speaker 2 (26:42):
And that's just wrong.
Speaker 3 (26:44):
You should be able to have a relationship and share images, and that doesn't give blanket consent for that image to now be put on the internet and even be monetized on a pornography website.
Speaker 1 (27:04):
Yeah, absolutely.
Now, we've talked a lot about the perpetrators, the people who are posting these images, right? Let's talk about the victims a little bit.
As you work with clients and practitioners, what have you found to be some of the myths and misconceptions that these victims encounter?
Speaker 3 (27:16):
I think the biggest one is that this doesn't really cause harm, when in fact it causes grievous, grievous harm.
There are a couple of examples I can give.
One is, you know, I am honored to serve survivors of sex trafficking or other abuse who are brave enough to, you know, want to hold others accountable and prevent this from happening
(27:38):
to others.
And they will say, you know, as traumatizing as the original abuse was, and as horrifying as that was, the images being circulated is worse, because I feel I can't move on from that.
I can't escape that.
That's out there for everyone to see and it won't end.
It's an abuse that just keeps going.
(28:00):
And another example I want to give, because I think this was really educational for me, is I was speaking with a woman who is being targeted by an AI IBSA campaign.
So somebody is taking images of her and creating pornographic content against her will.
She doesn't even know who these people are, but she was not
(28:23):
sexually assaulted.
These aren't real images of her in the sense that they're not taking images of her actually engaged in sexual activity; they're contrived, they're totally AI-generated.
And she said, in tears, it feels like I've been raped.
It feels like I've been violated, because these are everywhere and people are consuming them.
(28:44):
So that's what I think people don't realize: even if you didn't experience a sexual assault, having your images consumed by people getting sexual gratification out of that against your will is an unbelievable violation.
It's humiliating, and it's just very, very harmful to the individual.
Speaker 1 (29:03):
Oh yeah, I think that's harmful.
It's so disgusting, and I'm just trying to get my head around what's happening to this woman.
So this group of people who operate this AI are using her image and likeness and just putting it into pornographic images and videos.
Is that right?
Speaker 3 (29:23):
Yeah, it's like a stalking, harassment campaign.
So they're harassing her, and this is their method.
You know, stalkers and harassers stalk and harass women in an obsessive manner to destroy their lives, right?
And now they're using AI to do it, and they're just bombarding her with these images.
(29:45):
Like, she keeps getting some taken down, they keep appearing, and it's just this constant humiliation she can't escape, and we don't exactly know what their motivations are.
But, yeah, they're stalking and harassing her via AI image-based sexual abuse.
Speaker 1 (29:59):
So what's the objective here with these perpetrators?
Is it to humiliate, diminish, destroy, just have fun? All of that?
Speaker 3 (30:08):
All of those things.
It can be purely because they just want to make money.
It could be because they get gratification off the humiliation of the person.
It could be because they get a boost in their community; they're part of, like, a community, and sharing these things gives them clout, gives them credibility.
It could be to hurt the person.
(30:29):
I have seen each one of those things be at play.
These things are happening for all of those reasons.
Speaker 1 (30:38):
So, people who commit those types of offenses, how common is it for them to actually engage in physical violence or other gender-based crimes?
Speaker 3 (30:49):
So there's strong research to show that sort of low-level sex crimes have a connection to eventual, or unknown, very serious sex crimes, and a really current example of that, I think, is the Gisèle Pelicot case in France.
You probably know, and your viewers probably know, because
(31:09):
it has rocked the world.
But Gisèle Pelicot, you know, was a 65-year-old woman who recently discovered that her husband was drugging her and then filming strangers coming to the house and raping her, and this included over 70 men in her community.
55 of them are being prosecuted, and her husband fully admitted
(31:30):
it.
But her husband was actually caught, which I think is a detail most people miss.
Her husband was caught because he was doing upskirting videos in a grocery store.
So he was in a grocery store trying to snap pictures of women underneath their clothing.
So that's socially deviant behavior, right, that's antisocial behavior, and so that's a huge red flag.
(31:53):
There is usually a connection: if somebody is engaging in that kind of dehumanizing, antisocial behavior, it's probably not an isolated incident, and especially the more brazen it is, that's usually a sign that this has escalated to quite a degree, because they're now not even afraid of getting caught, right?
Doing it so brazenly in public is usually a good
(32:14):
indication that they've done it before and they're going to do it again.
And doing it and getting away with it is a motivation and a reward for that behavior that will often cause men to continue doing it and escalating that behavior.
Speaker 1 (32:27):
So the men who've been raping her are being prosecuted?
Is he being prosecuted as well?
Speaker 3 (32:31):
He is.
He's being prosecuted, and he fully admitted it, and that's now wrapped up and going, I think, to sentencing at this point.
What's alarming, the sad thing is, in France the most he can even get is 20 years.
This went on for 10 years.
(32:52):
She was raped by over 70 men and got multiple STDs, and the most time he could possibly get is 20 years, which is just disturbing. But he admitted it.
And what's crazy is all the other men, the 55 other men being prosecuted, were all trying to make excuses: I didn't know she was drugged, I thought this was a sex game, I thought she liked it, all this stuff, even though,
(33:12):
ironically, the main perpetrator, the husband, was saying, oh no, they knew. Like, I fully told them the whole story, they knew she was drugged and she was unaware.
And the other scary thing is, these were not a list of 55 sex predators in the neighborhood.
They were all, like, very normal men.
There was a fireman, a local baker that actually knew her in her neighborhood, a nurse.
(33:33):
You know, these were your average everyday men engaging in this wildly antisocial behavior.
So that, to me, is such an indication of a bigger societal problem.
Speaker 1 (33:48):
Absolutely.
I mean, she's so courageous to even come out and talk about this and take it as far as she has, and it's something I think could possibly be the tipping point for the rest of the world in really understanding that this stuff happens and the scale is massive.
Right, that's exactly right.
(34:11):
Now, the Center has been on the forefront of policy and legislation when it comes to combating IBSA, and particularly in the fight for corporate accountability.
Have you been able to make any strides in this space?
Speaker 3 (34:26):
Yes, so we have.
We have a corporate advocacy department where we specifically try and reach out to corporations that we think, either knowingly or unknowingly, are facilitating sexual exploitation.
So you know, we've sat down with, or sent letters to,
(34:49):
Snapchat, Google, Apple, and identified the issues, and you know, I think sometimes they really don't know what's going on, but then sometimes they do, and I think, for the reasons I explained earlier, you know, they know they can't be sued, or they feel they can't be sued.
So at some point there's an equation that happens where they say, well, we could fix this, but it will cost us money and lose us engagement, whereas if we
(35:12):
don't fix it, nothing will happen.
And every time, then, the ultimate decision is to not do the right thing.
We do end up having success, but it just takes an enormous amount of pressure.
So, for example, when Google was providing tons of laptops and things to schools for free, which is great, we
(35:33):
were like, can you please have the safety controls for these kids just defaulted on? Because these schools, especially underprivileged schools, often really need them.
They don't necessarily have a full-time IT staff.
They don't know how to implement that stuff to protect these kids.
You're basically putting in their hands, you know, a really dangerous device that adults could use to harm them.
(35:55):
They could be exposed to harmful material, and they wouldn't voluntarily do it.
It's just crazy, you know.
But ultimately, if we do a big enough campaign, we get traction and there's a public outcry, they will.
Or if they feel there's going to be legislative change they don't like, then they'll voluntarily make a change.
So we helped, with many other advocates, arrange the January
(36:18):
hearings earlier this year where the big CEOs of the tech companies came to Congress, and we provided a lot of material to Congress to ask them these questions.
Like, for example, you know, I think I remember Josh Hawley asking Facebook, or maybe, I can't remember who it was, I think it was Ted Cruz, actually: you have this warning, what you're looking for
(36:38):
may depict child sexual abuse. Do you want to see the image anyway? And then you allow them to see it?
It's like, why?
So you have an algorithm picking up that this might be child sexual abuse?
Why are you not just deleting that image and not allowing the person to view it?
Why are you allowing them to take that step of viewing it anyway?
So, asking these tough questions.
And so they have made some changes.
(37:00):
Finally, Snapchat's turned some default things on automatically.
Instagram has made some changes, like, for example, simple things.
We said, Instagram, please don't let strange adults be able to direct message minors that they don't have a connection with.
Like, it's not a parent. You know, they could tell, right? Parent, same last name.
Or, you know, they're already following each other.
(37:21):
This is an adult they're not following. Why should they be able to DM a minor? Just disable that.
And it took forever, you know, but now they do do that by default.
So we do get these incremental improvements, but that's where, you know, the advocacy comes in. In our view, you have to have the public advocacy.
You have to engage the grassroots, you need to get
(37:42):
parents saying what they care about, and then you also need to push for legislative change or legal recourse: sue.
Because without pressure, and without a feeling that this will affect their bottom line, they won't do it, and truly, reputational harm isn't enough.
They will not.
They have proven that reputational harm is not enough
(38:03):
to motivate them to do the right thing.
Speaker 1 (38:06):
Yeah, I mean, that's pretty obvious, right, because there's so much of the wrong thing that's actually still occurring.
I'm guessing that people listening have a lot more questions, and I may not know what all of their questions are, but where could we send them for advice, guidance, resources, additional information about this type of abuse and what they
(38:28):
can do about it?
Speaker 3 (38:32):
So come to our website, endsexualexploitation.org, and you can find our legal page too, where you can send a direct message.
If you want to speak with a lawyer, we have a form there you can fill out and we'll reach out to you.
And even just our public inquiries email, public@ncose.com.
(38:52):
We always get back.
I mean, it might not be right away, but we'll always get back to you.
Speaker 1 (39:02):
So, in addition to that, what recommendations would you give to practitioners, service providers, or just people who are concerned, not victims, but people who are concerned about this issue, to help them get involved as well as advocate for or protect victims?
Speaker 3 (39:11):
So let your Congress people know you care about this.
I really believe at this point we need the laws to change.
I don't think public pressure is enough, although we won't stop doing that.
So let your Congress people know that you care about legal remedies for image-based sexual abuse.
You care that prosecutors could actually do something about
(39:35):
this.
At this time, if you specifically want to support these bills, the Take It Down Act, the DEFIANCE Act, the SHIELD Act, please do.
We have information about all of those on our website.
We also provide actions on our website.
So we have a whole page where, you know, we can help you quickly send a message, either to, like, the CEO of a corporation, or
(39:58):
sign a petition that we're going to send to a congressperson, those kinds of things.
So we do have actions that you can quickly take, but the priority, I think, right now is letting Congress know this is important.
Speaker 1 (40:08):
OK, before I let you go, give us your website just one more time so people can take that down and make note of it.
Speaker 3 (40:14):
Sure, it's endsexualexploitation.org.
Dani, thank you so much for talking with me today.
Thank you so much for having me, and thank you for caring about this important issue and talking about it.
Speaker 1 (40:26):
Absolutely.
Thanks so much for listening.
Until next time, stay safe.