Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:15):
Hey, welcome to Tech Stuff. Oz is out this week,
so instead of a weekend tech episode, I wanted to
bring in an expert to talk about the absolute hellscape
unfolding on Elon Musk's social platform X. Heads up, we're
going to be covering a sensitive topic, so if there
are children with you, you might want to listen to
this at another time. Since late December, Elon Musk's AI
(00:37):
model Grok, which is built into X, has been generating non-consensual sexual imagery when prompted. Now you may be thinking,
I've heard of this happening before, and you'd be right.
Congress even passed a law to curb the creation and
distribution of sexually explicit deepfakes. So why is it still happening? 404 Media's Samantha Cole has been
(00:58):
following the adult industry and online culture of sex for years. She recently wrote an article titled Grok's AI Sexual Abuse Didn't Come Out of Nowhere, and while it's disturbing, we
are thrilled to have her here on the podcast. So welcome, Sam.
Thank you so much for joining us. Thrilled to be
here to disturb. It is the thrilled disturbance that we
(01:19):
are living in. Sam, can you talk a little bit about the early days of the Grok scandal? Like, how
did this start?
Speaker 2 (01:25):
Grok has been doing this for a while, like you said,
like the prompts that are like, make her blank, make her wear a skimpy bikini, make her wear a see-through shirt, make her bend over and do such and such, have
been a thing for a while, and I think the
escalation in the last couple of weeks has been I mean,
(01:48):
it went very viral and became very popular with a
lot more users on the platform, and it started generating what other outlets have reported is actual child sexual abuse material, AI generated, which is, you know, non-consensual intimate imagery. And now there are reports of it creating images of children using AI. So not real kids, that we know of,
(02:11):
but still illegal in a lot of places and obviously
still extremely harmful content to be on one of the
biggest and most mainstream platforms that we have right now.
Speaker 1 (02:20):
So how does this compare to other, like, commercially available AI models? How is it different?
Speaker 2 (02:25):
So Grok was made by Elon Musk as this alternative to OpenAI's ChatGPT, or even Claude, or some of the other big popular tech giant chatbots that also create images. And he made it because he wanted a quote unquote based, unrestricted, air quotes, free speech chatbot
(02:51):
that would not have a bunch of guardrails that these
others have. And these others have guardrails such as they
won't generate sexual imagery in many of these cases, and they definitely won't generate images of children in sexual scenarios. So
it wasn't his explicit intention when he first made this chatbot,
(03:12):
or at least from what he said, what we know of publicly, to create a CSAM generator, a child sexual abuse imagery generator.
But he wanted something that was going to be like
quote unquote, non woke, no censorship. You know, it had
the cool little like rodent that would say a curse
word, and like the anime waifu girl who would,
(03:33):
you know, sext with you while you're in your Tesla, stuff like that. So I think that's what sets
it apart in a lot of ways. And it's also
it's being run, like you said, on a platform natively
where people are also just posting normal stuff, posting news articles,
posting like their thoughts for the day, posting a joke,
a meme, and then alongside that you have images of
(03:54):
women being undressed and, you know, nudified, which is what a lot of these apps that do it professionally call it, nudification, or undressing, alongside just everyday life.
Speaker 1 (04:06):
So unfortunately, creating explicit, non-consensual imagery of women and
girls is not something new. Can you give our listeners
a sort of brief rundown of when this started and
when we started to see this online?
Speaker 2 (04:23):
Yeah, So the thing that pretty much struck me at
first when I first saw this happening with Grok and everyone talking about sexual abuse imagery on X, formerly known
as Twitter, is that this has been happening on Twitter
when it used to be called Twitter for a very
long time. It's something that lots of people who are
(04:44):
in the space of preventing sexual abuse imagery harms have
been talking about for a long time is that Twitter
was and still is full of abusive imagery of women,
real stuff like photos and videos that women don't want
on the Internet. We used to call it revenge porn. It's non-consensual content that should not be spread like this,
(05:05):
especially on a very mainstream platform. So that's been an
issue for years and years and years. I would say
it's kind of something baked into Twitter at this point
where they didn't solve that problem before Elon Musk showed up,
and they're definitely not solving it now after he showed
up and fired thousands of moderators. So that's the context
that we're in as far as the real stuff, like
(05:26):
the actual recorded images and videos of women who don't
want their images out there. And then to add to this,
we have generative AI and apps, which we've covered a bunch at 404, that are being advertised in lots of spaces on, you know, TikTok and Instagram, and are very accessible to lots of people, including
(05:48):
teenagers that say, would you like to see that girl
from the gym nude? Do you want to make a chatbot out of her where she can never say no to you? Bring that over to X, where people are, just like I said, having normal conversations. Or, not anymore, nobody has a normal conversation on X anymore.
But you know, people were using it like they would
(06:11):
any other social platform. And then at the same time
you have built into it a non-consensual imagery generator,
this factory that is an escalation of what we're already
seeing with the standalone apps that you can download and
then swap someone's face into someone else's body, put them
in any situation you want.
Speaker 1 (06:29):
Can you talk a little bit about Gamergate and just like the rise of deepfakes in 2017, 2018, you know, where people were not actually using generative AI, but they were using Photoshop? Yeah.
Speaker 2 (06:39):
I mean, that's another thing that this entire situation has reminded me of: the way that Gamergate, which was
a harassment campaign primarily focused on women in gaming and
then kind of spread out to just be women on
the Internet in general. Very misogynistic harassment campaign. It originated
on places like 4chan and on these forums
(07:02):
that were known to be toxic and full of incels,
but it eventually migrated and got the big popularity that
it eventually got and had the big impact on people's
actual lives because all of that content landed on Twitter,
and that's where you see people actually, you know, reporting
(07:24):
on it in a serious way, you know, people meaning
like the media taking it seriously. Because now it's on
this mainstream platform, it's getting so much more exposure from
lots of people. It's radicalizing some people. Being this harassment campaign that it is, a lot of it relied on shame and sexual shame, so that definitely felt like a
(07:45):
parallel to me. It's like we had these sort of
like almost but not quite underground communities for lack of
a better word, creating content, creating harassment campaigns away from
the mainstream that eventually make their way to a place
like Twitter, or literally Twitter, and in this case X,
where it has this explosive effect and it has this
(08:07):
farther reach on people's lives, because, you know, you're searching Sam Cole on Google, it might be my Twitter that comes up,
and then you click on my Twitter, it might be
like tons of replies from people talking about my body
and harassing me in that way. And this is something
that like lots of women have to deal with on
the Internet today. So that's just kind of a hypothetical example,
(08:27):
but it was actually happening to lots of women during
the time that we're talking about, 2016, 2017, 2018. It feels very much like a pattern, like a formula at
this point.
Speaker 1 (08:37):
So you talk about Twitter sort of being the for
lack of a better word, the dumping ground for this
kind of content and you write about how the National
Center for Missing and Exploited Children has consistently ranked Twitter,
and, you know, subsequently X, as quote one of the
leading hosts of child sexual abuse material every year for
the last seven years. What do you think it is
(08:59):
about Twitter and X as a platform that has allowed
for this type of content to proliferate for so long
before Elon bought it.
Speaker 2 (09:09):
I would have said moderation and the unwillingness and also
like the hesitancy to overmoderate or the fear of being
seen as like stamping out free speech and letting people
just harass other people on your platform endlessly. Twitter had
a Nazi problem before Elon took it over, and now
we have Elon, who is a white supremacist sympathizer at best,
(09:33):
owning the platform that had a Nazi problem and is
seemingly not at all interested in creating a healthy and
productive social media platform. It's mostly just an engagement and
rage bait and outrage bait farm, and part of that
is the non consensual stuff. So you had an existing problem,
(09:56):
it's just ten times worse now because Elon is, you know, in "fuck it" mode and doesn't care what happens to the
people that are using his platform other than using it
as a way to talk directly to people who are
in power at this point. So I would say that's
what makes it uniquely fraught, and also just the influence
that it has on everyday life. I mean, tweets were
(10:17):
on CNN and still are. It's very much a place that people thought of as, like, the town square. Hate that phrase, but that's kind of how people saw Twitter: you know, things were getting hashed out on Twitter that were representative of real life. That was the attitude towards Twitter for a really long time. That's part of why Elon bought it. He wanted that sort of influence. He
thought that he could make it the everything app where
(10:40):
everything that you wanted to do in life happened on
X. So I think that effect where it's like it
is somewhere where everyone is or used to be. Now
a lot of people are not, but the fact that
it was this place where everyone was gathering, everyone's congregating. You get jobs off of being on Twitter, especially as a writer or creative. You might get discovered off of Twitter, and.
Speaker 1 (11:03):
So many people got hired off of Twitter. Yeah.
And comedy writers.
Speaker 2 (11:06):
Comedy writers, yeah, it's like where you like go to
stand out and end up working at you know, like
BuzzFeed News or something.
Speaker 1 (11:13):
It was huge for that.
Speaker 2 (11:14):
So yeah, I know that I felt like I had
to have a Twitter presence because I was a journalist
in that era.
Speaker 1 (11:21):
Do you feel that way anymore?
Speaker 2 (11:22):
No, not at all. I mean I am on Twitter.
I'm on Twitter now because I'm reporting on Twitter, but
I'm not using it in the way that I did
then at all. Thank God I don't have to be.
I'm not required to be.
Speaker 1 (11:34):
But yeah, so you mentioned that the child sexual abuse
material is being created, but we don't currently think the
children depicted are real. You know, what would you say
to someone who says that this is a victimless crime.
Speaker 2 (11:49):
So, just speaking about the child sexual abuse material stuff: we know from experts who work in
fields where they're trying to find victims, who are trying
to find perpetrators of sexual abuse material, that generative AI
has made all of their jobs so much harder, and
(12:11):
their job was already so incredibly hard. That's aside from the generative AI stuff often being trained on real children, and in some cases, we've found, trained on sexual abuse material. So there is actually a victim behind this stuff.
It's used to groom children in cases where a perpetrator
(12:31):
might send an AI generated image to a child and say, look,
this kid did this. Can you do this? And it's
like you don't even need to have an existing victim
to create more victims. It's just something you can generate
up with AI now. It also just makes it
harder to find the real stuff and the real victims
(12:53):
who are being abused in real life because it looks
real and it's hard to tell the difference. So investigators are spending time analyzing AI to find out whether it's real or whether it's fake, and in the way of finding actual kids, it's wasting their time. So it's just
it's polluting their occupation in a way that is unfathomable
(13:15):
considering what they have to contend with every day already.
Speaker 1 (13:18):
Has X tried to address this issue at all?
Speaker 2 (13:23):
I mean, I think there have been gestures at condoning, or condemning, sorry, Freudian slip, at making this problem go away. But I think anything that they try to do at this point, short of turning off Grok entirely and making the image generation not something that people can access
(13:46):
anymore, is not enough. They're now being investigated by the California Attorney General; Rob Bonta just announced that he's gonna investigate Grok and X. I mean, I think it's just
like the cat is out of the bag for them.
This is a problem that they are going to end up having to answer for, I think. I hope.
Speaker 1 (14:09):
I don't know.
Speaker 2 (14:10):
I mean, who knows, you know. It's like, Elon's slippery like that, so who knows if he'll get away with it. But it's definitely caught the eye of investigators.
It's caught the eye of a lot of these groups
that are working to prevent harms like this online. I
think it's definitely crossed a line for a lot
of people who maybe saw X as toxic and not
a place they want to be, and now it's like, oh,
(14:31):
this is actively creating really really bad material at scale,
and that's a difference. That's a line to be crossed
for them.
Speaker 1 (14:52):
After the break: is AI-generated non-consensual imagery the new normal? Stay with us. You know, it's obvious that
(15:17):
not everyone on X is generating and posting these explicit images.
Some people just use X the way that they use Twitter,
but I'm curious what effect you think these images have
on the broader X community.
Speaker 2 (15:31):
Yeah, I mean, I think that's a good question and
something that we should actually be thinking about more: the effect that it has on people who are using X, who are probably used to this anyway, and then also people who are not using it. And it definitely normalizes it in a big way. I'm sure it
emboldens the people who are making apps that are for
this purpose, that are monetizing deepfakes and non-consensual
(15:55):
pornography at scale. I'm sure that they see this as like, Okay,
hell yeah, he's not facing any repercussions, so we're gonna
go full tilt at it, right, And I think that's
a worst case scenario, and I'm sure it's happening. But
then it's also like, it just normalizes this content,
(16:17):
which at the heart of it is about non-consent. It's about using women's bodies in ways that they don't want, it's about ignoring bodily autonomy. It normalizes all
of that to other people on the platform who might
be like, oh, well, if that guy's doing it, you.
Speaker 1 (16:34):
Know, maybe I should do it at the gym.
Speaker 2 (16:36):
Yeah, maybe I should take a picture of the girl
at the gym and try to see what Grok can
do with.
Speaker 1 (16:40):
It, which is just.
Speaker 2 (16:45):
Like, it's shocking. I think that's probably the most shocking thing about it: it's no longer like, I'm using this app that pretends to be something else but is secretly a porn app that makes face-swap images of girls in my class. It's, we're doing this in the open on Twitter, on X, on like one of the biggest social media platforms
(17:05):
out there, and nothing bad is happening to me for it,
and the platform says it's fine. I think, especially when
there's no one else in your life saying it's not fine, you know, it's just kind of creating the snowball effect of people who are interested in it wanting to try it.
Speaker 1 (17:22):
So what are the numbers in terms of, like, proliferation of this? Like, what are we looking at in terms of how much of this is online?
Speaker 2 (17:30):
Anyone can search on X right now, you know, "@grok" and then like, make her do this and that, or like some version of the prompt that they're doing, and it's just multiple times a minute, every minute. People
are making a new one of those. I think conservatively
it's in the like hundreds of thousands of images all
together in the last week or so.
Speaker 1 (17:51):
So crazy, I.
Speaker 2 (17:52):
Mean, just based on the every time I check on it,
and it's just like a running You can't keep up
with the feed of people trying to make her do
this and that, make her where clear date bikini or
whatever it is.
Speaker 1 (18:03):
This is obviously a huge problem. Are other countries responding differently than the United States?
Speaker 2 (18:10):
Yeah, so I think the UK was talking about banning
X entirely in the country.
Speaker 1 (18:19):
On account of this deepfake pornography and Grok. Yeah.
Speaker 2 (18:23):
I think Ofcom was getting involved and saying that it could happen if they don't figure it out and turn it off and somehow make it get better. Which is interesting, because they passed the Online Safety Act a while back, and it's requiring a lot more censorship and a lot more of the age verification stuff that we're seeing here in the States, where they're requiring a lot of sites
(18:45):
and social media sites to verify ages before you can
use them because they want to kind of push adult
content in that direction. And now they're kind of teasing
the idea of banning X over something that is so
blatantly criminal and so, so harmful. It's like, you know, where's the line for you guys? When are we
(19:06):
gonna hit the button? But I don't know, I'm sure it's more complicated than that.
Speaker 1 (19:12):
And is there anything that like the general public can.
Speaker 2 (19:14):
Do? Stay off X, if you don't want to see some bad stuff. I would say that's a big one.
Maybe call your reps if this is an issue you
care about. It's definitely illegal in a lot of states to create, or at least to generate and spread and disseminate, AI child sexual abuse material. So, you know,
(19:36):
what are those states going to do about it is kind of my question.
Speaker 1 (19:39):
How does the Take It Down Act play into this?
If at all?
Speaker 2 (19:42):
The Take It Down Act affords victims more recourse. So if you're in a deepfake, a sexually explicit deepfake, and you want that taken down, you can message the websites and say, I need this removed.
This is non-consensual material, and those websites need to
(20:04):
comply with those sorts of requests. And that's a federal law.
So if someone's making images of you that you don't want to be happening on X, under that law, X should not be hosting them. I mean, it should require
platforms to take some action when someone reaches out and says, hey,
this is bad stuff, I don't want it on here.
That's different than like actually putting the onus on the
(20:27):
platforms themselves.
Speaker 1 (20:28):
Right. So, the Pentagon actually announced that Grok will be used in their network. This is kind of related to what we've been talking about. But like, what will it actually take for the US government to regulate X and Grok? Will the US government regulate X and Grok?
Speaker 2 (20:49):
I mean, when I saw that, I was like,
this is so perfect for the way that things are now. Yeah,
I don't know what it'll take anymore, especially since they're all in bed together. I really don't know. It feels
like we're in this extremely weird gray zone as far
as what is and isn't law, and like what is
(21:13):
I mean, what is law?
Speaker 1 (21:13):
Is what is law?
Speaker 2 (21:14):
But like, is there going to be any kind of action taken to abide by the law
when it comes to these sites? And especially now that
the Pentagon is like, hey, we love Grok and we're going to use it. Just unreal to announce that right
in the middle of this huge scandal.
Speaker 1 (21:32):
Is there anything about this particular scandal that we didn't
cover that you are concerned about.
Speaker 2 (21:38):
It's something that's kind of in the back of my mind,
but I'm not even sure how to articulate: it's how striking it is that there is such a demand for this. That there are so many people who really want to make this kind of content kind of tells me that
it's already normalized and it's already happening, and this is
just the next step up from that. So you know,
behind all these prompts, a lot of them are
probably bots, a lot of them are probably people. And
(22:00):
we know for a fact that this sort of thing, making images of people in intimate situations when they do not want that, is a very popular industry. And
I think that's something that we need to take some
action on, do some educating on, intervene earlier in especially
young men's lives, when they start going down these routes.
(22:21):
I think it's just, it's going to get worse and worse as it gets more and more industrialized and monetized and productized in the way that we've seen with Grok.
Speaker 1 (22:29):
Have you talked to victims of this content and like,
if so, what have you heard from them? Yeah? I have.
Speaker 2 (22:34):
It's always hard to really grasp what it feels like to be in a deepfake, and especially a sexual deepfake, until it happens to you. So
think it is important to keep talking to victims of
this because they have felt it firsthand and they
(22:55):
can really describe it. But what they feel is, it's shocking, it stings, it follows you throughout your day, your week, whatever it is, your life. Some of the women that I have talked to who have been targets of deepfake porn
talk about how they are afraid to go out in
public because their face is on this stuff and they
didn't want it to be. And it's like they have
(23:17):
nothing against like the porn industry or adult content or
sex work, but they didn't ask to be in this,
and it's something that changes your life and you have
to be ready for that, and they're afraid to apply
for jobs, they're afraid to date in some of the
extreme cases. It's just something that really latches onto your
(23:38):
name and your image in a way that is hard
to imagine if it hasn't happened to you, And most
of them just want it to stop. They just want
the imagery to stop being spread. They want people to
stop putting their name on this stuff and tagging them
and replying to them on social media with it. They
care less sometimes about the person who's making it and
more that they just stop doing it.
Speaker 1 (23:57):
Right, right. Any sense that this is something that will slow down?
Speaker 2 (24:03):
No, no, no. I mean, the way that it's going now? No, no way. I think people's interest will wane in a way, you know; it's like, this particular news cycle and, like, hype cycle of Grok making this abuse imagery will slow down, and as a result, people will make less of those
(24:23):
images on Grok, I'm sure. But all of this has just created more and more of a wildfire for the bigger issue of, yeah, AI-generated abuse material. So no. I don't think so. I've been waiting for it to slow down for six years.
Speaker 1 (24:41):
It just seems like there are new tools. It doesn't
seem like there's less content.
Speaker 2 (24:46):
Yeah, yeah, for sure.
Speaker 1 (24:51):
I want to say thank you, Sam. Are we disturbed? I don't, I don't want to say thank you, but I appreciate your time and care in covering the story, and for taking the time.
Speaker 2 (25:02):
No, thank you for covering this. Thank you for shedding
light on this for sure.
Speaker 1 (25:05):
Thank you. That's it for this week for Tech Stuff.
(25:27):
I'm Cara Price. This episode was produced by Eliza Dennis
and Melissa Slaughter. It was executive produced by me, Oz Woloshyn,
Julian Nutter, and Kate Osborne for Kaleidoscope and Katrina Norvell
for iHeart Podcasts. Jack Insley mixed this episode and Kyle
Murdoch wrote our theme song. Please rate, review, and reach
out to us at tech Stuff podcast at gmail dot com.
(25:48):
We want to hear from you.