
June 7, 2023 48 mins

When it comes to what’s next in technology and the internet, we’d all benefit from listening to sex workers. Dr. Olivia Snow, dominatrix and researcher at UCLA’s Center for Critical Internet Inquiry, explains why people who engage in sex work have been right about the internet all along. 

 

‘Magic Avatar’ App Lensa Generated Nudes From My Childhood Photos: https://www.wired.com/story/lensa-artificial-intelligence-csem/

Are You Ready to Be Surveilled Like a Sex Worker? https://www.wired.com/story/roe-abortion-sex-worker-policy/

 

WANT TO SUPPORT THE SHOW AND GET AD FREE CONTENT? SUBSCRIBE ON PATREON AT PATREON.COM/TANGOTI 

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
If you want to support There Are No Girls on
the Internet, please check out our Patreon. There you can
get ad-free bonus content. Just go to patreon dot
com slash tangoti, and thanks so much. As someone who
thinks about the intersection of gender, sex, and technology, where
do you think we are headed?

Speaker 2 (00:17):
To hell. There Are No Girls
on the Internet.

Speaker 1 (00:25):
I'm Bridget Todd, and this is There Are No Girls on the Internet,
a production of iHeartRadio and Unbossed Creative.
People who engage in sex work face hostility online, but
they're also the ones who remain at the forefront of
understanding technology, the Internet, and the role that it plays

(00:46):
in all of our lives. From increased online surveillance to
legislation like SESTA/FOSTA, it's people who engage in sex
work who are often sounding the alarm about how online
harms might start by targeting one community, like sex workers,
but will later manifest for all of us. So in
conversations about the future of the Internet and technology, it's

(01:07):
critical that all marginalized voices, including sex workers, are included.

Speaker 3 (01:12):
My name is Olivia Snow. I am a dominatrix and
a research fellow at UCLA's Center for Critical Internet Inquiry.

Speaker 1 (01:19):
Olivia studies sex work, technology, and policy, and writes about
it on the Internet. She's been doxxed and harassed online
for it too, but that hasn't stopped her from centering
sex workers in conversations about the Internet and technology.

Speaker 1 (01:33):
So something that I love about your work is that you really lean into
and interrogate this intersection of sex work and technology and
the Internet. What has that been like? Why do
you think it is so important for people who care
about the Internet and technology to understand how sex workers
show up on the Internet, how sex workers use technology,

(01:55):
and how has that been sort of prescient about where
we're headed next? I feel like sex workers know what
the fuck is happening on the Internet and where we're going;
they are constantly calling stuff correctly. How is that?

Speaker 3 (02:12):
I remember I was on the subway home from work
at, like, three in the morning, and I was going through, I
think, Facebook, and in my People You May Know, all
of my coworkers were on there. And we didn't
know each other's real names. Like, I don't
want to know any other sex worker's real name,
especially in, like, a double session. You know, I
don't want to be like, Mistress Megan or whatever, like,

(02:34):
her name's, like, Emerald or something. So I was like,
oh my god, Facebook is doxxing me, like, doxxing
my coworkers. I mean, I guess not doxxing, because doxxing
means with malicious intent, but, well, maybe. But still. And,
you know, it was obvious, like, okay, we're in
close physical proximity, we're sharing the same, like, Wi-Fi network.
It makes sense, you know, we're together twelve

(02:56):
hours a day. But, you know, that's when I started
really, I don't know, noticing how harmful just being
on the internet is, or even having a
cell phone. Even back in, like, the nineties and the
early two thousands, sex workers were getting their bank accounts closed for doing
sex work, which, like, how did they know? You know,

(03:19):
even, like, before cell phones, how the fuck would,
like, Chase Bank know that the money you're depositing is
from sex work? Like, okay, you know,
if you're depositing a certain amount of money, in a
certain, uh, like, size of bill, at a certain ATM,
at a certain time, then, like, you're either selling drugs
or you're selling sex.
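
To make the kind of pattern-matching Olivia is describing concrete, here is a minimal sketch of a rule-based transaction flag. Everything in it, the fields, thresholds, and function names, is hypothetical, not any real bank's system; the point is that a crude heuristic like this sweeps people up regardless of what they are actually doing.

```python
# Hypothetical sketch of the heuristic described above: deposit amount,
# bill denomination, ATM, and time of day combine into a "high risk"
# flag. All thresholds are invented for illustration.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CashDeposit:
    amount: float        # total deposited, in dollars
    denomination: int    # size of the bills, e.g. 20 for twenties
    atm_id: str
    timestamp: datetime

def looks_high_risk(d: CashDeposit) -> bool:
    # Large cash deposits in small bills at off-hours get flagged,
    # whether the depositor is selling drugs, selling sex, or neither.
    # The rule cannot tell the difference.
    off_hours = d.timestamp.hour >= 22 or d.timestamp.hour < 5
    return d.amount >= 1000 and d.denomination <= 20 and off_hours

# Example: a 2 a.m. deposit of $1,500 in twenties trips the flag.
deposit = CashDeposit(1500.0, 20, "atm-042", datetime(2023, 6, 7, 2, 0))
print(looks_high_risk(deposit))  # True
```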

Speaker 1 (03:39):
Yeah, I've read your piece, or I've read pieces,
about, like, being banned today from platforms like Venmo
or even Grubhub. It's like, I'm just trying to
get a chicken sandwich. Like, I'm just trying to
get lunch. And I guess I wonder: a
lot of the people that I talk to on this show
are marginalized people who have been raising the alarms
about things on the internet, and oftentimes it's like, this

(04:02):
harm is going to impact a marginalized group, and then
it's going to impact everybody. Is there a vibe that
you feel where it's like a constant kind of "I
told you so," or a constant kind of "I
hate to be right all the time," where folks engaged in sex work
are, like, constantly talking about these increased levels of
surveillance and digital judgment, and have been, kind
of, for a while?

Speaker 3 (04:24):
Oh, absolutely. And, like, you know, I used to really
love being an "I told you so," because I'm, like,
kind of an asshole, but, like, I don't love it now;
I hate it. I'm working on an article right
now where, like, canary in the coal mine is the
metaphor that's usually used, but, like, that depends on the
miner actually listening to the canary and getting the fuck
out of the mine, which doesn't happen. It's more of a

(04:44):
like, Cassandra kind of situation, where we're like, God. And,
you know, people figure we're making it up. So, like,
I got banned from DoorDash about a year ago,
and I tweeted about it, it went, like, semi-viral, and
a bunch of sex workers in my replies were like, oh
my god, this happened to me, like, what the fuck?
And a bunch of non-sex workers were like, well, how

(05:07):
do you know? How are you sure? Like, maybe it
was something else, like, did you violate terms of service
or whatever? And, you know, it was abundantly obvious
to me, because I've also been, you know, I've been
kicked off that, I've been kicked off Cash App, I've
been kicked off FetLife, which is weird because that's, like,
a kink site, really. I've been kicked off, you

(05:33):
know, TikTok, like, a bunch. My TikTok
account has been suspended, like, I think, twice, and I've
never even posted anything on TikTok, ever, like, at all.
So it's clear to me that there's some type of
data sharing happening, whether it's, you know, that you're on
the same device, whether, you know, other whatever details, I
don't know. I'm not that kind of doctor.

Speaker 3 (05:57):
You know, an algorithmic sweep of high risk account then
sex workers are going to get caught up in it, which,
like I mean, also, I feel like responses, at least
to my personally, losing door Dash was kind of like like, oh, well,
I'll make you a new account, or like cow, could
I send you a sandwich? Like no, that's not the point.
The point is that like fucking door Dash is like

(06:21):
somehow privy to this information. It's frustrating, I guess to
I mean, and I don't know if it's necessarily gaslighting
because like I guess, people really do believe that I'm
you know, making it up or whatever, or like this
can't be true. You know, door Dash isn't. Well, actually, no, DoorDash.
I remember when this went viral and the New York

(06:41):
Post reached out to DoorDash. They were like, no, we
would never So that's gaslighting. But yeah, I don't know,
it's it's Yeah, it's frustrated.

Speaker 1 (06:50):
It's one of those things where people really have to
listen to and center marginalized people,
sex workers very much included,
when thinking about the future of the Internet. And
so much of your work, like, speaks to that
and looks to that, and, like, you know, when we're
thinking about the kind of Internet that we want to have,
making sure that those voices are centered. Like, you know,
when we have legislation like the Kids Online Safety Act,
that, you know, has so much bipartisan support, when you
hear things like, oh, this is a law to legislate

(07:22):
the Internet in order to protect
kids: what comes to your mind? Like, what
do you think when you hear stuff like that?

Speaker 3 (07:28):
I mean, you know, if you had asked me
that question six years ago, I'd be like, we should
protect kids. But literally nobody, at least nobody in Congress,
gives a single shit about protecting kids. And, like, half
these people were in, like, the Epstein book. They
do not care. It's clearly, like, a smokescreen for

(07:50):
increased surveillance. To me, like, that's all I think at
this point. I do not believe anyone. I mean, well,
I mean, we've seen it, you know, in other sectors,
like the, like, rising homophobia that I personally haven't seen
since, like, two thousand and four. That's also, you know,
protecting kids from, like, groomers or whatever the fuck.

Speaker 2 (08:11):
You know.

Speaker 3 (08:11):
It's just, I don't even think that the people
making those claims at this point are doing it,
like, I don't think they're doing it in good
faith, period. You know, this is always, and, you know,
it's not necessarily about sex workers, but, yeah, this is
always about increased surveillance. They do not care about kids.
If they cared about kids, they'd be working on guns.
Are they working on guns? No. So, like, yeah,

(08:34):
it's, and it keeps working. Also, like, I mean,
I know I'm relatively quite privileged. I'm a
white woman with a PhD; that's pretty high up there.
And I often get pushback that's like, you're privileged.

(08:55):
And I'm like, well, yeah, exactly, and you're still not listening.
I mean, imagine if I weren't; then we
wouldn't even be having this argument, because you wouldn't,
like, deign to bother with me.

Speaker 1 (09:12):
Let's take a quick break.

Speaker 2 (09:24):
And we're back.

Speaker 1 (09:26):
Roe versus Wade was overturned almost a year ago, and
since then we've all had to navigate what online privacy
really means when simply accessing information online about abortion can
be used as evidence to put people behind bars. Now,
this is something that sex workers know all too well.
That same vast network of online surveillance and criminalization used

(09:47):
to target sex workers also threatens people who need abortions.
In a piece for Wired called "Are You Ready to
Be Surveilled Like a Sex Worker?", Olivia points out that
this post-Roe world is simply the next step in a
larger campaign to expand state surveillance and erode the
right to privacy, a campaign that sex workers have been
fighting for decades. Do you see a connection between sex

(10:10):
work and abortion rights as it pertains to the Internet
and the way it's legislated?

Speaker 3 (10:14):
Oh yeah. I mean, just, like, the ways that they're tracked. Like, okay,
so this is going to be an underground economy; you're
going to have to use cash. They're also going to
be monitoring how you use cash. You know, your,
like, geolocation will be weaponized; you know, like, facial recognition
technology at, like, traffic stops might be used to

(10:36):
identify you. Like, shit that you thought was private isn't,
in a lot of the same ways; no,
almost all the same ways. And I mean, it's all
just, you know, trying to restrict what women do
with their bodies, you know, and, like, regardless of whether that,

(10:58):
you know, is the actual outcome, I think that's the
intent, on one hand. On the other hand, I'm like, I
also don't care what the intent is,
because, like, I don't want to waste my time
being like, but why is fascism? Like, not the point.
But yeah, no, I see them as entirely connected. I mean,

(11:19):
same with the, like, anti-trans legislation. It's
all about restricting what people do with their own bodies.

Speaker 1 (11:28):
It feels like we're in a much more dire place,
because the last time, you know, before, like, in
the seventies, before Roe was the law of the land,
we didn't have this, like, vast surveillance network.
We didn't all carry GPS devices in our pockets.
And I feel like one of the ways that we're
kind of worse off today is that there has been
this sort of piece-by-piece, tacit
normalization of that,
where I think that we have this relationship with tech
companies where, like, it's fine that they surveil us,
and actually maybe it's good, like, I have nothing to hide,
I'm never gonna do anything wrong, right? Where we
don't even think about it anymore. Like when Ring released
that show, Ring Nation, funny videos that
you get from your Ring camera.

Speaker 1 (12:15):
I think it's meant
to signal to us that, like, this vast surveillance is
actually good because it creates funny moments that you get
to watch on TV, so you shouldn't really think that critically
about it.

Speaker 3 (12:27):
Right, no. And I mean, I think that's
also tied up with the way that, like, what I've been
calling the clout economy kind of works, where,
I mean, thank god I didn't tweet in twenty thirteen,
I think, when I got on Twitter, but, like, you know,
stupid shit that, like, you know, whatever, we've all said,

(12:49):
stupid shit that we don't need to have on the internet,
that just kind of follows you. It also makes me
think of, like, the GDPR; here, there's no right to
be forgotten. There's no grace if you, I don't know,
do something stupid, and, you know, everything is so bad
faith in a way.

Speaker 1 (13:11):
I myself barely show up on social media because I
don't want to deal with it. Obviously
there are, like, egregious cases, but, for the most part,
you know, you and I are the same age. We
have grown up online, and so we were
products of our age, products of our environment,
products of, like, the social and political
climate at the time. I don't think this is the
internet landscape that we want.
I don't think it's the internet landscape that actually fosters
things like good-faith disagreements, conversation, learning, you know, fucking
up in public and being like, oh, I, you know,
messed up, I learned something. I don't think we
have an internet climate that welcomes any of that. It's
just an ocean of bad-faith attacks, like a minefield that you
have to navigate through, not something that you can actually show up

(13:58):
to and say, you know, I learned something.

Speaker 3 (13:59):
Right. And, like, we're living in
this fascist hellscape; of course it makes sense that
you want to, you know, unleash that rage somewhere.

Speaker 1 (14:08):
Some people are very committed to weaponizing anything, and you're
someone who is quite visible on social media. As
we're in this, like, weird place with Musk owning Twitter,
how has it impacted how you
show up on Twitter, as someone who is so visible?

Speaker 3 (14:24):
Well, you know, I recently went face-out, maybe, like,
two or three weeks ago, after, like, long discussions with
my therapist. But, like, you know,
until I got doxxed, I was meticulous about not
even sharing my time zone, not saying anything about the weather,
not saying if I was taking, like, a train or
a bus or a car, or maybe I would, like,

(14:45):
say I was taking a car, because I don't own
one anymore, so that people wouldn't be able to track
me down. And at a certain point, you know, it's
horrifying to be doxxed, obviously. Like, I've had to move twice,
but, like, my parents found my dox, which is
why I haven't, like, spoken to them in, whatever, a
year now, which was all kind of a blessing in
disguise, because, like, fuck those people. I don't want

(15:06):
to say liberating, because it's not, it's horrible, but
knowing that there's nothing you can hide, like, or, yeah,
we were saying, like, oh, I don't need to hide
anything; no, I actually do need to hide a
lot of things, like my address and how I
make money. But having that just kind of pulled from you,

(15:30):
I feel like it doesn't make things easier, but
there's a whole level, there's a whole thought
process that I would have to go through, to be like, oh,
did I, like, erase the metadata? Did I crop out
the time zone? That I don't

(15:51):
need to do anymore. And that's weirdly, well, I guess
not weirdly, but that's, you know, it freed up a
lot of mental energy I didn't realize I was spending.
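
The step Olivia mentions, erasing the metadata before posting, is a concrete operational-security practice: photo files carry EXIF tags (GPS coordinates, capture time, device model) that can deanonymize whoever posts them. Here is a minimal sketch of scrubbing those tags with Python's Pillow library; the function name and file paths are illustrative only.

```python
# Minimal sketch of the metadata-scrubbing step described above, using
# the Pillow imaging library. Copying only the pixel data into a fresh
# image discards EXIF tags (GPS location, timestamp, camera model)
# that could otherwise identify or locate the person posting.
from PIL import Image

def strip_metadata(src_path: str, dst_path: str) -> None:
    with Image.open(src_path) as img:
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))
        clean.save(dst_path)

# Hypothetical usage: write a scrubbed copy before posting it anywhere.
strip_metadata("photo.jpg", "photo_clean.jpg")
```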

Speaker 1 (16:01):
Yeah. I mean, I don't share anything that is really,
like, truly personal, like, in a meaningful way, with the
internet, because I don't have it in me.
I feel like there's a cost, particularly
for women, particularly for women who are engaged in
sex work. I think there's a cost, that

(16:24):
other people don't have to carry, to being someone who's
visible online.

Speaker 3 (16:29):
But what's, you know, what's most difficult for me, I
think, is that, like, you know, if they want to,
like, expose my shit, like, whatever, go ahead. But the
people close to me, like, you know, my grandfather, for instance,
or, like, partners, ex-partners, friends. I mean,
my partner was doxxed, and I still don't know how
those motherfuckers figured out that I was dating, period, let

(16:50):
alone who I was dating, excuse me. And, you know,
that too, which is just fucking horrifying, but it also made me think,
you know, like, the person I'm dating now is able
to deal with that horror more than, like, some of
my exes that I'm still friends with. And, you know,

(17:12):
having to just carry the burden of protecting other people's
safety and other people's privacy, when my own has just been
taken, is a lot.

Speaker 1 (17:23):
But I mean, that's, like, that's how it works.
It's never just the woman:
it's her partner, it's her mom, it's her community. Like,
that's part of the way that this kind of harassment functions.
It's like, I'm not going to just come after you;
I'm going to make there be a cost for anybody
associating with you and your community.

Speaker 3 (17:43):
Right. Or, like, when I've done, like, panels, or, you'll
probably get this when you upload this in however
many weeks or whatever: people will be like, hi,
I just wanted to let you know that Doctor
Olivia Snow is actually a fascist, and they'll probably, like,
send you some deranged Twitter threads about how I'm,
I don't know, drinking babies' blood, because that's, yeah.

Speaker 1 (18:05):
Mean don't like I have gotten I hope this doesn't
sound weird. I have gotten only once a DM from
somebody I didn't know. Just so you know, you're following
this person who was like, kicks puppies, kills babies, and
I didn't reply because I.

Speaker 2 (18:22):
Have a blanket rule. I don't engage with that.

Speaker 1 (18:25):
If you're going to tell me that someone was violent
toward you or something, that totally different story, right, I
don't engage with this attitude of how could I cut
this person down by making them lose followers on Twitter.

Speaker 2 (18:38):
I'm not going to engage with it. Not interested.

Speaker 3 (18:42):
You know, years ago, I might have been like, oh,
that's fucked up, I won't follow them or whatever. But,
you know, now, having had these, well, I mean, you know,
years ago, I guess I didn't have the visibility on
Twitter that I have now, so I guess it's kind
of a moot point. But, like, you know, unless you're
telling me that, yeah, like you said, like, someone
was violent to you specifically, or you can give me
some, like, concrete shit that I can look at and

(19:03):
be like, oh wow, Jesus Christ, right. And, yeah, even
then, you know, if it's something like a
tweet that's taken out of context from ten years ago,
that's, I just, this is what you're spending your
time on?

Speaker 1 (19:15):
Really? Exactly. Like, I am not in the business of,
like, legislating people's online grievances. And
I think it makes the Internet worse for everyone
when that's how people, like, perceive it, when that's how
people engage with it, you know what I mean?

Speaker 3 (19:32):
Yeah. I find that people tend to reach out to
other marginalized people specifically to try to, you know, warn
them that I'm, you know, sexist, or homophobic, or transphobic,
whatever the fuck, in a way, like, knowing that I
am a part of a lot of these communities that
they're trying to just sever me from.

Speaker 1 (19:58):
More after a quick break. Let's get right back into it.
Last month, Elon Musk overhauled Twitter's verification system and ramped
up Twitter Blue, the eight-dollars-a-month subscription service

(20:21):
that gives users access to perks like being able to
post longer tweets and videos, and that gives the tweets of
subscribers increased visibility. Thinking that anybody who spends money on
Twitter Blue is financially supporting Elon Musk, some users
launched a short-lived campaign in response to block anyone with a
blue check mark. Now, Twitter is pretty much the only

(20:41):
mainstream social media platform that will sort of allow sex
workers to show up there. Adult content is not explicitly
banned on Twitter like it is on Facebook, Instagram, and TikTok,
and since some of the perks of Twitter Blue, like
being able to post longer videos, could be good marketing
for sex workers, it kind of makes sense that many
of them would choose to stick around on the platform

(21:03):
and pay for Twitter Blue, even if it's not an
endorsement of Elon Musk or the way he runs the platform.
So a campaign to block anyone with a blue check
mark on sight is hostile to a community of people
who already face digital hostility all over the internet. Thinking
about Twitter and your relationship to it: when they rolled
out the, like, Twitter Blue subscription, there was a whole

(21:25):
campaign of, like, block people with blue check marks.

Speaker 1 (21:28):
But then, it was like, you know, there's Twitter.
It seems to be a platform where a lot of
sex workers do show up for their businesses, to make money.
And so, yeah, if you're blocking all of them,
you feel like maybe you're getting one
over on Elon Musk, but who you're actually harming is
sex workers who need, you know, to make livings.

Speaker 3 (21:50):
And people are like, oh, you're giving eight dollars to
a fascist. Like, do you know how much money this
motherfucker has? Eight dollars is a drop in the bucket.
Really ridiculous. One of the reasons I'm so active on
Twitter is it's the only platform that tolerates sex workers.
Instagram hates us. I mean, I'm not on Facebook, period.

(22:11):
I think I deactivated when I noticed that all
my coworkers were on there. TikTok, obviously, like, I don't even
use it and I'm not allowed on it. So, yeah, no,
Twitter is the one platform where we can exist as
sex workers and not have to worry about getting booted
at the drop of a hat. I mean, we still
have to worry about getting booted, kind of at the drop,

(22:33):
but it's, like, slightly less bad. But yeah, no. And
I mean, it turns out, I think, that the
Twitter Blue subscription that, like, a lot of sex workers
were getting didn't end up being that profitable, probably in
large part because of the mass blocking, which, of course,
algorithmically, each block deboosts your visibility by, you know, however

(22:55):
many points. I've had people reach out to me
to be like, oh, so-and-so is dangerous or whatever,
and in almost every case it's been trans women.
And, like, are they dangerous, or do y'all really hate
trans women? Because there's a pattern here, and the pattern

(23:17):
is not "they're dangerous." Like, so, you know, with
the Block the Blue thing, you know, are you really
worried that people are giving money to a fascist? Like,
what is he, the second richest person on the planet

(23:31):
now? Like, really, this is who you're going after,
and you're not trying to get, I don't know, the
IRS to collect from him? Like, really, this is your priority?
And are you fighting fascism, or sex workers?

Speaker 1 (23:47):
I mean, it sounds like even if it's not based
in hating sex workers, which likely a lot of
it is, I think it's also about, like, not thinking
about sex workers, not being able to
see sex workers as people whose perspectives matter, who

(24:07):
have a right to show up in an online space.

Speaker 3 (24:08):
Right. Or, like, who are you okay with being
collateral damage?

Speaker 1 (24:12):
That's exactly it. That's such a good way to put it.
I think that's exactly it, that, like, well, they don't
really matter.

Speaker 3 (24:17):
You know, right. Or, like, well, we can't just, you know,
stop fighting fascism so that sex workers can live. Which,
second of all, is beside the point, you know; you're just undermining
your own point, number one. But, like, I don't know.

Speaker 1 (24:36):
When Olivia is asked how we build a safer Internet,
she bristles: how can we have a truly safer internet
when it's born from a society that is often not safe?
As Dr. Safiya Noble, author of Algorithms of Oppression, a
groundbreaking work that confirmed that search engines are built with
the same old race and gender biases baked into their algorithms,

(24:56):
has noted, technology is not neutral. It reflects the same
dynamics that exist in society, and the biases of the people
who create it. And this is pretty clear when it
comes to AI. Back when the AI image generator Lensa
was taking over everyone's social media feeds, Olivia found that
the program generated non-consensual sexualized images of her, even

(25:20):
from images of her as a child, and she sees
the way that this will be used to harass, especially to
harass people who are already marginalized online, like women of color, children,
queer folks, and sex workers. She writes: "This horror story
that I just narrated sounds too dystopian to be a
real threat. But as I have also learned through my
own endlessly revolving door of cyberstalkers, no amount of exonerating

(25:44):
evidence is sufficient to quell a harassment campaign. Coordinated harassment
is already unfathomably effective in silencing marginalized voices, especially those
of sex workers, queer people, and Black women, without
AI-generated revenge porn. And while the technology may
not be sophisticated enough to produce convincing deepfakes now,
it will be soon. Your photos will be used to
train the AI that will create magic avatars for you,
and for only three ninety-nine a pop."

Speaker 1 (26:14):
You talked about
how Lensa, like, non-consensually sexualizes people, even, like,
images of them as children. And so I guess my
question is, like, what do you think about this time
that we're in, where everybody's talking about AI? It seems
like if you scroll Apple Podcasts' tech charts, every podcast
is about AI.
We're having so many conversations about how quickly it is
developing and how it's going to change everything, blah blah blah.
As someone who thinks about the
intersection of gender, sex, and technology, where do you think
we are headed?

Speaker 2 (26:52):
To hell. Directly.

Speaker 3 (27:01):
Like, you know, with that Lensa piece, I didn't reach
out to Lensa, because I didn't even know to. I'm not
trained as a journalist, so I'm just kind of, like,
I don't know, I don't have a transcript of the interview, sorry.
But I didn't reach out to them. I think Jezebel did,
and they were like, hey, what do you think of this,
like, it's creating, you know, child sexual exploitation material? And Lensa's

(27:23):
response was something like, well, that person should look into
the laws in their jurisdiction. And they're always, like, kind
of degendering me, like, "that person," "them," and I'm like,
I have a name, and, whatever. Anyway. And, you know,
because they might be, like, subject to certain penalties for

(27:46):
creating this, uh, like, the problem.

Speaker 2 (27:51):
They created it. Like, you didn't create it.

Speaker 3 (27:53):
Right, yeah. And they're like, well, we have a
policy that says no photos of kids, and I'm like,
uh, like, even if I wasn't. So, the reason
I even thought to put in those childhood photos is
because I was just, like, doing experiments, digging around.
So I first ran it with, like, just
random pictures of myself, and I thought that the results

(28:15):
were, like, kind of, I was like, oh, these
are pretty neat. And, you know, knowing that my face
has been, like, circulated without my consent, I'm like, oh no,
they're gonna have my face?
Then I ran it a second time, using, like, pictures
where I thought I looked hot, and that was where
it was creating nudes, non-consensually, and, you know, nudes

(28:40):
being another thing that you weren't supposed to submit to it.
So I ran it again. Did I run it, or did I pay for it?
I tweeted about it, and someone sent me the money. So
there were three different gender options. It was, I think, female, male,

(29:00):
and, I want to say, other, right? So I ran
it through the female one, like, twice. I ran it
through the male one, thinking, like, well, it'll give
me, like, a beard or something, and it did, maybe
on, like, two pictures, but the rest was just
still, like, hyper-sexual, just in, like, slightly different ways.
And then I ran it on the, like, other gender option,
and it made me look like a child. You know,

(29:22):
I was, like, uploading pictures of me now, like,
in my mid-thirties, and it spit out pictures where
I looked like a teenager. And I think it was
because the algorithm, which I just, like, hate saying,
"the algorithm," that sounds like the Illuminati, but the
algorithm's way of understanding not-male, not-female

(29:43):
is just, like, totally desexualized and, like, infantilized, which is,
you know, why I suspect that it, you know, made
my adult face, not body, but face, look like that of a child's.
I think of, like, Safiya Noble's Algorithms of Oppression,

(30:03):
right, that, like, the Internet reflects back, like, our absolute
worst tendencies, and not just because people are bigger dickheads
on the Internet than they are in real life, though
that, you know, certainly doesn't help, but that,

(30:24):
you know, what's getting input to the Internet is,
like, the biases that our culture
actually has. And our culture, like, our society, our societies,
are disgusting. We are racist, we're sexist, we're homophobic.
We're awful, like, just as a

(30:47):
species. And the Internet, you know, doesn't
make value judgments. It just recognizes patterns, and what it's
going to spit back out at us,
especially with machine learning algorithms that are trying to, you know,
tell us what they think we want to hear, is
going to be, like, the absolute worst stereotypes. So, you

(31:12):
know, Lensa, like, sexualizing women? Like, well, yeah, of course
it did, because we are a disgusting society that sexualizes
women and children.
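
A toy sketch of the mechanism Olivia describes here: a system that only recognizes patterns reproduces whatever skew its training data carries, with no value judgment involved. The data and labels below are invented for illustration; real image generators are far more complex, but the dynamic is the same.

```python
# Toy "model" whose learning is just counting labeled examples.
# Skewed inputs produce skewed outputs: no malice required.
from collections import Counter

# Hypothetical training pairs: (subject, how the culture tagged them).
training_data = [
    ("woman", "sexualized"), ("woman", "sexualized"), ("woman", "neutral"),
    ("man", "neutral"), ("man", "neutral"), ("man", "sexualized"),
]

def most_likely_label(subject: str) -> str:
    # Pick the label seen most often for this subject in the data.
    counts = Counter(label for s, label in training_data if s == subject)
    return counts.most_common(1)[0][0]

print(most_likely_label("woman"))  # -> "sexualized"
print(most_likely_label("man"))    # -> "neutral"
```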

Speaker 2 (31:22):
Yeah.

Speaker 1 (31:23):
And it's sad for me, because I think there is
definitely a time where, like, AI is so powerful that
I would imagine it could reimagine worlds without those
kinds of harmful, like, stereotypes, and racism, sexism, homophobia,
transphobia, all of that.
I would like to imagine a world where it is
not just simply recreating these things
and re-establishing these things.
But I feel like it's so clear that that's not,
it'd be a nice fantasy, but that's not what we're
getting.

Speaker 3 (31:55):
Right. Well, and, like, you know, there are some
systems or features, like some content moderation, in
place that, you know, prevent, like, just straight-up slurs
popping up in something like ChatGPT. But, I mean,
that's obviously a band-aid; that's not going to
fix, like, all the microaggressions inherent in

(32:20):
all of this shit. And, you know, I
was asked a few months ago to give a talk
on, like, how to have a safer Internet, and I
remember thinking, that is a ridiculous question, because
you cannot have a safer Internet until we have a
safer society, because the Internet is just a reflection of that,
then intensified. So, like, sure, you can create some kind

(32:41):
of AI to identify and take down, like, child sexual
exploitation material, but is that going to keep kids safe? No,
like, not at all, you know.

Speaker 1 (32:53):
And it's funny, because, you know, your experience with Lensa,
with them saying, well, you could be facing, you know, some sort
of punishment for it.
It's like, even if Lensa, the app, even if
the tool, has a rule against uploading pictures of children,
if there's no safeguard, if you're able
to do it, in what way is it really a rule?
Right? If you can do it and still get those images,
how is it a rule if there are no safeguards?

Speaker 3 (33:20):
Right. And with them, I was like, I'm not, you know,
writing this to, like, embarrass you. It's not, like, a gotcha.
It's like, hey, this is violent, maybe you should work
on making it less violent. And of course the response
is, like, let's be more violent. Like, I'm not trying to,
you know, like, ruin your business model. I'm trying to

(33:41):
be like, hey guys, this is actually really dangerous.
This is really dangerous. Maybe we should work on making it
less dangerous. And, you know, I get being defensive,
you know, like, I've certainly been accused of being,
you know, a trafficker or whatever on Twitter, because everyone's
a fucking idiot. Like, no one likes getting that,

(34:01):
you know, pointed out to them, especially in a public forum,
but you're, like, missing the forest for the trees here.

Speaker 1 (34:09):
Yeah, are you concerned about things like AI-generated deepfakes,
and, like, a marketplace for them?

Speaker 3 (34:14):
Oh yeah. Oh yeah. The only thing that
gives me some faint, horrifying glimmer of hope is
that that technology is just becoming more and more widely available,
and I wonder if, like, deepfake revenge porn will
just become so ubiquitous that the, like, first response will

(34:38):
be, like, oh, that's deepfake revenge porn, instead
of, like, oh, that's real. Right? Yeah. Oh, absolutely,
and I mean, it's already become a problem. Yeah, I
remember in that Lensa piece, I wrote something like,
you know, within, like, four to six months, deepfake
revenge porn is gonna be everywhere, and then, lo
and behold, four months later, there are, like, streamers and shit

(35:00):
just getting harassed with this. And, you know, revenge
porn is interesting too, and I think this goes back
to, like, why people like fucking with sex workers, and
just, like, marginalized people in general: it's
not, necessarily, it's not physically violent. You're not, like, punching
someone in the face. But you

(35:20):
are, I mean, even beyond, like, damaging the reputation
and damaging the career, the, like, psychological warfare of some
of this shit, where it's just a constant environment of fear,
not even of what you're actually doing, but of
what you could be perceived as doing. I mean,
I see very little benefit to AI, if any, considering

(35:48):
its potential for harm.

Speaker 1 (35:50):
Yeah. I personally have a little bit of trouble wrapping
my head around all the different conversations about AI.
What is PR? What is true? What is, like, doomism?
What is marketing?
How do you think AI is
going to shape the experience of marginalized people ten years
down the line?

Speaker 3 (36:09):
I mean, I think that, like, there are some things
AI is, like, actually good at, and it's, like, stupid,
mundane office shit. Like, I need to come up with
a course description for a class, and I've
been meaning to just get on ChatGPT and be like, hi,
ChatGPT, give me a course description for blah
blah blah, and then, you know, tweak it or whatever.
But, like, you know, writing, like, mundane emails and shit,

(36:30):
like, I can't even imagine the number of hours I
spent in grad school thinking, like, should I sign it
with "best" or "sincerely"? Like, that kind of shit, you know,
I think AI is good for. But, you know, I
cannot imagine, without restriction, what will, like, I,

(36:56):
maybe I could, but I, like, don't want to, you know. Like,
you know, I don't think that AI, I know AI
is not sentient, I think that's a ridiculous argument. But
the way that some of this gets coded gives AI,
not the agency, but, like, doesn't restrict it from doing,
like, wildly harmful shit. And, you know, I really can't

(37:20):
see a functioning society if AI continues at the pace
that it is. I mean, and of course that
rests on the assumption that we're currently in a functioning
society, which is questionable. But, no, I really can't think
of how that could possibly be a reality that anyone

(37:42):
could live in.

Speaker 2 (37:43):
I mean, we're in this very weird moment in technology
right now.

Speaker 1 (37:49):
Platforms feel weird; the future of platforms feels weird.

Speaker 3 (37:52):
And all of this, and Musk, forever. What the fuck?

Speaker 2 (37:57):
I know it is bad.

Speaker 3 (38:00):
I mean, just, like, the people I know personally who
worked at Twitter, who had, like, their entire lives upended,
and lost their livelihoods. I mean, you know,
which is such a, like, minor piece in the puzzle
of, like, how he has fucked up the entire fucking Internet.

Speaker 1 (38:13):
Yeah, yeah. I mean, when we're thinking about going forward,
like, why do you feel it's so important,
why is it so important, to make sure that the
voices of sex workers are, like, really at the forefront
when we're thinking about
the future of the Internet and the future of tech?

Speaker 3 (38:30):
Well, I mean, so, sex workers, A, have to be
on top of tech, because, like, one of the
first things in any terms of service is,
you know, no sex-adjacent anything. Thanks, FOSTA-SESTA.
You know, we're going to fight prostitution or whatever by
making sure that, you know, a dominatrix can't get a burrito, right? Yeah,

(38:53):
good job. Yeah. So we already have to, you know,
like, sex workers on the whole are a lot more familiar
with, like, Internet privacy, and, like, how to use VPNs,
and how to use, like, cryptocurrencies and shit. So,
like, I mean, that's the one, but the other, B,
being, like, sex workers are consistently the most loathed and

(39:16):
dehumanized people, like, period. I think of, like,
"no humans involved," what that has referred to. What
was it? How did Sylvia Wynter put it? I
think it was, like, Black boys without jobs, undocumented
people, I want to say. And

(39:37):
sex workers, we're so consistently dehumanized. We're
not seen as people. If we are seen as people,
it's only for, like, sadistic purposes that we are
seen as such. And we're, you know, consistently
perceived as a safety threat. So to keep ourselves safe,
we have to, you know, take these, I want

(40:01):
to say, almost bizarre measures to, like, stay
safe in an environment that perceives our existence as a threat.
And, you know, I think what is unique
to sex workers in that respect is having to be,
like, a step ahead of the curve, and,

(40:22):
like, actually being specifically prohibited by policy. Because, you know,
there are all kinds of demographics that are fully dehumanized,
but, you know, sex workers aren't seen as, I
guess, like, a demographic. You know, I've been trying to,
like, theorize this, where, like, the internet kind of makes
sex work, like, a fixed identity: even if you're

(40:42):
not doing sex work, you're still a sex worker. And,
you know, especially with the way that, like, search
engines and shit are, you know, I don't know
if I would still be in sex work if I
hadn't been doxxed by my full name, which is ethnic
as shit. I mean, I'm the only one on the planet with
this goofy-ass Polish name. I don't know if I'd

(41:04):
still be doing it. But at this point, I'm like,
it's kind of a waste, you know? Why would
I be applying for academic jobs in fucking literature when
they're going to Google me and find out that, you know,
I was a dominatrix in twenty nineteen? I might as
well be making money off of that, I guess, because,
you know, now I'm always going to be a dominatrix.
That is always going to be online. The phrase

(41:24):
"sex work is work," too, has been kind of
misunderstood and misapplied. "Sex work is work"
means, or should mean, that sex workers deserve labor
protections like any other worker, or workplace safety protections, or
that sex workers who, you know, experience violence should
be able to, you know, report it, to be protected

(41:44):
for it or from it. But instead, I
feel like the popular view is, like, sex work is
just another job, like working at, like, the mall, which
means I can treat sex workers as terribly as I
treat other low-wage workers.

Speaker 2 (42:01):
No, that is not it. Like, no.

Speaker 3 (42:04):
Like well why didn't you just get another job? And
like no, that's not what you know it, And I
get like, I think it's hard to communicate that, you know,
like horphobia is kind of like an amalgamation of all
these other like biases or like matrices of oppression and not,

(42:29):
like the stigma is so bad that in almost every scenario,
you're not going to turn to sex work unless you
have exhausted every other option, which is why you know,
sex workers are usually like working class or poor or
women of color or trans. People think that sex work
or like hor phobia is something you can like opt

(42:51):
out of, or like you could just quit being a
sex worker, versus like I can't quit being like a
Jewish woman, but like, no, you can't actually, like even
if I wanted to, no one will let me. And
I don't know, I think that that kind of nebulousness
of sex worker as an identity also makes us such

(43:15):
like uh ripe population, I guess for this kind of shit.
I saw this headline maybe like a week or so ago,
and it was like the FBI or the you know,
secret service or whatever misused some surveillance technology that was
supposed to identify January sixth Writers, which sign note was

(43:38):
my worst birthday.

Speaker 2 (43:40):
January sixth? No, Olivia.

Speaker 3 (43:49):
But it was, like, surveillance technology meant to identify
January Sixth rioters was used against Black Lives Matter organizers.
And, like, I think it said "misused." I'm like,
it wasn't misuse, because I think that's why it was made.
I think that was the point of the tech. Exactly. Yeah,

(44:13):
so, you know, of these demographics, I think Black Lives
Matter is more visible than sex workers, like, obviously,
maybe obviously, I don't know, and no one's gonna argue,
like, you could just opt out of being Black, right? Yeah.
But, I don't know, as far as voices that are

(44:38):
like a threat to, I don't want to say social order,
but, or hegemony, I don't even know what word I'm
trying to think of, but you know what I mean. Yeah,
these voices that kind of threaten the status quo
are intentionally the most marginalized, and therefore need

(45:01):
to be amplified all the more. You know, I was
in this workshop, maybe, like, two weeks ago,
and everyone's talking about, center marginalized voices, center marginalized voices.
I'm like, okay, but the thing about marginalized voices is
that they are marginalized, right? Which means no one wants
to believe them, and no one's going to listen. Like,
you could say "center marginalized voices" all you want, but,

(45:25):
does that mean that people are actually going to listen
to those voices you're centering? No. So, you know,
I don't know how to push back on
that, exactly, other than yelling on the internet, which I
do a lot of.

Speaker 1 (45:45):
Well, I always end my interviews by asking: when it
comes to the state of the internet and technology, are
you hopeful?
Do you have any hope? And if so, what is
it that gives you hope? Or is it a hard no?

Speaker 3 (45:57):
Yes. Like, shit. You know, okay, so one thing
that does kind of give me hope, and this maybe
sounds like I just don't give a shit, is that,
so, I was in this workshop,
the marginalized-voices workshop I was just describing.

(46:18):
I was invited to this workshop to work
on AI policy and, you know, figure out
what to do going forward. And it was sponsored
by the MacArthur Foundation, and, right, like,
nice hotel, for free. But the fact that they're inviting

(46:40):
a sex worker at all to something like that, and
I wasn't the only one there. I mean, as
far as, like, people you wouldn't expect to see at
these kinds of events, I think there were two,
like, a formerly incarcerated person and another

(47:02):
organizer. But, like, the fact that some voices, like the
most privileged of the marginalized, are getting listened to a
little bit makes me think that's a step in maybe
the right direction, because, you know, I don't think that's
something that was happening ten years ago.

Speaker 2 (47:21):
Yeah, take the hope where you can get it.

Speaker 3 (47:24):
Yeah, that's, you know, my one faint glimmer. And, like, of course,
I, like, always try to emphasize, when I say, like, well, it
gives me hope that I, specifically, am being listened to:
like, no, it's not about, like, me specifically, or about
sex workers specifically, because, you know, at the end of
the day, this is all going to affect everyone. So,

(47:47):
you know, the fact that somehow these, like, more
vulnerable, silenced voices are getting heard, just a little bit,
I think is hopeful, right?

Speaker 1 (48:07):
Got a story about an interesting thing in tech, or
just want to say hi? You can reach us at
hello at tangoti dot com. You can also find transcripts
for today's episode at tangoti dot com. There Are No
Girls on the Internet was created by me, Bridget Todd.
It's a production of iHeartRadio and Unbossed Creative. Jonathan Strickland
is our executive producer. Tari Harrison is our producer and
sound engineer. Michael Almado is our contributing producer. I'm your host,

(48:29):
Bridget Todd. If you want to help us grow, rate
and review us on Apple Podcasts.
For more podcasts from iHeartRadio, check out the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.