
February 24, 2024 69 mins

Here’s Why Everyone’s Watching the ‘Who TF Did I Marry?’ Series on TikTok: https://www.glamour.com/story/heres-why-everyones-watching-the-who-tf-did-i-marry-series-on-tiktok

A Marketplace of Girl Influencers Managed by Moms and Stalked by Men: https://www.nytimes.com/2024/02/22/us/instagram-child-influencers.html

Nex Benedict: Everything We Know About 16-Year-Old Oklahoma Student’s Death: https://www.teenvogue.com/story/nex-benedict-everything-we-know-about-16-year-old-oklahoma-students-death

Schumer, LGBTQ+ advocates back updated kids online safety bill: https://thehill.com/policy/technology/4469721-schumer-backs-updated-kids-online-safety-bill/ 

Don’t Fall for the Latest Changes to the Dangerous Kids Online Safety Act: https://www.eff.org/deeplinks/2024/02/dont-fall-latest-changes-dangerous-kids-online-safety-act

Joey on Stuff Mom Never Told You breaking down Kids Online Safety Act (KOSA): https://omny.fm/shows/stuff-mom-never-told-you/what-is-kosa-and-why-is-it-so-scary

Bobbi Althoff deepfake spotlights X’s role as a top source of AI porn: https://www.washingtonpost.com/technology/2024/02/22/x-twitter-bobbi-althoff-deepfake-porn-viral/ 

How Ghana's Labour Act helped laid-off Twitter Africa staff to secure compensation: https://www.ghanaweb.com/GhanaHomePage/NewsArchive/How-Ghana-s-Labour-Act-helped-laid-off-Twitter-Africa-staff-to-secure-compensation-1917450

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet is a production
of iHeartRadio and Unbossed Creative. I'm Bridget Todd and this
is There Are No Girls on the Internet. Welcome to
the show, where we explore the intersection of identity, social media,
and technology. This is another installment of our weekly roundup

(00:25):
of news that you might have missed on the Internet
this week. Joey, thank you so much for being here.
As I was telling you, I am coming to you
from my hotel room in beautiful Burlington, Vermont. I'm at
the Hotel Vermont. Shout out to them. They've been great.
Thank you for being here, of course. Okay, So we
were talking about this off mic and I was like,

(00:48):
I wonder if they have heard about this Who the f did I marry? saga on TikTok. And you were like, oh, what is this? Who the f did I marry?
So you have not heard about this at all from TikTok?
Have not, no, So let me explain to you what
is going on. So this woman on TikTok basically told
this epic story about essentially how her marriage fell apart.

(01:12):
It was in fifty parts, so fifty separate ten minute
tiktoks all together. It was five hundred minutes of viewing time,
or about eight and a half hours. When I tell
you that I watched the entire thing. I was watching
this TikTok as if I was listening to an audiobook
or like listening to a podcast where I was carrying
my phone around my apartment being like ooh, and then

(01:34):
what happened, and then what happened, and then what happened,
And so essentially her story is one where she was
engaged to marry this guy, or married this guy, and he had been telling her like, Oh, I'm the VP of a company, I'm really close to my family and to my cousins, blah blah blah.

(01:55):
And he was getting up every morning and having these like jovial conversations with his brother and his cousins and his family and his friends. And it turns out that he'd been doing this for years, but he was just pretending to have these conversations. There was never anybody on the other end of the phone, because his family had been estranged and no contact with him for years.

(02:18):
He said that he was a VP at a condiment company.
Turns out, in reality he was like a forklift operator.
He basically had invented all of these bank statements from
Chase from Chase Bank talking about like oh, like this
is how much money I have in the bank because
they were going to buy a house together. Like this
thing went deep. It was fifty separate parts. She got

(02:41):
one hundred and fifty million different people watching this story,
hanging on her every word. Uh, overnight. Essentially she got
a million followers through sharing this story. And I will say,
like people listening might be like, Wow, you dedicated eight
and a half hours of your life to listening to
the drama of a stranger, a person that you'll never meet. Yeah,
I did, because she is a very good storyteller. I

(03:03):
was hanging on every word. So I guess my question
to you to start off is this: would you air
out some truly wild personal story or personal drama if
you knew that it would equal like overnight social media fame.
Because right now this woman, she's in talks with
like big brands and like people are like, oh, Shonda

(03:25):
Rhime should make her life into a movie, like she
like she's made it. Would you do that if you
knew it was going to make you like famous overnight.

Speaker 2 (03:35):
Damn, I want to say no, just because I feel like,
you know, it's kind of a double-edged sword. Like, I feel like people are gonna turn on you really quick,
like something's gonna happen.

Speaker 1 (03:49):
I don't know.

Speaker 2 (03:50):
Uh, damn, that's crazy though. I feel like I wouldn't want to admit to the public that I fell for that. Like, I'd feel bad. Which, no, no shade to her like whatever.

Speaker 3 (04:04):
But I'm kind of like I feel like I'd be like.

Speaker 2 (04:07):
I wouldn't want people to know that I fell for that, that I didn't even google the guy or whatever.

Speaker 4 (04:10):
But yeah, at the same time, that's crazy. I don't know.

Speaker 1 (04:17):
I feel like I will never truly know social media fame or infamy, because I don't have this kind of... I couldn't. There's
nothing in my life that I could do a fifty
part TikTok on. I don't have personal drama. There's nothing like wild going on in my life, I guess, thank God.
But like part of me is like, damn, do you

(04:38):
really have to like have some wild stuff happening in
your personal life and be willing to like really air
out all the details in order to find some kind
of social media success.

Speaker 2 (04:50):
Okay, so my one point of reference is that I
had somebody try to do this to a friend of mine, which.

Speaker 4 (04:59):
It wasn't successful.

Speaker 2 (05:00):
Also, like ninety percent of it was feigned. It was a weird, like, it was a weird situation.

Speaker 3 (05:06):
But I don't know, I.

Speaker 2 (05:09):
Personally, I feel like I see this a lot on TikTok, like,
especially when you get into like more niche TikTok communities too,
Like there's always some sort of personal beef between creators
that like blows up and like becomes a huge thing.
And I'm like, when you step back, it's sort of like, whoa, this is not any of my business. Like, what is going on?

Speaker 1 (05:31):
Yeah, I feel the same way. And it's funny. When
I first started making the podcast, it was very important
to me that someone listening to the show could never
be like, oh, it's just a summary of like TikTok
beefs or Twitter beefs or who's canceling who on Twitter?
And I see so much of that. But, like, I'm not gonna sit here and pretend like I'm above

(05:53):
finding the interpersonal drama and dynamics of strangers interesting because
lord knows, I'm fascinated. But you know, it's out there. If your friend wants social media fame and can, like, tell the story over fifty very compelling TikToks, now is the time. There's a hunger for it.

Speaker 2 (06:13):
Dude, that's the thing. Like, you know, the last time I was on, we were talking.

Speaker 4 (06:17):
About the like lesbian bar drama.

Speaker 2 (06:19):
And it's funny because I feel like that is the version of, like, where it backfired. Like, I think she was doing this.

Speaker 4 (06:25):
I think she thought she was like.

Speaker 2 (06:27):
Getting her beef out there and using it, I don't know, but like, I think she was intending to use it for personal gain, and then it backfired in a way that made her look bad.

Speaker 4 (06:37):
And that's why I don't know.

Speaker 2 (06:39):
I feel like there'd be some sort of, like, detail I missed that everybody would go after me for, you know, right, right.

Speaker 1 (06:45):
Like, oh well, Joey didn't consider this.

Speaker 4 (06:47):
Were telling the story like house are.

Speaker 1 (06:51):
Yeah, you live by the story, you die by the story. You gotta be careful airing out your beef on social media. It's great advice. Hey, so this is future Bridget. The New York Times has published
a groundbreaking story today looking at so called insta moms
who run accounts for their very young minor daughters. The
piece was published after Joey and I had recorded this episode,

(07:13):
but I still really wanted to let y'all know about it,
which is why I'm popping in here. Just a heads up:
This is dark. Basically, parents, mostly moms, to be honest,
are running Instagram accounts for their minor daughters who are
as young as five years old. Now. Some of these
accounts are ostensibly about showcasing their daughter's modeling or competitive
dancing or gymnastics. The accounts show their minor daughters in

(07:36):
dance or gymnastics apparel, which can be skimpy or revealing,
and you know, might look one way in a dance
studio or on a dance stage, but looks very different
when posed in photos on social media. It's complicated because
some of these moms may genuinely feel like running these
kinds of accounts for their daughters could be good for
a future career as a model or an influencer. Some

(07:57):
of the moms are getting free or discounted clothing
and are hopeful about brand deals from these accounts. It's
gross because at least some of these moms who run
the pages really do seem like they think this is
going to be a good thing for their daughters or
their families financially, but maybe don't seem to be that
concerned with what it's actually doing to their daughters, Like
becoming an influencer cannot be worth this because for a

(08:20):
lot of these accounts, the followers aren't just brands or
well-meaning family members. The followers are grown men, strangers,
creeps who leave really sexually inappropriate comments on the pictures. Now,
I should say some of the moms the New York Times talked to are actively managing the accounts, deleting creepy
men followers and their comments, but other moms are encouraging

(08:42):
these kinds of comments and even arranging for the men
to have more and more direct contact with their daughters
for money, like through paying for subscriptions to exclusive content
on Instagram, allowing the men to buy the daughters used clothing,
or even arranging private chats. So it's pretty clear what's going on. I don't think that you could be a parent
who is arranging this kind of thing and be totally

(09:04):
naive to what's going on. And Instagram is really not
doing enough compared to other platforms to stamp out this
kind of behavior. In fact, they're really kind of doing
nothing in my book. In twenty twenty two, Instagram launched
paid subscriptions, which allow followers to pay a monthly fee
for exclusive content and access. Now, accounts for anybody under
the age of eighteen are not supposed to be able

(09:26):
to do subscriptions, but mom-run accounts sidestep that restriction. The Times has found dozens that charged from ninety-nine cents to nineteen ninety-nine. At the highest price, parents offered Ask Me Anything chat sessions and behind-the-scenes photos.
Child safety experts warned that subscriptions and other features could
lead to unhealthy interactions, with men believing they have a

(09:48):
special connection to the girls and girls believing they must
meet these men's needs. So Instagram is basically like uniquely
establishing a pathway that allows for minor girls and adults
to be connected and get exclusive access to them. And
why in the world would there be any legitimate reason for a grown man to be paying for exclusive access

(10:09):
to a child? Like, on some level, I cannot imagine
a world where these moms truly are so naive to
what is going on, and some of the men who
follow these kinds of accounts are actual child abusers. Even
though Instagram is supposed to not allow anybody who is
on the sex offender registry to use the platform, The Times traced the accounts of some of the followers, and

(10:31):
one of them, who follows these kinds of accounts, was convicted of sexually assaulting a child and is listed on the New Jersey Sex Offender Registry. And Instagram did not remove this account until The Times asked them about it for this piece. On Telegram, an alternative social media platform, these men connect with each other and trade images for abusive purposes. Men in these groups frequently praise the advent

(10:52):
of Instagram as the golden age for child exploitation. I'm
so glad for these new moms pimping their daughters out,
wrote one of them. And there's an infinite supply of it.
Literally just refresh your Instagram explore page and there's fresh preteens.
Some of these men even go further and cultivate business
or patronage relationships with the moms by sending the moms

(11:12):
cash or gifts or other things, and also trying to
pressure or blackmail the moms into posting more and more
revealing outfits. One mom described it as her girls becoming
a kind of currency. Now there again, there are moms who,
it sounds like, are trying to keep men like this
off of their accounts, but those moms are basically blocked

(11:33):
by Instagram itself. Meta failed to act on multiple reports
made by parents and even restricted parents who tried to
police their own followers. According to interviews and materials provided
by these parents, if the parents block too many accounts
in one day, Meta curtails their ability to block or follow others, they said. I remember being told, like, I've reached my limit, said a mother of two dancers in

(11:55):
Arizona who declined to be named. Like, what? I reached my limit of pedophiles for today? Okay, great. And sometimes
even in the most egregious, explicit cases, like a man who propositioned a mom in a DM, offering her sixty-five thousand dollars for one hour with her minor daughter, Instagram still did nothing when she reported it. Now, in response

(12:15):
to these reports, Meta said that either those communications did
not violate Instagram's community guidelines or that staff didn't have
time to review them. In other cases, Meta told parents
that it relied on its quote technology to determine the
content was probably not a violation. So there is just
no way to spin that. This is Instagram allowing their

(12:36):
platform to be used for nefarious ways by creeps who
are endangering children and doing fuck all about it. So
I really want to talk about what's going on here.
One, Instagram isn't doing enough, and I think that they've really been counting on bad press around Twitter and TikTok to not have to answer for what they've allowed their platform to become. As one of the

(12:58):
men who follows these accounts put it quote a candy
store for abusive child content. And I also think there
is a lot going on with these moms, Like I
don't think anyone should have to deal with a creep
exploiting their kids, and no young person should have to
go through it either. These girls should be able to
decide how they want to show up online, but they're
so young they can't possibly understand and consent to this

(13:20):
kind of digital presence. Some of the moms the Times
spoke to said things like, Oh, she's been doing this
for so long now, her numbers are so big. What
are we supposed to do? Just stop it and walk away?
And yes, of course you can just stop this and
walk away. If you are a parent, it is your
job to keep your kids safe. And if you know
that creeps like this are out there and exploiting your kid,

(13:41):
you should be able to be the adult in the
room and say we are going to stop this and
walk away. The promise of insta fame and free stuff
from brands cannot be worth the well being and health
of your child. Let's take a quick break. And we're back. So

(14:17):
we have to sort of switch gears. And I have
to give a bit of a warning for this story
because it is deeply heartbreaking and deeply disturbing, and I
just want to say that right off the top. It
is a truly heartbreaking story out of Oklahoma, and it's
still developing, but I can tell you what we know
as of right now. So Nex Benedict, a sixteen-year-old

(14:41):
Oklahoma high school sophomore, was beaten by three girl classmates in the bathroom of Owasso High School on February seventh. The next day, February eighth, Nex, who identified with the Two-Spirit, transgender, and gender-nonconforming umbrella, was declared dead at a hospital. So there is a very detailed

(15:01):
piece in Teen Vogue by nonbinary journalist Lex McMenamin, who points out that, according to Freedom Oklahoma, it is not apparent what pronouns Benedict used. So in this conversation, I'm going to refrain from using any pronouns.
According to this piece, a lot of what we have
heard about what might have happened comes from an anonymous

(15:22):
source who says that they are a family friend, and
this person told a local news outlet that the cause
of Nex's death was quote complications from brain trauma, and
that three older girls were beating the victim and her daughter in the girls' bathroom at the school. This source said that Nex couldn't walk to

(15:44):
the nurse's station without assistance, and that the school did
not call an ambulance. This is a little bit murky
to me because I've seen conflicting sources and conflicting reports.
The school claims that they were unaware that the fight
happened until they were informed by a parent. But then
on February twentieth, the school kind of backtracked, acknowledging that
the students saw the school nurse after the altercation. So

(16:05):
it's not totally clear to me what actually happened. Like, the school said one thing, then another. I'm not totally
sure what's going on there. But here is what Nex texted after the attack: I got jumped at school, three on one, had to go to the ER. They had
been bullying me and my friends, and I got tired
of it, so I poured some water on them and

(16:27):
all three came after me. School did not report to
the police and is probably getting sued. Nex's guardian, Sue Benedict,
told The Independent that she was called to the school
to find Nex had been badly beaten, with bruises on the face. Sue Benedict says that her child told her that the attack involved Nex's head hitting the ground. Benedict says

(16:48):
that the school told her that Nex was going to be suspended for two weeks. Nex's guardian brought Nex to the hospital right after the fight, and Nex was discharged. The very next day, Nex ended up collapsing at home. Nex's guardian called an ambulance, which took Nex to the hospital, where Nex was then pronounced dead. It's very heartbreaking. I

(17:10):
have to be clear that we don't have all the information.
A lot of these reports are still coming together and
still being confirmed. If we get more information, we will
definitely update you. But this is what I was able
to confirm myself as of recording on Thursday evening. So.
According to a piece in The Independent, Nex began being bullied at the beginning of twenty twenty three, a few

(17:30):
months after Oklahoma Governor Kevin Stitt signed a bill that
required public school students to use bathrooms that matched the sex listed on their birth certificates. As of right now, we do not know Nex's cause of death. The police issued a subsequent statement on the twenty-fourth of February, saying that an autopsy indicated that Nex had not died

(17:51):
as a result of trauma, but didn't say what might have caused Nex's death. The police said, while the investigation continues into the altercation, preliminary information from the Medical Examiner's office is that a complete autopsy was performed and indicated that the decedent did not die as a result of trauma. But the family is also going to do
their own independent investigation into the death as well. So

(18:14):
that is what we know in terms of facts about what happened. But something else that we know is that Nex's death isn't happening in a vacuum, right? It's happening against the backdrop of an entire wave of increasingly violent anti-LGBTQ sentiment, and in Oklahoma specifically. Oklahoma leads the nation in terms of anti-LGBTQ laws. Joey, you and I

(18:36):
talked about this just a couple of weeks ago when
you were on the show. But in Oklahoma's school district,
Oklahoma's Republican Superintendent of Public Schools, Ryan Walters, recently appointed
the far right hate mongering influencer Chaia Rachik, who runs
the inflammatory libs of TikTok account. So Reichik was appointed
to be on the Oklahoma Education Department's Library Media Committee,

(19:00):
despite the fact that, as we said, she's not an
Oklahoma resident, doesn't have a child in the Oklahoma school system,
does not have a background in education, she's a realtor.
It's not clear why she would be someone who would
be a good person to elevate within Oklahoma public schools
other than the fact that she has these like very
public anti LGBTQ anti trans views that she makes very

(19:23):
clear from her TikTok account. Raichik's anti-LGBTQ posts have been linked to nearly three dozen threats made towards schools, libraries, hospitals,
and businesses across sixteen states. This is according to an
NBC News investigation. And so it's very clear what kind of statement Oklahoma Public

(19:44):
Schools is trying to send when they amplify somebody like
Libs of TikTok to a position like this within the school system. And I think it's very difficult to talk about what happened to Nex without also talking about this very, very hateful climate that's happening in Oklahoma.

Speaker 2 (20:06):
Yeah, definitely. I saw this story come out right after Nex died.

Speaker 4 (20:15):
Yeah, this is just so heartbreaking and.

Speaker 2 (20:19):
I cannot... Yeah, it's just so heartbreaking, and it's so clear that that's connected. There's definitely a connection there. Like, this is the endgame of a lot of this sort of rhetoric and the kind of things that Libs of TikTok and all that do. Like, what

(20:40):
they're trying to do, they're trying to prevent trans people
from existing, and a lot of times that ends in
people dying, you know, and that is horrifying. And yeah, this is truly just such a, like, tragic story.

Speaker 1 (20:58):
It really is. And I mean, I think that you're exactly right that you can't have all of these
different ways where queer folks and trans folks and gender
nonconforming folks are being given the message that the people
in power, the powers that be, don't want them to exist,

(21:18):
They want to criminalize them, they want to legislate them out of existence, and then act surprised when this kind
of thing happens.

Speaker 2 (21:26):
Yeah, and then I mean, this is also the other
side of that is that it empowers the other kids that killed Nex. Like, probably their motivation, or the

Speaker 4 (21:38):
Kind of the reasons they did.

Speaker 2 (21:39):
What they did was motivated also by this kind of
rhetoric and the fact that this rhetoric has become like normalized,
and like the messages they're hearing are probably telling them
that this was something that was okay and justifiable.

Speaker 1 (21:53):
That's such a good point. And I think, you know, kids are absorbing these messages. They're hearing these messages, you know. It just doesn't surprise me that kids in Nex's school saw Nex as someone it was okay to bully, someone that it was open season on, and that
the adults at the school what were they gonna say,

(22:16):
because they're doing it too, right? Like, I mean, I've seen a lot of people framing this story as like, kids bullying kids, teach your kids to be better. But we are talking about grown adults who are passing laws that harm kids, right? Like, Chaya Raichik from Libs of TikTok is in her thirties, right? She's not a child.
And if you're curious, like what is she doing on

(22:38):
Twitter right now? Well, right now she is misgendering Nex, even in death. She's on Twitter tonight misgendering a dead child. So no, it is not kids bullying kids when you've got adults who are setting a very clear agenda that criminalizes and demonizes LGBTQ youth, right? Like, it's just very clear to me that we're not talking

(23:00):
about like kids being awful to other kids. The adults
in the room who are supposed to be setting a
different example, are making it clear that this is going
to be tolerated, that this is okay, and participating in
it themselves. According to The Independent, one of Nex's teachers was actually featured in a secretly filmed video on Libs

(23:20):
of TikTok in April of twenty twenty two. This teacher
was filmed telling students, if your parents don't accept you
for who you are, fuck them. After a very big backlash,
this teacher had to resign, and according to Nex's guardian, Nex was really angry about this. And to put this in a larger context beyond just Libs of TikTok: last year,
a court ruled that Oklahoma can enforce its law banning

(23:43):
and criminalizing gender-affirming care for trans minors while a suit against it is heard. So that law would make providing gender-affirming care in Oklahoma a felony. And so yeah, it's impossible for me to talk about what happened to Nex without putting it in that larger context, where adults who really should know better are

(24:04):
creating the conditions for this kind of violence. And I
think we need to be really clear about the agenda
that they're setting, and not let them feign ignorance and
not let them be like, oh well, I don't want
anyone to get hurt. That is a very obvious example
of what is called stochastic terror, right, where you don't necessarily explicitly come out and say like, I want you

(24:29):
to harm trans people or I want you to do
harassing things to trans people, but you're not not saying that.
It's all very wink wink, nudge nudge. And I think
for too long we have let these people get away
with exactly that. Like, I don't like that Libs of
TikTok is able to be like they're accusing me of
killing this young person and all I do is, you know,

(24:51):
post videos from Nex's school that criminalize transness and queerness and anybody who would support that. It's like, well, that
is kind of what you're doing. You don't really get
to say that you're not doing it when that's what's
going on.

Speaker 2 (25:04):
Yeah, exactly, she stokes violence. The purpose of her account is to stoke violence. And like, you know, this is what people have been saying, like, this is gonna lead to kids dying, and yeah, we're seeing that happen. Like, you know, Nex should still be alive.

(25:26):
This is just such a like terrible story, and it's
so clear like who's at fault here and what sort
of narratives are at fault and yeah, it's just horrifying.

Speaker 1 (25:36):
Yeah, Matt Bernstein put it really well. Matt says,
you took away their books, you took away their health care,
you took away their ability to use the right bathroom.
You fired the teachers who kept them safe. You demonize
them on television and on Twitter, all in the name
of quote protecting children. And I guess I just feel
like when people talk about protecting children, it's like what

(25:57):
about children?

Speaker 2 (25:58):
Like Nex. I feel like so much of what's happening right now, it's literally just grown adults
trying to excuse the fact that they still want to
bully children. It is, like, not to do psychoanalysis or whatever, but it literally feels like these are just people that never got

(26:18):
over like whatever high school bully phase they went through,
and they're mad that they can't anymore. I don't know, it is just grown adults bullying children, and bullying children to the
point where they're putting them in danger. And that is
so like disgusting and terrifying about this whole thing.

Speaker 1 (26:42):
Yes. And it's like bullying children under the guise of protecting children. If you were protecting children, Nex would still be alive. Like, this is not protecting children. This is bullying children. This is living out some kind of a fucked-up high school fantasy against a target that you perceive as vulnerable and unable to protect themselves. And I think that you're exactly right. I think that's

(27:04):
exactly what it is. And yeah, Nex should still be alive.
I hate that this is a story that we're even
having to talk about, but it is. It's what's going on.
It's what's happening in Oklahoma. And I really feel for
the community there, for Nex's community, who is grieving and, even in grief, even in death, just

(27:26):
constantly getting signals that this is okay. Like, the superintendent of Nex's school didn't even put out a statement, didn't even say anything, when a young person died in one of your schools. It's just like, these people are ghouls. I don't even know what to say. At a certain point, if you are a

(27:47):
superintendent of a school and a child dies in the bathroom of a school which you are meant to be in charge of, and you don't even say anything, because you know that you've spent the last few years creating a toxic, violent environment for kids like the one who died, it is really bad. It is really bad. My heart breaks for Nex, my heart

(28:09):
breaks for Nex's family and community. And if we get more information about what's going on in this story, we'll definitely give you an update. Let's take a quick break. And we're back. But this also feeds into how

(28:37):
LGBTQ youth are treated online, which actually does bring me
to an update that we have about the Kids Online
Safety Act this week. So for folks who don't know,
the Kids Online Safety Act, otherwise known as KOSA, is
ostensibly about regulating how social media platforms operate for younger
users to protect them. If you're hearing scare quotes in

(28:58):
my voice, they're there. But it kind of sounds like a way to suppress and criminalize LGBTQ folks and content. That's not me saying this. I'm saying this because, like, explicitly, that is how people behind it have spoken about it. And it would further surveil and censor us all, which is why,
as much as we want a safer Internet for all

(29:19):
of us, children, adults, all of us, I would say
like in these parts, we have not been too thrilled
about this specific legislation. Joey, how would you describe your
thoughts on it?

Speaker 2 (29:31):
Yeah, I think, uh, not necessarily thrilled puts it pretty well, might be a little bit of an understatement. Uh, KOSA is a very concerning piece of legislation that I think does not do much to achieve the goal that it says it was created to do.

Speaker 4 (29:55):
Uh, which I.

Speaker 2 (29:56):
I'm guessing we're gonna get into a lot of what
it does actually do.

Speaker 4 (30:00):
Yeah, it's not great. Not excited about this one.

Speaker 1 (30:04):
Yeah, I'm not excited about it either. It's tough for me, because I've always generally taken the position on the podcast that I will typically shy away from advocating for or against specific legislation. However, on this one, I feel like I have to be clear about my reservations, although I don't want to make it

(30:25):
seem like it is widely being panned, because I went to an event not that long ago. Actually, the group that put on the event, you talked about them on your very comprehensive SMNTY episode. I went to it. It's like, youth designing better something.

Speaker 4 (30:42):
Oh, Design It For Us?

Speaker 1 (30:45):
Yes, that was it.

Speaker 4 (30:46):
Yes, okay, Design It For Us.

Speaker 2 (30:48):
Just for the record, this is a group that claims to be a coalition of young activists and organizations fighting for safer social media and online platforms for kids, teens, and young adults. Yeah, okay. It is supposed to be led by young people. It is about the internet. Why do they have less than a thousand Instagram followers? They have, currently, eight hundred and sixty-five Instagram followers.

Speaker 4 (31:08):
Like, I'm sorry, I know, like young people use social media.

Speaker 2 (31:14):
This is the target group that should be on Instagram following you. And also, you are claiming to understand how the internet works.

Speaker 1 (31:21):
Yeah, I don't know. Producer Mike and myself went to their kickoff event in DC. And I will say I agree with you. I mean, this is just my opinion, and I want to be clear that I don't think that

(31:42):
KOSA is good legislation. And there are lots of organizations, some of whom I trust and like and have worked with, some of whom I don't really know much about and am maybe a little skeptical about, who feel differently about it. But I have to say how I feel. Like, you're listening to this pod to get my take, and I have to be honest about that take, which is that I am suspect of the legislation.

(32:06):
And if you want more information, like a really good
deep dive, Joey actually did an amazing Stuff Mom Never Told You episode breaking down some of the concerns and
issues and origins of the Kids Online Safety Act and
the way that it could really change the Internet, not
just for young people but for everybody. So I definitely
recommend anybody interested check that out. So I feel like

(32:26):
I'm kind of here with a KOSA expert a little bit.

Speaker 4 (32:30):
Oh thank you, I try.

Speaker 2 (32:32):
I watched a lot of tiktoks for that episode, so
some might call me an expert.

Speaker 1 (32:37):
Yes. So here's what's going on. So this week, Senate
Majority Leader Chuck Schumer announced his support for KOSA. The
bill also secured fifteen new sponsors, including Schumer, for a
total of sixty two co sponsors in the Senate, which
would give the bill enough votes to pass in the Senate.
So we're actually a lot closer to this bill becoming reality. Earlier,

(32:59):
when this bill was introduced, y'all might remember that LGBTQ
rights groups had been loudly objecting to KOSA because of
provisions that could be interpreted, specifically at the state level
by attorneys general as defining what kind of content is
harmful to children. You know, after talking about what happened
to next, it doesn't take a fortune teller to assume
that it would probably include lawmakers and attorneys general being like, oh,

(33:23):
it is content pertaining to LGBTQ people that is harmful
and thus needs to be, you know, kept from youth
on the internet. We don't even really have to like speculate,
because Marsha Blackburn and the Heritage Foundation very clearly and
explicitly stated their intentions to use KOSA to censor LGBTQ content.

(33:44):
So this is not me, like, gleaning or reading tea leaves. This is me listening to the words that they said in very plain, explicit, clear English. So that's why LGBTQ groups were not down with KOSA initially. But this week The Hill reports that several LGBTQ advocacy organizations dropped their opposition to the Kids Online Safety Act after the sponsors updated the text, inching the bill even closer to passing.

(34:08):
So the groups include GLAAD, GLSEN, the Human Rights Campaign, PFLAG, the National Center for Lesbian Rights, the National
Center for Transgender Equality, and the Trevor Project. All of
these groups got together and sent a letter to Blumenthal,
one of the sponsors of this bill, withdrawing their opposition
following updates to the legislation. The letter reads: When KOSA

(34:28):
was first introduced, we had serious concerns that the bill
could open the door to bad actors weaponizing the law
to suppress affirming content for LGBTQ young people. Some early supporters of KOSA even touted this as how they intended to use the law. The considerable changes that you have proposed to KOSA in the draft released on February fifteenth significantly mitigate the risk of it being misused to suppress LGBTQ

(34:50):
plus resources or stifle young people's access to online communities
As such, if this draft of the bill moves forward, our organizations will not oppose its passage. So yeah, I guess some of these LGBTQ plus advocacy organizations are like, listen, if this moves on and becomes the law of the land, we will no longer object.

Speaker 4 (35:11):
Yeah, I don't know.

Speaker 2 (35:15):
I'm still... I feel like if you are creating a law, and the intention of the law when you create it is to block kids' and teenagers' access to anything to do with, like, queerness or any gender nonconformity or anything, in my opinion, it's kind

(35:37):
of hard to redeem that and to find a way
to be like, no, actually we fixed it so it's
not homophobic and transphobic anymore, because like that's the point
of the bill. You outright said that this is why you're making this bill. And, I don't know, what's the point? But yeah, I guess at least it won't

(35:57):
be suppressing queer content anymore, or it'll just be suppressing
every other issue.

Speaker 3 (36:03):
That we have, you know, going on in the world
right now.

Speaker 1 (36:07):
Yeah, yeah, exactly. So one of the LGBTQ advocacy organizations' big beefs with the previous draft of the bill was the bill's Duty of Care standard, which was meant to mitigate the promotion of harmful content and the use of harmful or addictive features for teens and kids online, but again could be interpreted to deem anything involving, you know,

(36:30):
LGBTQ content as harmful and thus kept away from kids. Right, So,
like it just had a pretty broad interpretation of what
harmful content was and kind of left it up to
states to decide what is or is not harmful content.
So in this updated version of the bill, the duty
of care is clarified to focus on the design, features
and components of the platform rather than on the content

(36:51):
hosted on those platforms. There's also a new bit added
that says that the provisions in the bill cannot be
the basis of any action brought by a state attorney
general under a state law. So I can see how
they're trying to tinker with it so that it's not
just, like, left up to, you know, specific attorneys general
at the state level to decide how to interpret this law.

(37:14):
So this is tricky, right, like, I do agree that
some of these changes make the bill less obviously horrible.
I will be the first person to acknowledge that. However,
I am still, like you, Joey, just kind of not
buying it. And this is my opinion too, like, take
that for what it's worth. I think that KOSA is

(37:36):
still a bill that will lead to all of us
having to give over more information and more data in
the name of keeping kids safe. Namely, I'm talking about
age restrictions. Local governments already sell our private information and
data today to data brokers, right? Like, we know this. So am I to believe that requiring children, and also
everyone else to give over more data to tech companies

(37:58):
and more of, like, our government IDs, is going to keep us safe and keep our data more private? I'm sorry, I simply do not believe this. It's not a circle that I can square. I simply do not believe it. I
am completely against age restrictions in this way because I
don't think it makes sense to hand over more of

(38:19):
our data to tech companies in order to achieve better
data privacy? It just, like, does not make sense to me, does not compute. If somebody smarter than me out there disagrees and wants to explain to me how this is actually keeping anybody safer, I would love to hear it. But from my understanding, I just cannot see how it
actually will.

Speaker 2 (38:37):
Yeah, and I think especially, like, the US... we do not have data privacy laws, and we are so behind on that in comparison to, like, Europe or whatever. So it's especially kind of almost insulting to have this bill be introduced and be framed as this thing that's

(38:58):
about safety. Like, it's just making the actual safety concerns more dangerous and more likely to happen to you.

Speaker 3 (39:06):
I don't know.

Speaker 5 (39:07):
Again, I feel like, and I said this in the episode, like, imagine having to put your full name and, like, your ID and all of that in when you just want to use the internet.

Speaker 4 (39:18):
Like, I don't know, that just seems to be not
a good thing in my opinion.

Speaker 2 (39:23):
But uh yeah, again, I guess that's not universal that
we all want data privacy.

Speaker 4 (39:30):
I guess I don't know.

Speaker 1 (39:31):
Well, you know what, Joey, you and I are not the only people who feel this way. In a piece called Don't Fall for the Latest Changes to the Dangerous Kids Online Safety Act on the Electronic Frontier Foundation's blog, which we'll link to in the show notes, they argue
exactly that, right? They say that this updated version of the bill is still a vehicle for online censorship and increased surveillance, which we already know deeply impacts marginalized communities.

(39:55):
So they write, in light of the overall politicization of
youth education and online activity, we believe the following groups,
just to name a few, will be endangered. LGBTQ plus
youth will be at risk of having content, educational material,
and their own online identities erased. Young people searching for
sexual health and reproductive rights information will find their search
results stymied. Teens and children in historically oppressed and marginalized

(40:18):
groups will be unable to locate information about their history
and shared experiences. Activist youth on either side of the aisle,
such as those fighting for changes to climate laws, gun laws,
or religious rights, will be siloed and unable to advocate
and connect on platforms. Young people seeking mental health help
and information will be blocked from finding it because even
discussions of suicide, depression, anxiety, and eating disorders will be

(40:39):
hidden from them. Teens hoping to combat the problem of addiction,
either their own or that of friends and families and neighbors,
will not have the resources they need to do so.
Any young person seeking truthful news or information that can
be considered depressing will find it harder to educate themselves
and engage in current events and honest discussion. Adults in
any of these groups who are unwilling to share identities

(41:00):
will find themselves shunted onto a second class Internet, alongside
the young people who have been denied access to this information.
They go on to say we shouldn't kid ourselves that the latest version of KOSA will stop state officials from targeting vulnerable communities. And KOSA leaves all of the bill's censorial powers with the FTC, a five-person commission nominated

(41:20):
by the President. This allows a small group of federal
officials appointed by the President to decide what content is
dangerous for young people. Placing this enforcement power with the
FTC is still a First Amendment problem. No government official,
state or federal, has the power to dictate by law
what people can read online, and that blog post really

(41:41):
confirms my feelings on this legislation, even the updated version.
Like, I understand that these groups, groups that I respect and that I think do good work, are saying that they largely no longer take issue with it in the way they did before. But I still think it's pretty scary, and I am doing a bit of futurecasting, imagining another Trump presidency where President Trump has

(42:05):
the ability to nominate this five person FTC Commission and
essentially can decide what people can read and post about online.

Speaker 2 (42:14):
It is pretty scary, it is, and I think even
beyond just thinking about the future, like we're in the
middle of a, like, crackdown on information and content
that's coming out about Palestine, about what's happening in Palestine,
and a crackdown on protesters. Like in New York where

(42:35):
I live, there have been multiple incidents of protesters being attacked by counterprotesters, and, you know, the police kind of clearly taking a side and letting certain actions slide versus others, or instances where
people were being arrested just for using slogans or for

(42:56):
putting up stickers. Like, there's an active crackdown on free speech that is happening under this administration. And
I think this is why I'm disappointed but not surprised in a lot of these, like, LGBTQ organizations. And this is a whole other conversation about kind of mainstream LGBTQ plus, like, nonprofits and all of that. But
(43:18):
mainstream LGBTQ plus like nonprofits and all of that. But
I think, like, what's happening is kind of a refusal to show solidarity with other marginalized groups, because it's almost like saying, oh, okay, we have a little bit of reassurance that this isn't gonna affect us.

Speaker 3 (43:34):
But like we're gonna let it happen to other people.

Speaker 2 (43:37):
And, you know, all these groups intersect too. You know, lots of queer

Speaker 4 (43:40):
People deal with mental health issues.

Speaker 2 (43:42):
A lot of queer people also deal with, you know, other oppression, because they're part of other marginalized groups as well. Yeah, something I saw talking about this latest, uh, update brought up something that I hadn't really thought about this way. The speaker was talking about how

(44:05):
KOSA comes from this place where the assumption is that social media and the internet are bad for children, full stop, like there's no benefits from it. And
they were like encouraging people to share stories about how
social media or the internet has helped them because like
it's helped them get out of you know, like a
difficult situation or find community and all that. And I think, like,

(44:27):
and we've talked about in the show, like I personally
have like found a lot of community through the Internet.
I've also experienced a lot of the bad sides of
the Internet, but I've seen the good too. It has
also helped me deal with, you know, mental health issues like depression and all that in the past.
That being said, I also, you know can see how

(44:47):
it's made certain issues worse. Particularly, like, I don't know, I was on Tumblr when there was a lot of glorification of, like, eating disorders. But ultimately, it's like we've created this new space for people to be, and it's hard to, you know, just sort

(45:09):
of prevent people from talking about things full stop because they can lead to harmful

Speaker 4 (45:16):
Things.

Speaker 2 (45:16):
Happening, or complicated discussions or whatever. That's not effective. And again, especially when you list out all of these things that are still going to be affected by this legislation, it's so clear to see. Like, yeah,
like you know, the Heritage Foundation might not be super
happy that it's going to be a little bit harder
for them to repress queer content, but they also have

(45:40):
stakes in trying to repress content about reproductive
health and sexual health and about you know, certain activist
groups or our struggles or whatever. Like there's still kind
of a political motivation there that I think has really
scary implications for all of us, and really scary implications

(46:00):
for just like freedom of speech in general.

Speaker 1 (46:03):
Yeah, I'm glad that you phrased it that way, And
I want to be clear, I don't think that any
president should have the power to like pick the five
person group who gets to decide like what issues are
okay and what issues aren't. But I think that there
might... I guess I say that to say, I think there might be people listening, I am not one of them, who are like, well, you know, Biden would be fine.

(46:27):
If Biden had that power, it would be okay. And
the answer is no, it wouldn't be okay because, as
you said, there is currently a crackdown on all kinds
of speech and all kinds of political and social expressions
with Biden in the White House. So no, that's not
the case. But also like no president should have that power,
full stop. And I think you're exactly right that against
the backdrop of I don't know, I can only really

(46:49):
call it, like, a very chilling era of a crackdown on free expression happening before our very eyes, but in this, like, almost gaslighting way where it's like, no, that's not what's happening, you know what I mean? It's like a very weird time.

Speaker 2 (47:09):
I'm sorry, Bridget, are you saying that with Biden as president?

Speaker 4 (47:12):
We can't just go back to brunch.

Speaker 1 (47:14):
We can just go back to... listen, if Biden was in the White House, I wouldn't be in the street. I would be at brunch, Joey.

Speaker 4 (47:21):
Of course. Everything's fine, everything's great.

Speaker 1 (47:28):
More after a quick break. Let's get right back into it. Okay, so,
speaking of everything being on fire and awful, I regret
to inform you that new AI generated deep fakes have

(47:53):
just dropped. I almost didn't include this because lord, am
I tired of talking about this? Like, I hope the
day comes that I can shut up about this and
it doesn't have to be something I'm talking about all
the time. But unfortunately that day is not today. Podcaster Bobbi Althoff, do you know Bobbi Althoff? People call her, like, a nepo baby or an industry plant. Her thing is

(48:15):
kind of like doing cringey kind of awkward interviews with
like rappers. Do you know Bobbi?

Speaker 2 (48:22):
I don't, but I do get a lot of those
like tiktoks where it's like a random clip from a
podcast that seems like it would be this kind of thing.

Speaker 1 (48:30):
Okay, so you've probably seen Bobbi's podcast. So Bobbi is just the latest target of viral AI non-consensual deepfake videos, specifically on Twitter. So, as we've talked about on the show, this has happened a lot before. Unfortunately, it happened to Taylor Swift, it happened to Marvel star Xochitl Gomez. Xochitl Gomez is seventeen, so a minor. And

(48:50):
with all of these viral deepfakes, all of them have really gotten a lot of traction, specifically on Twitter. And now a new report in The Washington Post is shedding light on why Twitter has kind of become the platform for deepfakes going viral. Now, I have to say, none of this is terribly surprising to me. If you listen to this podcast, none of this is probably terribly surprising to you. It is the kind of thing that, like, yes.

Speaker 4 (49:15):
I wonder what the problem is.

Speaker 1 (49:17):
I mean, I'm sure you could guess. Honestly, I truly
was going to add in a clip from an earlier
podcast episode that we did when Elon Musk first took
over Twitter and started making changes, because literally I was like, oh, well,
this is going to happen, This is going to happen,
This is going to happen. I thought that would sound
very obnoxious and probably like too smug, because nobody likes

(49:38):
somebody who says I told you so. But listeners just
know I could have added that if I wanted to,
because it exists. You didn't have to be a fortune teller or a mind reader to foretell the hell that the changes Elon Musk made were going to lead to on Twitter.

Speaker 4 (49:54):
Oh yeah, I think you get to say I told you so.

Speaker 1 (49:57):
You know what, I'm going to say it: Hey, Elon Musk, I told you so. So, just like those Taylor Swift images, the images of Bobbi Althoff started on message boards, so not on Twitter. When they first started on message boards, they had a relatively small audience. The Post reports the videos got one hundred and seventy-eight thousand views over the last six months on message boards like 4chan.

(50:20):
But then someone brought those videos to Twitter, where they blew up this week. It was reposted so many times that Althoff's name was trending on the platform. In just nine hours, the clip received more than four point five million views, twenty-five times more than the porn sites' viewership, according to data from an industry analyst. So one of

(50:40):
the most popular posts on Twitter directing viewers to the
video remained online for more than thirty hours. Another post,
which promised to send the full Bobbi Althoff leaks to everyone who liked and commented, was online for twenty hours. Twitter only removed it after The Washington Post reached out to them for a comment on the fake videos. By the time the video was removed, it had been viewed more than

(51:02):
five million times. So this is obviously a problem, and
Twitter is a unique platform when it comes to how
and why deepfakes spread so easily. So even though Twitter was one of the first platforms to outright ban synthetic AI-generated content and deepfakes, this was long before Musk took over. Back in twenty twenty, Twitter executives

(51:22):
said that they recognized the threat of misleading synthetic media
and they were committed to doing this right. But when
Elon Musk took over, changes that he made early on
to Twitter have basically gone on to make Twitter not
just a place where people post deep fakes, but also
a place where people can be ensured that their deep
fakes will get lots of traction and lots of eyeballs. Like, seriously,

(51:44):
go back and listen to some of our earlier episodes.
You didn't have to be a genius or a rocket
scientist to figure this out. So here's how the Post
reports it: Under owner Elon Musk, X has now become one of the most powerful and prominent distribution channels for nonconsensual deepfake porn that not only helps the phony photos and videos go viral in a low-moderation environment,

(52:05):
but it can also end up rewarding deep fake spreaders
who use manipulated porn to make a buck. So the
way that Elon is running Twitter has basically turned the
platform into a deep fake marketplace and also advertising machine
where bad actors can make money by exploiting women through
these gross fake photos. If there is one thing that

(52:26):
I hope that people take away from this bit of
the show, it is that it is not just creeps posting
gross fake images of women. It is creeps who are
making money posting fake images of women, and Twitter is
enabling them to do so more effectively by giving them
money for doing it and by essentially allowing them to
market and advertise their creepy, gross fake images of women. So,

(52:49):
to be clear, in my book, both Twitter and Elon
Musk personally are complicit in a criminal money making enterprise.
Like that is how I see this. I don't see
it as just disparate creeps posting deep fake images. To me,
this is a criminal money making enterprise that Elon Musk
is actively engaged in. So here's how the whole thing works.

(53:12):
So y'all know that Twitter Blue subscribers that get a blue check are paying, in part, to have their content get more visibility on Twitter. And because Twitter offers payouts for posts that get lots of engagement and lots of views on the platform, many of the people sharing those deepfake videos of Bobbi have a blue check mark, which also means that they could be making money from that video. That is, if Elon Musk actually pays as

(53:35):
he says he is. We've talked about that quite a bit.
And you even have people who are trying to boost the videos' engagement by tweeting that it's the quote real leaked Bobbi footage, as if to tell people, like, oh yeah, there's fake Bobbi videos floating around, but I have the real Bobbi video, you're gonna want to click here to see this real content. Or saying, like, we'll DM

(53:56):
the longer, real video to anybody who shares this, because
the more people who share and engage, the more money
they stand to make from Elon Musk's you know, payment program.
And so these people are really exploiting Twitter's lax rules, moderation,
and pay to play engagement mechanisms to essentially create trailers

(54:16):
to put on Twitter to get people to buy longer
deep fake videos, which is why they're not keeping them
on 4chan, right? So, like a lot of these
videos and a lot of this content, it starts on
4chan, and then it gets like, you know, kind
of a niche audience. But the reason why they're putting
them on Twitter is because they want more people to
see it, because they have a marketplace. They then will

(54:39):
like ask people to give them money in order to
get more deep fake images. So essentially it's like a
coming attractions reel for AI generated exploitative deep fakes of women.
One person was offering to sell a longer fake video
of Bobbi for ten dollars payable via PayPal, according to
The Washington Post. So this is like a business enterprise,

(55:00):
complete with advertising platform support and payment processors like PayPal.
So we already have platforms like 4chan, which are
you know, message boards known for this kind of harmful
content that are kind of fringe. But Twitter used to
be very different from 4chan before Musk took
over it. It's where journalists and elected officials were posting content.

(55:20):
So the fact that we're seeing Twitter kind of become
more like 4chan under Elon Musk is really telling
to me, right? Like, it really seems like, if this
is going to be the kind of stuff that is
tolerated on Twitter, it really says a lot about how
far Twitter has fallen under Elon Musk's leadership. Genevieve Oh, an
analyst who studies deep fakes, says that Twitter is 4chan

(55:43):
2.0. It's emboldening future malicious figures to coordinate towards
demeaning more popular women with synthetic footage and imagery. So
I should say Twitter technically does ban non consensual nudity,
but there is basically no enforcement of that because the
team that handles it Musk basically like fired and laid

(56:03):
off and gutted as soon as he took over.
And I also just think that like Elon Musk just
doesn't care, Like he could not signal that more. If
he says he cares, I think that he is lying
because he knows that people are upset. But I just
genuinely don't think he sees this as the big issue
and the big threat that it is.

Speaker 4 (56:22):
Elon Musk doesn't care about women? What? No. I mean,
I'm surprised.

Speaker 1 (56:30):
He seems like the kind of guy who would have
a stellar track record when it comes to respecting women.

Speaker 2 (56:34):
Absolutely, that is definitely what I associate him with.

Speaker 1 (56:40):
So after the Taylor Swift deep fakes went viral, y'all
might recall that Musk was like, I'm going to open
a whole team in Austin dedicated to content moderation for Twitter.
But the day before these Bobbi Althoff videos went viral,
Musk was just laughing off the need for any kind
of real content moderation. He shared a post that called

(57:00):
content moderation a quote digital chastity belt and a steaming
pile of horse manure enforced only by digital tyrants, saying let's
give a big middle finger to content moderation and embrace
the chaos of the Internet. You know who probably does
not want to embrace this chaos? Women like Taylor Swift
and Selena Gomez and Bobbi Althoff and any other woman

(57:22):
or girl who has been targeted and depicted in these
non consensual images. It is just such a clown show
over there. Like the fact that Bobbi Althoff's name was
trending on Twitter in relation to this was wild to me,
Like the trends used to be moderated by humans who
could like manually delete trends that were harmful or rule breaking.

(57:42):
But of course Musk fired the people who do that.
So now a woman's name can be trending for hours
on the platform, and the reason why it is trending
is because of deep fake images of her on the platform.
It's basically free advertising for this rule breaking behavior. And
you know, something else unique about Twitter is that it's
one of the only major platforms that allows nudity. So with the rise

(58:05):
of AI deep fakes, I don't know. I just think that, like,
I don't know that Twitter is going to be able
to really be responsible for sorting out what is consensual
real nudity and what isn't. I just don't think they're
really up for the task.

Speaker 2 (58:19):
Yeah, and it's a conscious choice that they're not
up for the task, which I think is important to remember. Because again,
like, this is a problem that could be fixed
if they, you know... or at least sort of alleviated
to some extent, if they really wanted to,
and it's clear that they do not care.

Speaker 1 (58:38):
I completely agree. And this is just my opinion, and
I would be curious to know what folks think. I
have a sense that whoever is making these deep fakes
and then releasing them on Twitter specifically is choosing like
a specific kind of public figure. I think they're choosing
famous women who are perceived to be like annoying in

(59:02):
some way, and I think it's about
kind of like almost like a test of like, well,
you know, this woman is annoying or like she's everywhere,
she's like oversaturated. So if I make a cruel, sexually
humiliating deep fake of her, maybe people will, like, like it.

(59:22):
Like I almost feel like there's something about
the women that they're targeting that is like, I don't know,
maybe I'm way off base here. I think they're picking
a specific kind of female public figure because they're not
expecting the public to loudly decry it when it's someone
who is perceived as like quote overexposed. So I think

(59:44):
I'm curious to see how this goes. I think
that nobody deserves this. I think that even if it's
somebody that you think, like, oh, this
person is a billionaire, this person's very rich, I
see them everywhere, blah blah blah, nobody deserves this, right?
Like it's completely fine to not like somebody without making
sexually humiliating content of them that is meant to take

(01:00:08):
them down, humiliate them, objectify them, and remind them and
the public that they do not have agency over their
body and their choices. And so I think it's disgusting.
I think it's even more disgusting that Elon Musk is
essentially laughing this off as his platform becomes a unique
vector for advertising this kind of material. And I

(01:00:29):
think that Elon Musk at some point needs to be
held accountable for this, Like he can read this Washington
Post article as well as I can and see how
his choices have specifically turned the platform into a marketplace
for this kind of behavior. And yeah, I guess I'll
just leave it there.

Speaker 2 (01:00:45):
Yeah, I want to add one more thing, which I
think is especially relevant with, like, you know, talking about
KOSA earlier in the episode. The problem isn't nudity. The
problem isn't explicit material. The problem is the fact that
this is taken without consent. This is material that is
like literally manufactured without any consent, without any action on

(01:01:09):
behalf of the person.

Speaker 4 (01:01:11):
It's like for it.

Speaker 2 (01:01:12):
Like like I don't know, I mean this is again
like this is another discussion, but like I'm of the
belief that, like, whatever, porn exists on the internet.
Nudity exists. People should, you know, be able to
access that.

Speaker 4 (01:01:27):
I think that is like a fair thing to say,
But like.

Speaker 2 (01:01:31):
The part of this that is disgusting and messed up
is the fact that it is non consensual and the
fact that these are again oftentimes the same people that
are you know, supporting or like participating in efforts to
crack down on sex workers. And you know, if you
really just wanted to have like more porn in the world,

(01:01:53):
you would think you would just support sex workers and
people that do that stuff professionally.

Speaker 3 (01:01:59):
But yeah, that's not like the problem here, is it?

Speaker 2 (01:02:03):
And I think this is important to clarify again too,
with especially talking about the KOSA story, because I
can easily see how this sort
of story would be twisted to be like, see, we
just need to get rid of it all. No, the problem
is, like, this is an issue with deep fakes. This
is an issue with a lack of respect for boundaries
and consent. And you know, this is about sexual violence.

(01:02:27):
This is not a condemnation of the fact that you know,
people are being portrayed a certain way or that nudity
is like making its way onto these platforms, you know, yeah.

Speaker 1 (01:02:39):
Absolutely. And we actually talked about this last week, how,
like, creeps are also using AI to, like, make reverse
deep fakes that are also not okay, where they take women
who consensually want to be in various stages of undress
on the Internet, and they use AI to cover them up.
It's not about sexuality, and it's not about you know, nudity.

(01:03:00):
It is about taking away someone's agency without their consent.
It is about telling them that their choices for their own
body don't matter because we have the technology to strip
them of that choice. And like you said, Joey, if
you want to see nudity on the internet, God love them.
There are no less than a million people out there
who are in various stages of undress on the Internet.
But it's not really about that, right? It is about

(01:03:21):
I want this person who has not given me consent.
I want them to know that I can depict them
nude on the Internet against their will, without their consent.
It is about power. It is about humiliation. It is
about stripping women of their agency, not about, like, expressions
of sexuality or nudity, like full stop. So I do

(01:03:42):
have a little bit of good Twitter news for folks.
So back when Elon Musk first bought Twitter and laid
off a bunch of staffers, he was particularly cruel to
the staff in Ghana, Africa. The Twitter office in Ghana
had just been set up, so many of these staffers
had moved from other countries to take jobs at Twitter,
only to be let go with no notice a few
months in after Elon took over. They were promised a

(01:04:04):
month of wages under their contract, and under Ghanaian employment law,
staff have to be paid out if they are laid
off for redundancy, which these staffers were. But as y'all know,
Elon Musk, aka Sharees-she-don't-pay: he doesn't pay,
he doesn't pay his bills, he owes money all
over town, he doesn't pay. So these staffers threatened to

(01:04:24):
sue to get their money. Twitter missed deadline after deadline
and essentially ghosted them, probably thinking that these you know,
Ghanaian Twitter staffers would just shut up and go away.
But they didn't shut up and go away, so after
a year they finally have gotten their payout. One of
the staffers says, it is difficult when the world's richest

(01:04:45):
man is owing you money. And I have to
say, like, shout out to the staffers for staying on it.
I'm happy they got their money. But I also have
to say, like, watching Elon Musk do rich guy shit,
like oh I'm going to Mars, oh I'm gonna do this,
I'm gonna do that, while he is out here owing
you money? That would chap my ass. But I guess
it's like easy to do a bunch of rich guy

(01:05:07):
shit when you don't pay your bills. When you can
rack up a bunch of bills and not pay.

Speaker 2 (01:05:10):
Yeah, how, uh, do you expect him to get
so rich if he has to pay for things like
bills and labor? And I don't know, that's just too
much for one guy.

Speaker 1 (01:05:25):
I mean, I can, like, go to a restaurant and,
like, pop bottles and, you know, get a private jet
if I don't pay for any of it. Anybody can. If that's
the secret to living a wealthy lifestyle, like, anybody
can do that. Just go rack up a bunch of
bills and just not pay.

Speaker 4 (01:05:41):
Just something.

Speaker 1 (01:05:43):
And I have to give a major shout out to
CNN's Larry Madowo for really staying on this story.
He first reported it back when Elon Musk took over
Twitter and stayed on this beat. So thank you for
helping to get this accountability, Larry. And I have one
other quick piece of good news, kind of for me specifically,
which is that I feel like, you know, I've been
trying to use alternatives to Twitter, like Blue Sky and

(01:06:07):
Threads more, and I feel like I have gotten my
first truly good bit of Blue Sky low stakes drama.
That Joey, if you will, re opened this episode with
me explaining some drama to you. I would love to
close with explaining a little bit of low stakes drama
to you, if that's okay.

Speaker 4 (01:06:24):
Of course, always here for the drama.

Speaker 1 (01:06:27):
So someone on Blue Sky, they unfortunately had a
death in their family, and they posted a picture of
a woman who was wearing like a big hat and
sunglasses who came to the funeral. And this person was like,
this person crashed my family member's funeral and they ate
a bunch of free food. And it wasn't until my
family went up to them and was like, hey, who

(01:06:48):
do you know here? Like how did you know the
deceased, that we realized they were a funeral crasher. I'm
so angry, and rather than people being like, yeah, that's
really messed up for someone to do this, there were
a lot of people who were like, well, what if they
just wanted a meal? What if they were just lonely?
You're so rude, not letting this person crash the funeral.

Speaker 6 (01:07:08):
Oh no, wait, no. This is just... I, uh, it's
the Twitter thing where if you're not... oh my god.

Speaker 4 (01:07:18):
No.

Speaker 1 (01:07:19):
So this is how I feel like I know that
Blue Sky might actually be popping: because we've gotten our
first, like, truly deranged group response to somebody's, like, very
legitimate complaint, like what kind of asshole doesn't want a
stranger at a family funeral? Like, so, I think, I don't know.
It lifted my spirits to see this.
I was like, oh, yeah, nature is healing. We're having

(01:07:41):
this kind of discourse again, and so yeah, maybe Blue
Sky really will be popping now that it is a
place to have truly deranged discourse in one place.

Speaker 4 (01:07:52):
You know what, it is a.

Speaker 2 (01:07:53):
Pillar of our society, and I'm glad that's continuing.
I'm glad I'm getting to hear everyone's unsolicited takes.

Speaker 1 (01:08:04):
Oh my god, yes, give me your bad takes. I
want them to wash over me. Joey, one person whose
takes I always think are good? Yours. Thank you
so much for being here and helping us break down
these stories.

Speaker 2 (01:08:16):
Of course, Bridget, happy to be here as always.

Speaker 1 (01:08:20):
Where can folks keep up with what you're up to?

Speaker 2 (01:08:23):
You can find me on social media, and you know,
on Instagram and Twitter for the moment. I did not
yet make a Blue Sky, but I should be doing
that soon. But yeah. You can find me at Patt Not Pratt.
That's P-A-T-T-N-O-T-P-R-A-T-T.
Uh, you can also check out Afterlives

(01:08:44):
the Layleen Polanco Story, a series that I
just worked on that wrapped up about a month ago. Uh,
check out that SMNTY episode that's gonna be in the
notes. But yeah.

Speaker 4 (01:08:57):
Yeah, I'm everywhere.

Speaker 1 (01:08:59):
So check out Joey's work, and thanks to all of
you for listening. I will see you on the Internet.
If you're looking for ways to support the show, check
out our merch store at tangoti dot com slash store.
Got a story about an interesting thing in tech, or
just want to say hi? You can reach us at
hello at tangoti dot com. You can also find transcripts

(01:09:20):
for today's episode at tangoti dot com. There Are
No Girls on the Internet was created by me, Bridget Todd.
It's a production of iHeartRadio and Unbossed Creative, edited by
Joey Patt. Jonathan Strickland is our executive producer. Tari Harrison
is our producer and sound engineer. Michael Amato is our
contributing producer. I'm your host, Bridget Todd. If you want
to help us grow, rate and review us on Apple Podcasts.

(01:09:41):
For more podcasts from iHeartRadio, check out the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.