
January 26, 2024 68 mins

Kelly Osbourne addresses infamous 2015 remarks about Latinos on 'The View': https://www.nbcnews.com/news/latino/kelly-osbourne-addresses-infamous-2015-remarks-latinos-view-rcna134755

Researchers Say the Deepfake Biden Robocall Was Likely Made With Tools From AI Startup ElevenLabs: https://www.wired.com/story/biden-robocall-deepfake-elevenlabs/

AI-Generated Taylor Swift Porn Went Viral on Twitter. Here's How It Got There: https://www.404media.co/ai-generated-taylor-swift-porn-twitter/

Oklahoma schools gig for Libs of TikTok founder: Does it meet state's own rules? https://www.usatoday.com/story/news/investigations/2024/01/26/libs-of-tiktok-chaya-raichik-oklahoma-school-library/72361037007/

Conversion therapy content is being banned by social media companies thanks to the work of LGBTQ+ group: https://www.advocate.com/business/conversion-therapy-ban-social-media

Man Jailed, Raped, and Beaten After False Facial Recognition Match, $10M Lawsuit Alleges: https://www.vice.com/en/article/3akekk/man-jailed-raped-and-beaten-after-false-facial-recognition-match-dollar10m-lawsuit-alleges

Parents worry AI-generated influencers are promoting unrealistic beauty standards to kids: https://www.nbcnews.com/tech/internet/parents-worry-ai-influencers-promote-unrealistic-beauty-standards-rcna134814 

AI is producing ‘fake’ Indigenous art trained on real artists’ work without permission: https://www.crikey.com.au/2024/01/19/artificial-intelligence-fake-indigenous-art-stock-images/ 

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet, as a production
of iHeartRadio and Unbossed Creative. I'm Bridget Todd and this
is There Are No Girls on the Internet.

Speaker 2 (00:17):
Joey, welcome back to the show. It's so nice to
see you.

Speaker 3 (00:20):
Okay, Bridget, it's nice to see you too.

Speaker 1 (00:23):
So before folks ask if my voice sounds a little different,
it's because I'm a bit under the weather. I've been
saving my voice all day. This is the first time
that I've really spoken out loud all day.

Speaker 2 (00:34):
So hopefully it'll agree with us for me to record this.

Speaker 1 (00:37):
But if you're thinking, oh, Bridget sounds raspy or husky,
that's what's going on.

Speaker 2 (00:42):
So bear with me. Joey.

Speaker 1 (00:44):
I want to start with a question. I know it
has been a weird week. January is also that month,
at least for me, that it feels like nothing is right.
Everybody is broke, tired, cold, sober. If you're doing a
dry January. All of that said, is there something on
the internet that is giving you joy in this moment?

Speaker 4 (01:04):
Oh, that's a good question. There's definitely a lot that
is stressing me out right now. But oh my god,
there was this TikTok trend recently that was I can't
remember who it was, but it was like this clip
from the View and it was the person like, you
know what, it was the clip of somebody going Donald Trump,

(01:25):
if you kick out.

Speaker 1 (01:26):
All the like, my gosh, whatever, it's Kelly Osbourne. I
know exactly what you're talking about. Oh my god, I'm
still glad that you brought this up. It's Kelly Osbourne
going like, if you kicked, if you kicked them all
out of this country, who is going.

Speaker 2 (01:41):
To scrub your toilets, Donald Trump.

Speaker 4 (01:44):
So it was like people doing that meme, but like
different formats, but you know, there were just there were
a lot of fun ones going around.

Speaker 3 (01:52):
Love a good trend.

Speaker 1 (01:55):
That is one of those pop culture moments that lives
in my mind, rent free. I also love the reaction
of all the different people at the View because Kelly
Osbourne says it, and she says that very you can
tell that she was like trying to make a point
that like didn't land and immediately everybody's like, oh no,
and she's like no, but in the sense that and
they're just like, uh no.

Speaker 2 (02:15):
And I just read this interview with her.

Speaker 1 (02:18):
About that moment because it's been taking off on TikTok
and honestly, like, we'll throw the link to the interview
in the show notes if I can find it. But
she's like, you know, sometimes you have to learn in public.
And I know the point that I was trying to
make and maybe it didn't land, but you know, you
have to grow sometimes. And I appreciated how she handled it.
And I did see a TikTok where somebody made the

(02:41):
point that she was trying to make more succinctly. They
were like, well, if you kick out all of the
immigrants in this country, that is going to be impactful
to all of us, because immigrants give so much
to our country, and actually, like we should be respectful
and honor the different ways that their labor makes our
country great. And then in his version, it's like, oh,

(03:02):
that actually makes sense. So she was trying to make
a point. The point didn't land, but it is a
pop culture moment that will stick with me.

Speaker 3 (03:09):
Yeah, I know, that was so funny, and that's good
to hear that.

Speaker 4 (03:13):
At least it seems like she's handling it, you know,
with grace, I'd say, because yeah, that is
that is like a you know I'm not gonna say
that what she said wasn't wrong, that was definitely a
weird thing to say, But.

Speaker 3 (03:25):
You know, we all gotta make mistakes and grow.

Speaker 4 (03:28):
And I do also agree with the point where it's like, yeah,
we you know, there's a conversation about the like labor
that goes unnoticed and particularly like jobs that people you
know don't necessarily.

Speaker 3 (03:42):
Want to have, and immigration and all that, and that's
definitely a.

Speaker 4 (03:46):
Conversation to be had. So yeah, but got a fun
TikTok trend out of it.

Speaker 2 (03:51):
Yeah, I feel I.

Speaker 1 (03:52):
Mean, as podcasters, having to grow in public and learn
stuff in public has been part of the So I
definitely feel for Kelly Osbourne of oh yeah, oh yeah,
I embarrassed myself and a lot of people probably heard
it and that was that was humbling.

Speaker 4 (04:10):
Oh yeah, I like I could not imagine like having
to be on live TV like this is very closely edited.
I'm definitely grateful for that because there's a lot of
times when I'm like I need to like stop and think.

Speaker 2 (04:22):
Yes, oh my god, Joey.

Speaker 1 (04:24):
If you're like listeners, if you're ever listening and you're like, wow,
I can't believe Bridget or Mike or one of the
guests was able to say that so eloquently on the fly.

Speaker 2 (04:33):
It's because we didn't. That is the magic of Joey
and our other editor, Tari, making us sound smarter and
more eloquent than we are on the fly if this
was live TV. Never absolutely not. And also want to
show like.

Speaker 1 (04:45):
The view where the whole point is you're sort of
arguing and like talking over each other and having discourse.
I would have to be like, everybody be quiet so
I can catch my thoughts, like, so I can get
my thoughts together every five minutes.

Speaker 3 (04:57):
Yeah, definitely.

Speaker 1 (04:59):
Okay, So this election season fully kicked off. The New
Hampshire primary happened this week and some voters got a
call at the very last minute from someone giving them
information about the election.

Speaker 5 (05:11):
You know the value of voting Democratic when our votes count.
It's important that you save your vote for the November election.
Will need your help in electing Democrats up and down
the ticket. Voting this Tuesday only enables the Republicans in
their quest to elect Donald Trump again. Your vote makes
a difference in November, not just Tuesday. If you would

(05:33):
like to be removed from future calls please press two.

Speaker 1 (05:36):
So that sounded like Joe Biden, but that was not
Joe Biden. That was an AI generated audio deep fake
that voters in New Hampshire before this week's primary got
on the phone telling them not to vote. And I
don't have a ton to say about this that I
have not already said, but I'll just reiterate it does
not bode well for the use of AI to spread

(05:57):
disinformation and make that impact a lot worse this election cycle.
This is something that I'm worried about thinking about all
of the folks who I am in community with who
also study and work on dis- and misinformation, particularly how it
impacts democracies, are all sort of saying the same thing,
which is that AI could be poised to really make disinformation,

(06:18):
which is already a problem so much, so much worse.
And what's worse is that we're not really doing a
ton about it right now. The nonprofit consumer advocacy group
Public Citizen says that that robo call really underscores the
need for federal regulation of AI generated deep fakes and
that it is way past time for action. I completely agree,

(06:39):
but this comes as Republican FEC Chairman Sean Cooksey says
that they're not even looking at establishing any kind of
updated rules federally around deep fakes and political ads. At
the earliest, they're going to do it early summer, which
is after many states have their primaries. So it
just seems pretty late to be like, oh, we will

(06:59):
be rolling out in summertime, you know, after states have
already held their primaries. Feels pretty late in the game
to be trying to curb the impact.

Speaker 2 (07:07):
Of this kind of thing.

Speaker 4 (07:08):
Yeah, I'm glad you opened with the positive question because
I do feel like, I think because the primary started now,
like the reality of the fact that there's going to
be an election this year has like started to hit
and it's just been like.

Speaker 3 (07:24):
Absolute terror every time I think.

Speaker 4 (07:27):
About it, Like there really is no kind of good
outcome I could see right now.

Speaker 3 (07:32):
Not to be.

Speaker 4 (07:33):
Complete pessimist, but yeah, I know, I feel like this
is going to be a weird, weird year.

Speaker 2 (07:41):
Friend, don't even get me started.

Speaker 1 (07:44):
I it's one of those things that if I I've
told myself that I'm not going to start really thinking
about the election in earnest in a meaningful way until summer, Like.

Speaker 2 (07:55):
I guess I'm kind of like the FEC here. I'm
like I because if I start thinking about it.

Speaker 1 (08:00):
Now, there's plenty of time for me to spin out
and doom and gloom. Everybody that I know who works
on these issues is having almost a sort of like
philosophical crisis around the upcoming election. I genuinely don't know
how it is going to go. We are very worried
as a community. And it's so weird because I remember

(08:22):
when we were preparing in the like Disinfo Election Disinfo
Democracy space, preparing for the last election. We were gaming
out all the different scenarios, and I will never forget
one of the scenarios that we were gaming out was
some sort of a so it was like, oh, if
Trump wins, here's the plan. If Trump loses, here's the plan.

(08:45):
And then a third thing that was like if there
is a widespread election related incident slash irregularity. And at
the time, like we didn't know what we were preparing for.
Turns out it was January sixth. That was very prescient. Yeah,
it feels like we're there all over again.

Speaker 4 (09:04):
Oh my god, God, that's so weird to think that
that was like, yeah, there was the election, and then
that I don't know, I feel like it all kind
of blurs together. But yeah, that was such a crazy
time and it's definitely gonna be an interesting next couple
of months. Yeah, not looking forward to it. I I'm

(09:26):
not surprised to hear that people are having their own
kind of crises around it.

Speaker 1 (09:31):
This might be like, like too personal. But during that election,
my dad, who is chronically ill and disabled, he was
in the hospital and I went to I dropped everything
all the election work to go be with him in
the hospital. And it was a cognitive issue and so
he was, you know, not awake, was asleep for a

(09:53):
lot of that time, resting, and we were watching it
was it was election night, right, and so we were
watching the election results in his hospital room. And so
he woke up from this like drugged up, medically induced
sleep to seeing Donald Trump on television the day after
the election claiming that he had won, and my dad
was like, what, like, what did I miss?

Speaker 2 (10:13):
Did he win? And I remember being like, Dad, I know.

Speaker 1 (10:16):
It looks like he did. But the how, where I would have
to start to be like, you are watching Donald Trump
on CNN claiming that he won an election, and me being
like, he didn't win.

Speaker 2 (10:27):
Don't worry.

Speaker 1 (10:27):
I know it seems like he did because what you're
seeing on TV, but just take my word for it,
he didn't.

Speaker 2 (10:31):
It was a very weird time, I guess, is what
I'm saying.

Speaker 3 (10:35):
Yeah, I know, that sounds like a crazy conversation.

Speaker 1 (10:41):
So, you know, thinking about how I feel, and you know,
civil society groups who are looking at this really feel
like not enough is being done. If there was a
theme to this episode, it would be do something, listen
to people, don't wait till it's too late.

Speaker 2 (10:55):
It feels like it's becoming too late.

Speaker 1 (10:57):
And another way that we are seeing that this week
is through non consensual AI generated sexually explicit deep fakes.
So last week on the News round Up, Mike and
I were talking about this New Jersey high school student
who has been advocating for legislation that criminalizes non consensual
AI deep fakes after the boys in her class were
trading deep fake sexualized images of her and about thirty

(11:21):
other girls in her school. And this week, a school
district in Aurora, Colorado, is dealing with the very same thing.
This week, the Aurora Police Department has released the names
of schools involved in a sextortion investigation, which includes two
middle schools. When I was in middle school that was
grades six to eight. So like, you're like eleven, twelve, thirteen, fourteen,

(11:41):
you are so young to be dealing with the impacts
of a sexually charged deep fake sextortion ring. Like, I
don't think it is okay for kids to be having
to deal with this. So police say that in six
different instances, students at that school district reported being a
direct target of a sextortion scheme after being contacted by the

(12:02):
suspect or suspects through Instagram. In dozens of other cases,
students received unsolicited invitations to pay to join a close
friends list on Instagram where sexually explicit material had been posted.
So it really is a marketplace where kids are either
being invited to pay to get this sexual content or
being made to pay to keep that content from being

(12:24):
shared on these platforms. A nightmare. And again, like, I
don't think this is something that should just become normalized
for our kids. If we did another update every single
time one of these AI related sextortion rings was reported
on in a different school district across the country, it
would be all we talked about. Because it is happening

(12:45):
so much. This should not become a normal thing for
our kids, and it seems like it is becoming normal,
and lawmakers are just dragging their feet to actually do
anything to prevent this from becoming a new thing that
kids just have to deal with.

Speaker 4 (12:59):
Yeah, no, that's so scary and yeah, like exactly, Like
I don't know. I think I was like eleven when
I started middle school, Like you're a baby, that is
a child, like.

Speaker 3 (13:09):
And I don't know, it's I was so like when
I was in middle school.

Speaker 4 (13:13):
That was kind of like the beginning of or I
guess not the beginning technically, but like social media was
just kind of like becoming a thing like Instagram and Snapchat.

Speaker 3 (13:22):
Or like starting to become big.

Speaker 4 (13:24):
And I mean it's so sad because I do remember
then like a lot of the conversations were about like
sending nudes or like sending or like sexting or all
of that, or.

Speaker 3 (13:39):
You know, people getting groomed online and.

Speaker 4 (13:41):
All of that, and it's it's it's weird now being like, wow,
that all was so messed up and it is continuing
to happen, and it's continuing to happen in a way
that's like even more dangerous kind of and even more
sort of like out of control, which just is so scary,
Like I can't imagine being a kid right now, Like, yeah,

(14:04):
that honestly just sounds terrifying.

Speaker 1 (14:06):
It's terrifying. And our kids should not have to be
dealing with this, Like nobody should have to be dealing
with this. We cannot have a healthy society when this
is just tolerated and like just becomes part of
the experience of being a young person, which I just
don't accept that this has to be part of that.

Speaker 6 (14:28):
Let's hit a quick break. And we're back.

Speaker 1 (14:43):
And this week, Taylor Swift was also targeted by non
consensual deep fake images that flooded Twitter. According to The Verge,
these images got forty five million views, twenty four thousand reposts,
and hundreds of thousands of likes and bookmarks before the
verified blue checked user who shared the images had their
account suspended for violating platform policy. The post was live

(15:08):
on the platform for about seventeen hours prior to its removal.
So 404 Media has this really in depth
piece of reporting about how those images originated on Telegram,
which if you don't know what telegram is, it's kind
of like an alternative messaging platform that really gained popularity
with right wing extremists, but other folks use it too,
like a lot of journalists depend on it. So Twitter

(15:28):
actually has rules that ban AI generated deep fake images,
but I can confirm, as of this recording, many of
those images of Taylor Swift, those deep faked, non consensual
sexually explicit images, were still floating around Twitter.

Speaker 2 (15:44):
You know.

Speaker 1 (15:44):
The one user who brought those images from Telegram to
Twitter might have had their account deleted, but because
of the way the internet works, plenty of them are
floating around as we speak, and so the hashtag Taylor
Swift ai was even trending. So you might think, like,
why is Twitter allowing this? Shouldn't they do something? But
remember when Elon Musk took over Twitter, one of the

(16:07):
first things he did was gut the trust and Safety team,
and content moderation there has kind of been like more
or less non existent, like not totally not existent, but
there's just not a lot of robust content moderation happening there.
Like the fact that this could be up for seventeen
hours kind of says something.

Speaker 4 (16:22):
Yeah, I was gonna say, I I feel like at
this point. Nothing, there's nothing, There's very, very little that
could happen that would make me think, wow, why is
Twitter allowing this? Because I know the answer is Elon
Musk does not care and in fact is probably encouraging this.
I will say, obviously the story is terrible and horrible.

(16:44):
This is weirdly the second thing on my like twenty
twenty four predictions.

Speaker 3 (16:50):
Thing that I made at the beginning of the year
to happen.

Speaker 4 (16:53):
Which is, yeah, because like I don't know if
there were like tiktoks that were going around and stuff
where people were like making a list and they're usually
kind of goofy.

Speaker 3 (17:02):
And I I did one too, and one of them
was I was.

Speaker 4 (17:04):
Like, there, I feel like there's going to be some
Taylor Swift AI controversy where like explicit photos are made
and she's gonna get Like, I mean, I don't We'll
see where this story ends, but like you know, I
believe that the Swifties are a powerful

Speaker 3 (17:19):
Political force, so who's to say. Yeah.

Speaker 4 (17:23):
The other one was that, uh, Ryan Gosling was gonna
get nominated and Margot Robbie was not going to
get nominated.

Speaker 3 (17:28):
Oh my god, so it has been a weird week
of psychic predictions.

Speaker 1 (17:35):
Apparently, keep us posted if any of the other
predictions, the other Joey twenty twenty four
predictions come true. That once happened to me. I predicted
that this was back when he was like much more popular.
I predicted that Justin Bieber was going to get caught
saying the.

Speaker 2 (17:51):
N word, and he did.

Speaker 1 (17:53):
I was like, wow, yeah, so you so you said
that the Swifties are this powerful force, and you are
right about that because even though Elon Musk has not
been able or Elon Musk and his team have not
been able to rid the platform of these images,
Swifties actually took it upon themselves to try to flood

(18:14):
the Taylor Swift AI hashtag with real actual content of
Taylor Swift performing to drive down that sexually explicit fake content.
And you know, I know, the Swifties are like a
big force, Like they're like an organized force for good
at times, I firmly believe that like they should be

(18:36):
being able to spend their time like you know, like
they shouldn't have to do this is what I'm saying.
They should be able to spend their time like buying
merch and trading merch and making bracelets and all
of the fun, sparkly things that go
with being a Swiftie.

Speaker 3 (18:50):
Yeah, like they should just be allowed. Yeah, Like they
should be able to just be fans of the thing.

Speaker 4 (18:54):
They shouldn't have to be like the
volunteer PR team for like a billionaire celebrity, and
not that it's on her team, like, you know,
that this happened, but it is like they shouldn't have
to be the content moderators for Twitter, is probably some other
way of putting it. Yeah, especially because I'm sure a
lot of these people are pretty young.

Speaker 1 (19:15):
Yeah, like it should not be up
to like one thousand Swifties to keep potentially illegal
content off of a platform that's run by a billionaire.
That's not a dynamic that I'm comfortable with. And I
have to say, like, it feels really gross to talk
about those images, but I just want to add, because
it's something that has been in my mind a lot,

(19:35):
is that the images are football themed. They depict Taylor
Swift at a big football game. And I don't know
for sure, but I have to assume that this is
related to the fact that she's gotten so much attention
from going to those NFL games because her boyfriend is
a football player, and every time she goes to a game,
it is like breathlessly reported on. And I guess I

(19:58):
just don't think that it's a coincidence that these images
depict her at a football game, and I think they're
meant to humiliate her on the basis of the attention
that she has gotten from the NFL.

Speaker 2 (20:10):
And I think that they're they're meant.

Speaker 1 (20:11):
To like, like, people who make visual disinformation are so
good at doing so in a way that works to
get to people. They're just very charged. They're very good at
making images that are very charged. And I can't help
but notice the ways that these images seemed charged to
humiliate her in a very specific kind of way about

(20:35):
a very specific thing about her personal life.

Speaker 3 (20:38):
Yeah, that definitely doesn't seem like it's a coincidence.

Speaker 1 (20:41):
Yeah, And you know, I've seen some takes that are like, well,
I'm glad it's happening to Taylor Swift. She is so
powerful and so rich, and that like she will be
able to do something like certainly her team will be
able to do something. But I push back on those
takes because like, these images are up, right, like I
confirmed that before we started recording, and it so like

(21:04):
even if Twitter wanted to remove these images, I don't
think they would be able to just by the nature
of how the Internet works.

Speaker 2 (21:12):
And I think it just speaks to the.

Speaker 1 (21:13):
Fact that the powers that be have let this problem
get so out of hand that I don't know what
a fix looks like. And if it's happening to Taylor Swift,
one of the wealthiest, most popular, and most powerful women
in the world, I don't think that it's like, oh, well,
she'll be able to figure out something where this won't
happen to other people.

Speaker 2 (21:30):
It's already happening to her.

Speaker 1 (21:32):
So if it happens to her and
those pictures are up for seventeen hours and then still
up after that, what hope is there for any of us,
Like what happens when it happens to a high schooler
or a child or any of us, Like lawmakers have
ignored the warnings and gotten no traction on legislation on this,
and I just don't think any of it bodes well

(21:53):
for anybody.

Speaker 3 (21:54):
Yeah, definitely.

Speaker 4 (21:57):
Like I'm glad, and again, I you know, it is
it's I don't obviously like it shouldn't happen to anybody.

Speaker 3 (22:03):
It just kind of it. It's horrible for her on
a personal level.

Speaker 4 (22:07):
And then also like it just kind of enforces that
sort of idea that like women's bodies are like public property,
and especially if they're like in the spotlight.

Speaker 3 (22:17):
But yeah, like, I don't.

Speaker 4 (22:19):
Think if this happened to me, my like three hundred
Twitter followers would be able to, you know, like
flood the hashtag, not that there would be a hashtag,
but you know, yeah, it is.

Speaker 3 (22:30):
It is sort of concerning.

Speaker 4 (22:31):
That like one of the most powerful women in the
world can have this happen to her with like very little
consequences at the moment.

Speaker 2 (22:40):
Yeah, I mean that you just really nicely articulated.

Speaker 1 (22:43):
One of the things I always say about this kind
of garbage is that these kinds of exploitative content, it
reinforces the idea that women, by virtue of showing up online,
our bodies and our images are just open for whoever
wants to use them or exploit them in any way

(23:04):
they can. And so I've seen people be like, oh, well,
these pictures aren't real, and she's a public figure, so
like that, it goes along with it. Absolutely fucking not.
The cost of showing up and being a public figure,
and you know, being on the internet, should not be
to be sexually humiliated. Like, and I do think it's
something about reinforcing this idea that, yeah, our bodies are

(23:27):
just up for the taking to be depicted however anybody wants.
And then on top of that, we're supposed to not
be upset by it because it's quote not real, absolutely not,
absolutely not. That is a terrible dynamic, that is a
terrible norm. We should all be pushing back on that
because certainly, I'm sure you know that high school
student who is having this

(23:48):
happen to her isn't saying like, oh, they're not real images.
These images, whether they are real or not, they can
do real damage and really impact people's lives. And yeah,
I just think that we should all be pushing for
a culture that recognizes that.

Speaker 4 (24:02):
Yeah, and I definitely it's this is one of those
things too where it's like it's it's terrible, and it's
terrible it's happening to again, it's it's it's terrible that
it happens to anybody.

Speaker 3 (24:11):
I think, like obviously.

Speaker 4 (24:13):
Taylor's story is going to get a lot of attention,
and it is something that is happening to like a
lot of people, and it's probably going to continue to happen.
And interestingly, Slash not surprisingly, you know, it's happening around
the same time and like by the same people that
want to crack down on like sex workers and sex
workers rights. And I think this is another situation looking
at AI where you know, it really is important to

(24:38):
listen to sex workers about these issues because a lot
of times they're going to be like the first ones
harmed by this kind of stuff, slash exploited, and
we all are going to end up suffering because of
these sort of exploitative things. So yeah, no, I just
I think listen to sex workers about these issues. Like

(24:59):
it is the weird sort of like duality of those efforts
to take away like agency over the bodies of like
non CIS men and at the same time, like those
bodies are kind of seen as public property in these
sort of situations, which is scary.

Speaker 1 (25:16):
Yeah, almost every one of the tech and internet facilitated
harms that we talk about on this show.

Speaker 2 (25:23):
Sex workers have been like, oh, yeah, we called that. Yeah,
we were warning about that.

Speaker 1 (25:26):
You knew about that, And if only people with power
I don't know, listened, I wonder where we would be.

Speaker 6 (25:37):
Let's take a quick break.

Speaker 2 (25:50):
And we're back.

Speaker 1 (25:52):
So from one unsavory story to another, there are kind
of a lot of unsavory stories on this episode. I apologize,
but also it's the happier stories. So I know, I know,
I really deeply hate talking about this person. I think
we've only ever talked about this person once before on
the podcast, I think, But here we go. So far

(26:15):
right hate-mongering influencer Chaya Raichik, who runs the inflammatory
account Libs of TikTok, has just been appointed to the
Oklahoma Education Department's Library Media Advisory Committee, despite the fact
that this person does not live in Oklahoma, is not
an Oklahoma resident, does not have a child in the
Oklahoma school system, and is a former real estate agent

(26:37):
with no clear background or history or ties to education.
And this person is going to be advising the state
on what kind of media should be in school libraries. Wow, Okay,
So if you don't know what Libs of TikTok is
it is an account where they will like post videos
a lot of times of perceived LGBTQ youth or the
adults who support those youth to their millions of followers

(27:00):
to like make fun of them, demonize them, you know,
smear them, lie about them, all of that, and in
several instances, educators or school libraries or hospitals that are
featured on that account will then get death threats or
lots and lots of coordinated harassment. So Libs of TikTok

(27:21):
has been linked to threats at schools and hospitals across
the country. A friend of mine, will Caroless, who writes
about extremism at the USA Today, has chronicled how her
content has led to real world harm and disruptions. So
in Davis, California, a library received a bomb threat that
included hate speech according to police, and the library and
the school nearby had to be evacuated.

Speaker 2 (27:42):
So this came after a group of.

Speaker 1 (27:44):
Speakers started referring to a group of female trans athletes
as quote biological males. A librarian asked them to leave, rightly
so. That interaction is caught on a video, and then
a far right backlash from Libs of TikTok and others
ensued in Tulsa, Oklahoma, where this woman has just been
appointed to you know, guide the material that children should

(28:05):
be receiving. Well, in Tulsa, Oklahoma, their school received bomb
threats the day after Libs of TikTok tweeted about the
school's librarian, who posted a video that was obviously meant
to be a joke, obviously meant to be tongue in cheek,
Like anybody who saw this would be like, yeah, she's
kidding around. The librarian in Tulsa posted my radical liberal
agenda is teaching kids to love books and be kind.

(28:27):
So obviously that video is meant to be a joke,
but they took it seriously.

Speaker 2 (28:33):
That's another thing about these people. It's like they have
the worst sense of humor.

Speaker 4 (28:37):
Okay, it's two things, because one, I'm so sorry, but
how much of a loser do you have to be
to like care this much about what's happening in like high.

Speaker 3 (28:46):
Schools. Like, huge losers, on with your life, like.

Speaker 4 (28:51):
And then second of all, yeah, that is like, that
is the most basic, like, joke you can make, I
don't know.

Speaker 3 (28:58):
That is so weird to be like, oh my god,
they're so scary.

Speaker 2 (29:03):
Wow, I make this point a lot.

Speaker 1 (29:05):
Do you follow the comedian and writer, or I guess
the actress, the multi-hyphenate, multi-hyphenate Trace Lysette?
So Trace was talking about Dave Chappelle's most recent
Netflix special, and Trace is trans, and she was talking about how,
like, what is going on that... like, so, okay, we
get it, Dave Chappelle doesn't like trans people. You need

(29:26):
to release multiple comedy specials about this, these people that
you apparently hate? Who spends this much time focused on
and talking about and thinking about and writing about stuff
like this? Like, it is not normal. It's not
a normal way to exist. She's such a... if
y'all don't follow her on TikTok and on Twitter, she
is so cool, she's very funny.

Speaker 3 (29:47):
Oh my god. Yeah, and no, like absolutely it is giving,
like why why why are you so obsessed with me?
You know? It is really weird.

Speaker 4 (29:57):
I I gotta say, I don't think I've I've given
a single thought to what a high school curriculum is
like in my time since graduating high school. And yeah,
I don't typically, you know, hyper focus on people's gender
identity that have nothing to do with me or my life.

Speaker 3 (30:16):
But yeah, glad, that's.

Speaker 7 (30:18):
Yeah, people you'll never meet. Like, it's wild, yeah.
And honestly, like, when we were doing episodes earlier
in the summer about, like, book bans and stuff, one
of the things that was really surprising to me is
that oftentimes challenges for entire public school districts were

Speaker 1 (30:35):
Coming from one parent who did not live in that
school district and did not have kids in that school district.

Speaker 2 (30:41):
Was like, so you're just like obsessed.

Speaker 1 (30:44):
Over what students in a school that you don't have
any connection to are reading.

Speaker 2 (30:49):
What in the fucking world? Like, that is just, that
is not... who, who does that?

Speaker 6 (30:54):
Like?

Speaker 1 (30:54):
I just, like, obviously maybe you want to be a
Fox News star, whatever, whatever, I get it. But, like,
that is so... I just, I cannot understand
that mindset at all. I'm with you. Here in DC
where I live, Chaya Raichik from Libs of TikTok posted
a video where she films herself calling Children's Hospital, which
is, like, our major pediatric hospital here in DC, pretending

(31:17):
that she was looking for a gender affirming hysterectomy for
a non existent sixteen year old child. She posted this
recording of this conversation to Libs of TikTok last summer,
in which she is questioning two unidentified hospital employees about
whether or not they offer gender affirming hysterectomies to
patients who are sixteen.

Speaker 2 (31:35):
So it's a little bit of a weird situation.

Speaker 1 (31:38):
Basically, the hospital was inundated with harassing calls and threats,
accompanied by a social media post suggesting that the hospital
should be bombed and that its doctors should be jailed,
placed in a wood chipper or worse. And in the end,
Children's Hospital confirmed that whoever she was speaking to in
that video, one, were not medical care providers for the
hospital, and that the hospital does not even perform hysterectomies on minors.

(32:02):
The hospital said, none of the people who were secretly
recorded by this activist group deliver care to our patients. The
information in the recording is not accurate. We do not
perform gender affirming hysterectomies for anyone under the age of eighteen.

Speaker 3 (32:15):
Yeah.

Speaker 4 (32:15):
Also, it is insanely hard to get a hysterectomy as
an adult, like somebody over the age of eighteen, like
as somebody.

Speaker 3 (32:23):
Who, like... I'll be totally honest.

Speaker 4 (32:27):
My late teen years, at almost every single doctor's appointment
I asked if I could get a hysterectomy, just because I didn't
want to keep getting my period and don't want

Speaker 3 (32:34):
To have kids. Like it's a no every time. It
is a staunch no. That is insane.

Speaker 4 (32:42):
Yeah, and again, like, as an adult, there's just
so many hoops you have to go through, even if
it's like a legitimate medical issue.

Speaker 3 (32:50):
Yeah, so crazy.

Speaker 1 (32:52):
I've heard this from friends too. One of my friends was
able to get a hysterectomy, but she had two doctors
be like I'm not going to perform it just in
case you want to have kids someday, and she's like,
I don't want to have kids.

Speaker 2 (33:05):
I am in a relationship. I am a woman in
a relationship with a woman.

Speaker 1 (33:08):
So, like, accidental pregnancy is not a thing that's, like,
on my... I have no plans for children in the future.
But like making a medical decision based on a hypothetical
maybe someday man who might maybe someday hypothetically want to
be in the mix to have a baby? It is...
all I have to say is, like, you are so right.
These people are trafficking in fictions where they are just

(33:30):
handing out hysterectomies to whoever wants one, regardless of anything.
That is not happening. And it's like, it's like not
based in reality, these things that they say. And it doesn't
surprise me that the hospital has to be like, yeah, this
thing that they said happened? That's a lie. It
doesn't happen. And a similar thing happened at a children's
hospital in Boston as well, where their services for children,

(33:52):
for babies, their healthcare, were disrupted because of all of
these calls and threats and harassment because of this Libs
of TikTok stunt. It's so infuriating to me.

Speaker 4 (34:03):
Yeah, also, like, what... I mean, I guess it's, it's
the same as trying to apply logic to people that are
not acting logically. But it's like, I guess it's the
same stuff as, like, people that bomb abortion clinics. But
what is the point of bombing a hospital? Like, literally,
they just would rather people be dead than be trans,

(34:26):
like that is that's the endgame of this, which is yeah, scary.

Speaker 1 (34:31):
Yeah, it's so fucked up. So Superintendent of Public Instruction
Ryan Walters said Chaya is on the front lines of
showing the world exactly what the radical left is all about,
lowering standards, porn in schools, and pushing woke indoctrination on
our kids. Because of her work, families across the country
know what is going on in schools across the country.

Speaker 2 (34:51):
This is so, like, this is, like, what-the-fuck levels
of bad.

Speaker 1 (34:57):
But I will say, though, as bad as this is,
some folks in Oklahoma are pushing back.

Speaker 2 (35:00):
State Rep.

Speaker 1 (35:01):
Mark McBride, chairman of the House Appropriation Subcommittee on Education, said,
I don't see any need to have a twenty eight
year old realtor from New York that has no children
appointed to this position when there are extremely qualified parents, teachers,
and librarians in Oklahoma, which I think is, like, a little
bit of a drag. Like, when somebody from Oklahoma
highlights that somebody is from New York, I think it's

(35:22):
like clear that that's not a compliment.

Speaker 4 (35:25):
Yeah, from the Midwest, who's from Chicago, which is very
different from Oklahoma. But, like, when somebody really emphasizes
the New York, that is not a compliment. Also, I did
not realize that she was twenty eight. That's wild, girl.
You should be at the club.

Speaker 2 (35:42):
What are you doing? So, Representative

Speaker 1 (35:46):
Mickey Dollens, an Oklahoma City Democrat and former public school teacher,
said that this appointment might actually violate the department's rules
for advisory committees, which require members to be representative of
the people served. She's not from Oklahoma, like not a
parent in that district, so like difficult to say how
she would be representative of the people served.

Speaker 2 (36:06):
And something I always.

Speaker 1 (36:08):
I feel like I never really make it explicitly clear
when I'm talking about Libs of TikTok, which isn't often
because I cannot stand this person. But you might be thinking, like, well,
she doesn't come out and explicitly tell people to call
in bomb threats or harass educators or, like, prevent kids
from getting care at hospitals. But that is part of
the strategy. It's called stochastic terrorism, and it's when you
use media or communications to sort of wink wink, nudge

(36:31):
nudge make somebody or something or an institution a target
of this kind of harassment. And so she's never going
to come out and say harass these people. She might
actually come out and say like, oh, well, I would
never advocate for harassment of people. But then she'll continue
to do the same thing that gets people and institutions
harassed time and time and time again. So I would

(36:54):
argue that she knows exactly what she's doing. And think
about it: like, if every time you mentioned something that
you didn't like happening at a school on your massive platform,
if a few hours later that school had to be
evacuated because of a bomb threat or harassment, wouldn't
you stop doing that after a while? Like, if you
really didn't want that to happen.

Speaker 3 (37:11):
Yeah, it's always funny, And.

Speaker 4 (37:15):
By funny, I mean just depressing that this is the case,
and probably intentionally the case, but it was always funny
that they seem to be doing exactly what they are always
accusing the quote unquote radical left of doing, where it's like,
I'm sorry, like, the quote left you were talking about.

Speaker 3 (37:31):
The response to everything is just, well, go out and
vote, and like whatever, or like give us what...

Speaker 2 (37:37):
I don't know.

Speaker 3 (37:37):
It's kind of like, this is... it's so crazy.

Speaker 4 (37:40):
You're the only ones bombing people, you're the ones calling
in bomb threats. Like, take a look in the mirror.

Speaker 1 (37:46):
Please, absolutely. And to put this in a larger context,
last year a court ruled that Oklahoma can enforce its
law banning and criminalizing gender affirming care for trans minors
while a suit against it is being heard. That law going into
effect would make providing gender affirming care in Oklahoma a
felony, right. And so when we're talking about

(38:06):
this happening in Oklahoma, it's really important to understand that
larger context of what folks there are up against. The
climate that they're up against. It is not a safe
climate for these youths. And I just think that it
breaks my heart that you've got dipshits like this who
are promoting people like Chaya to positions of power within

(38:27):
a school district that she does not even have any
connection to. Just to drive home how much they are
not welcome there, and yeah, it's just really horrifying. I
think that Superintendent of Public Instruction Ryan Walters should really
be ashamed, because also, this is not serving the
constituents and the parents and the youth of that community,

(38:48):
like inviting a hate monger in. And they're always like,
just kind of like what you said. They're always like, oh, well,
it's people on the left who are making everything woke
and making everything political, and blah blah blah. You are doing that, sir.
By bringing this person with no connection to the
community in and giving them a bigger platform, giving them
a position of power in your community, you are the

(39:08):
one who is politicizing everything. You are the one who
is making everything inflammatory. I just I really hate it.
I really hate it.

Speaker 6 (39:15):
Yeah.

Speaker 4 (39:16):
I also, like, most of these people, I'm like, name
a single trans kid that you know. Like, yeah, I
guarantee this is just something you think is
going to be a big issue, but

Speaker 3 (39:27):
Without even really looking at the reality, which is not
to say that there aren't trans kids in these communities.
There absolutely are.

Speaker 4 (39:33):
Like it's always coming from people that just kind of
have this imagined threat in their brain without ever having
interacted with the trans community or again, like a single
trans kid, which you know. Yeah, usually people that actually
have experience talking to trans kids or talking to the
trans community don't end up feeling this way, which is
part of why this happens. But yeah, why you know,

(39:56):
the lack of exposure to trans people is why this happens.

Speaker 3 (40:01):
But yeah, it is.

Speaker 4 (40:01):
It is always weird that it is coming from kind of
this imagined threat rather than any sense of reality.

Speaker 2 (40:07):
Absolutely, it's infuriating. I do have a little bit of good.

Speaker 1 (40:11):
News, and that was an upsetting story for all of us,
but, so, a little good news. So-called conversion therapy is
an abusive practice that is ostensibly meant to change someone's
sexual orientation or gender identity. It has been widely debunked,
and pretty much every medical association, like the American Psychological Association,
the American Psychiatric Association, the American Medical Association consider it

(40:33):
to be a dangerous pseudoscience. Rightly, twenty two states and
DC currently have laws banning this practice from being used
on minors. It's horrible. But here's the good news: two
Twitter alternatives, Spoutable, run by Christopher Bouzy, who we actually
have talked to on the podcast before, and Post, have
both banned content promoting conversion therapy. So most mainstream social

(40:55):
media platforms like TikTok, Pinterest, Nextdoor, and Facebook and Instagram
all explicitly ban content promoting conversion therapy.

Speaker 2 (41:03):
And now these smaller.

Speaker 1 (41:05):
Sort of niche Twitter alternatives, Post and Spoutable, are also
joining that. GLAAD had this to say: the leadership of
both Post and Spoutable, in adopting new policies prohibiting so-called
conversion therapy content, puts these companies ahead of many others.
This is from GLAAD CEO Sarah Kate Ellis. GLAAD urges
all social media platforms to adopt and enforce this

(41:26):
policy to protect their LGBTQ users. So this is good,
I think for a couple of reasons. One of them
is that we've talked on the podcast before about how changes,
both good changes and bad changes for bigger platforms can
be this domino effect where other platforms feel the need
to follow suit, and I think this is a good
example of the way that for smaller platforms it can

(41:46):
create a domino effect of the same thing, where these
smaller niche platforms are making this change and that change
is being reflected for other small platforms.

Speaker 2 (41:55):
And so yeah, I think it's cool to see how.

Speaker 1 (41:57):
That domino effect can be working to create positive change
for smaller alternative platforms as well.

Speaker 3 (42:03):
Yeah, definitely great to hear some good news.

Speaker 4 (42:06):
And yeah, it was, like, the bare minimum of, you know,
protecting your queer and trans users, customers, whatever, but it is,
it is such an important step to take, and it's good to
see that that's becoming more common.

Speaker 6 (42:25):
More.

Speaker 1 (42:25):
After a quick break, let's get right back into it.
So we already know that technology like facial recognition is
racist and sexist and unreliable and should not be used

(42:48):
for a whole host of reasons. And here is yet
another story that is a horrifying example of what happens
when police rely on it anyway to make arrests. So
Harvey Murphy Junior is a sixty one year old man
who was arrested in Texas for an armed robbery of
a Sunglass Hut retail store in twenty twenty two, a
crime that he did not and could not have committed.

(43:09):
After facial recognition technology falsely matched him as a suspect. So,
according to The Washington Post, a representative of a nearby
Macy's told Houston police during the investigation that the company's system,
which scanned surveillance camera footage for faces in an internal
shoplifter database, found evidence that Murphy had robbed both stores,
both the Sunglass Hut and the Macy's. Now, police do

(43:31):
say that after the facial recognition technology flagged Murphy, they
then showed his image to the cashier at Sunglass Hut,
who positively ID'd Murphy as the assailant. But it could
not have been him because he was actually already in
prison in Sacramento, California, some two thousand miles away at
the time the crime was committed. And he actually only
even found out that there was a warrant for his

(43:53):
arrest for this crime in Texas when he was going
to renew his driver's license in Sacramento, when the DMV
flagged that he had an outstanding warrant and arrested him
for this crime that he could not have done. So
here's where the story gets really awful. While he was
in prison awaiting trial for this crime that he could
not have committed, he was beaten and sexually assaulted by
three men. So hours after that assault, he was cleared

(44:17):
of all charges and released. And now he is suing Macy's,
Sunglass Hut's parent company, and three people that his attorneys
say were involved in the case. He's seeking ten million
dollars in damages and says the assault left him with
lifelong injuries.

Speaker 4 (44:31):
Yeah, that is such a horrifying story. Look, also,
I mean, we know that prisons are a place
where this happens a lot of the time, and
for some reason, the solution is never to fix
those problems and, like, maybe stop so many people going

(44:55):
to prison and do prison reform, and the solution is
always just to send more people to prison.

Speaker 3 (45:00):
And what else is new? That is... yeah, that is
so horrifying. I don't know. I hope he sues them
for whatever they're worth.

Speaker 4 (45:09):
And yeah, that's that's horrible.

Speaker 1 (45:13):
It is. And I also hate this statement from Macy's.
So Macy's, when The Washington Post talked to them, declined
to comment on the pending litigation, but they did say
in a previous statement that Macy's quote uses facial recognition
in conjunction with other security methods in a small subset
of Macy's stores with high incidences of organized retail theft

(45:34):
and repeat offenders, and so that statement is not about
this specific case. They were like, oh, we're not going
to talk about that. However, it almost sounds like they're
saying, like, well, if a few people have to be
arrested wrongly because of our faulty use of this surveillance
tech, it's okay if it protects our merchandise. Like, I

(45:55):
just feel like the focus on retail theft and like
quote organized theft, it makes it seem like the most
important thing here is the Macy's merchandise, not the fact
that they are routinely falsely arresting people.

Speaker 2 (46:10):
Who then could be assaulted in prison.

Speaker 4 (46:13):
Well, yeah, this is America, where, yeah, property damage
is the worst crime you could commit, or property theft
is the worst affront, especially if it's sort of against
a corporation, that is big bad.

Speaker 1 (46:29):
The most important thing we should be protecting are the sunglasses, Joey.

Speaker 2 (46:32):
I cannot stress that.

Speaker 4 (46:33):
Enough. Those Sunglass Huts, that is a staple of American culture.
And if we're not going to, if

Speaker 3 (46:41):
The facial recognition isn't going to protect it, who will.

Speaker 1 (46:44):
Apparently. So, this isn't the first time that something like
this has happened. We talked about Porcha Woodruff, the heavily
pregnant black woman who was arrested for a crime she
had nothing to do with because of this technology. Rite
Aid was actually banned from using facial recognition technology for
five years after the FTC found that the store misused
facial recognition to falsely accuse people of theft and harass them,

(47:07):
causing embarrassment, harassment, and other harm, including physically searching an
eleven year old girl in a way that left her
so distraught that her mom had to miss work.

Speaker 3 (47:17):
Oh my god, like awful.

Speaker 1 (47:19):
I honestly can't believe that. It's like, okay, well you
can't use this technology for five years. I feel like,
once your technology has been misused to the point where
you are traumatizing children for no real reason, you should
lose those privileges forever.

Speaker 3 (47:30):
In my book, yeah, I there.

Speaker 4 (47:33):
Well, so there's something already so dystopian about living in
this like hyper police state that we're all in, where
it's just like we're constantly, constantly being monitored, and there
is another level of dystopia to that, that it's like,
it's not

Speaker 3 (47:47):
Even like, oh, you can't be doing any crime.

Speaker 4 (47:50):
You gotta be on your best behavior at all times because
you're always being watched. Now it's just like, you never know,
and like especially if you are a person of color,
if you're a black person. Yeah, it's just there's just
like another level of dystopia.

Speaker 1 (48:04):
And doing it in service of, like, protecting merchandise and
keeping the wheels of capitalism greased. Like, it's

Speaker 4 (48:12):
Just protecting Macy's and the song glass is my like,
are you kidding me?

Speaker 1 (48:17):
It's just so, it's just so, like, it's just so sad.
We deserve, we deserve better. It's just so sad.
And to your point about like, we know that this
kind of technology disproportionately impacts women, people of color, black people,
black women especially. However, I should tell you Murphy is white.

(48:38):
There have been six other people that we know of
who have been falsely arrested because of this technology. All
of them have been black. Murphy is the first white
man that we know of this happening to being falsely
arrested because of facial recognition technology.

Speaker 2 (48:50):
And so I don't know.

Speaker 1 (48:51):
Part of me wonders if this is going to be
one of those things where it's like when the harms
were impacting women and black people and people of color,
people were like, eh, whatever, but maybe the harm will
then extend to everybody and then it's like, oh wait,
I could be falsely arrested for this, like hold on.

Speaker 2 (49:09):
So we will have to.

Speaker 1 (49:10):
See. But it definitely is... again, people like, we talked
to Dr. Joy Buolamwini, an AI researcher, on the
podcast a while back. People have been very clear about
the psychic, cosmic, deep threat that this technology poses to
all of us and police departments and retail establishments just

(49:31):
keep using it. It's like those warnings are just going unheard.

Speaker 4 (49:34):
Yeah, absolutely. Like, there's just no logical reason for
this to continue to be used at such a
wide scale. Like, it seems like
it's just causing more harm than, actually, like... and I
don't know what the numbers are for, like, the times
it's accurate or whatever, but I guarantee it's not enough
to kind of justify the opposite. And again, there is

(49:58):
something really weird about the fact that it's like we've
all just kind of given up the ability to have
any sense of privacy in public, which, I guess, I
don't know, privacy in public, whatever, but, like, have any
sense of privacy, you have any sense of just like
being able to exist in the world without having like
eyes on you, metaphorical eyes on you, you know.

Speaker 1 (50:20):
And speaking of folks trying to warn folks in power
of the harm that this kind of technology presents, parents
are actually talking about the threats posed by AI to
young people. A coalition of parents working alongside the group
ParentsTogether wrote a letter to TikTok expressing concerns about
AI generated influencers, asking the platform to clearly label when

(50:42):
an influencer is not real but AI generated, warning that
AI generated influencers can make things like poor body image,
body dysmorphia, and self esteem issues worse.

Speaker 2 (50:51):
So the letter was sent to TikTok.

Speaker 1 (50:53):
CEO, and it reads: TikTok is flooding kids' feeds with
fake computer generated people pretending to be real influencers. AI
generated influencers created by companies to make a profit, post
photos and videos of people who appear to be real
but don't actually exist. These AI generated people do things
like apply makeup on flawless skin and show off perfect bodies,

(51:14):
creating an extreme and utterly unattainable beauty standard. Most of
the millions of kids who encounter these accounts won't know
the people they aspire to look like are not real
people at all. Right now, TikTok relies on companies that
create these accounts to label them as AI but with
thousands of dollars per post on the line, they often
do not. That's why TikTok must proactively and clearly identify

(51:35):
these accounts so young users know which accounts are real
people and which are computer generated. So, the parents, their
concerns are not unfounded: nearly half, forty six percent, of
adolescents aged thirteen to seventeen said that social media platforms
make them feel worse, and TikTok has already been taken
to task for, kind of... I want... maybe allowing is

(51:55):
too strong, but containing content that glorifies disordered eating. Right,
they have rules against that content, but those rules are
pretty easily circumvented, and then they sort of don't enforce them.
So they might be like, oh, like, pro eating disorder
content is disallowed, but then when people easily get around it,
they don't really have a fix for that. And so
right now, TikTok is supposed to clearly label AI generated content,

(52:19):
but ParentsTogether campaign director Shelby Knox says that it's
really kind of not doing that because how that works
now is that the AI virtual influencer will just put
in their bio like hashtag AI influencer, and you would
only know that that influencer was AI generated if you
look at their bio, not on the content itself. Knox

(52:40):
told NBC, I'm not sure that your average kid knows
that a virtual influencer is industry speak for this person
is not real. So, because these companies are making money
on TikTok's platform, and it is contributing to a dangerous culture,
our view is that TikTok has a responsibility to come
in and figure out how to consistently and visibly label

Speaker 2 (52:58):
These accounts and these videos.

Speaker 1 (53:00):
I had never really thought about the threat posed by
these AI influencers. I will say this though. Some of
them really do look real, but they have this like
impossibly conventionally attractive look, like they're skinny to the point
where it's, like, bodies like this is not how a
body looks, or like they're, they're beautiful to a point

(53:22):
that is like unachievable and unrealistic. And it's like, yeah,
because they're not real. And so I'm an adult when
I see content like that, I generally can tell when
it's not a real person. But if you're a little
kid and you're looking for an influencer to look up to,
you might not know.

Speaker 4 (53:36):
So I just looked this up now, because I have
not had this come up on my For You page. I
have had the weird, like, AI kind of NPC things
come up on my page.

Speaker 3 (53:47):
All the time, which are always weird. But yeah, that
is, that is freaky.

Speaker 4 (53:50):
That is not really what I expected. Those are actually, like, uncanny,
like, weird. Like, I could totally, gosh, see, like, a
kid looking at that and thinking it's a real

Speaker 3 (54:00):
Person. That is so creepy.

Speaker 4 (54:04):
And again, like going to the other side of this
issue is like influencing is real work that's often you know,
not taken seriously because it is primarily women that do it.
It is like a lot of the kind of like
are there valid criticisms of influencers? Yes. However, like, yeah,

(54:25):
a lot of a lot of the criticism does also
just come from misogyny. And it's like, you know, again,
the jobs that are always going to be affected first
are kind of the people that are already always
going to be, like, affected worst: sex workers, women,
women of color. And it's like this is another example
of like an industry that is primarily female and is

(54:48):
you know not taken seriously because it is primarily female
being taken over by all these tech bros that are
just gonna make the situation way worse and take take
all the bad things about influencing and make that the
main thing. Yeah, that is so scary. Can we,
can we, can we backtrack on the
whole, like, trying to make fake people thing, and, like,

(55:11):
just make, like... if you want a fake thing to
be your influencer, like, can it be like a cartoon?
Just be, like, a... what's the... like, Space
Jam or something, where we just have, like, a Looney
Tunes character in the real world for no reason.

Speaker 3 (55:23):
Like it's like I don't know, Like I feel like.

Speaker 4 (55:25):
There, there's got to be a better option if you
really don't want to pay a human being.

Speaker 1 (55:30):
Give Lola Bunny a spot, right? Like, it doesn't... yeah,
I guess part of me is like, when
conversations about AI became so ubiquitous, who does
something like... why is it? Why are we starting from
a place of, like, well, AI can write our movies,

Speaker 2 (55:48):
Be our artists, be our creatives, do our influencing?

Speaker 1 (55:52):
Like, who decided that the kind of creative jobs and
creative labor that people aspire to, that those
are the jobs AI was to have? I mean,
I'm sure people who make money, that's who decided it.
It's like, oh we can, we can humans who have
needs cut them out of the equation and just like
pay some tech bro company to make an AI influencer.

Speaker 2 (56:14):
Cool, here's your money.

Speaker 4 (56:16):
Right. And it's also, like, you know, the tech bros
that have been told their entire life that arts and
humanities aren't real, important things, that it's only technology, and
now they're only seeing this kind of stuff as the product
rather than the creation.

Speaker 3 (56:30):
And the products aren't even good. Like, it is... yeah, it is, it is.

Speaker 4 (56:34):
We live in a weird, dystopian capitalist hellscape and it
just gets weirder every day.

Speaker 1 (56:40):
Well, this last story kind of fits into what
you're saying. So, oh no, let's talk about this, because
it's, it is pretty depressing. So Indigenous
artists in Australia say that AI is basically ripping off
their work without permission or credit and using it to
create knockoff fake Indigenous art, sold on online retailers like Etsy,

(57:02):
and it's turning into yet another threat to their livelihoods
and cultures. So these artists, who are already struggling to
compete with the tens of millions of dollars worth of
fake art produced every year by non indigenous artists, now
have to compete with AI. So major shout out to
Cam Wilson at the Australian news site Crikey for coverage

(57:23):
on this. We'll link to Cam's piece in the show
notes, because it's very good.

Speaker 4 (57:26):
I think it's funny that there's an Australian news site
called Crikey. I know this is a very serious and important story.

Speaker 1 (57:37):
Yeah, don't let the name Crikey fool you. This is
a very serious story. So Wilson says that AI generated
Indigenous art is appearing on online marketplaces where art and
derivative products are being sold, and that those AI generated
fakes are often directly competing with the work from real
Indigenous artists, even though platforms often have policies that are

(57:58):
supposed to protect Indigenous culture from exactly this kind of thing.
So Adobe and Shutterstock run popular stock images websites where
people can buy AI generated images for a variety of
commercial purposes, and on both of those platforms there are
dozens of fake Indigenous art style images. The images do
have a label that they have been created using AI,

(58:21):
but they're often like vaguely listed as being Aboriginal or
Indigenous art. On Adobe, people can upload AI images for
sale and then earn a cut of the money every
time those images are purchased. Now the company is meant
to only allow users to submit images that they own
the intellectual property of. But it's not really clear like
how this is being policed. Platforms like Etsy and eBay

(58:44):
are filled with cheap digital AI generated prints intended to
be printed and framed, and some of these online platforms
do have policies in place to prevent this kind of
thing from happening, but it's like not clear how or
why it's not really being enforced very well, so we'll
see. Crikey spoke to a spokesperson at Adobe.
They didn't answer questions about whether AI produced indigenous art

(59:06):
violates their policies, but instead gave kind of a.

Speaker 2 (59:09):
General blah blah blah, we care statement.

Speaker 1 (59:11):
They said, we are continually auditing, evaluating, and improving the
Adobe stock collections to serve our customers needs. That's like,
you may as well have said nothing at that point,
Like that's a statement that says nothing. eBay's policies say
that sellers must not list, sell or promote materials, products,
or services that use indigenous culture and intellectual property in
unauthorized ways, according to their spokesperson, but when pressed on

(59:35):
whether or not AI generated art would violate this policy,
they said, how an artwork is created is irrelevant, and
that a listing using indigenous art in an unauthorized way
may breach that policy.

Speaker 2 (59:46):
Etsy

Speaker 1 (59:46):
Has a policy that prohibits users from selling items falsely
listed as being produced by Indigenous people, but only in
North America and does allow quote indigenous style products by
non indigenous people. So yeah, it just sounds like people
are stealing from indigenous artists. AI is being trained on

(01:00:07):
artwork and cultural work that has been stolen from Indigenous artists,
and then that art that fake artwork is being used
to undercut actual Indigenous artists who already have to compete
with non AI generated fake indigenous artwork a complete mess.

Speaker 3 (01:00:24):
Yeah, it's really just.

Speaker 4 (01:00:28):
Technology, you know, the latest technology aiding further colonization and
colonial kind of ideas, and that is really messed up.

Speaker 3 (01:00:39):
Also, Like again I.

Speaker 4 (01:00:41):
Was surprised to hear that Etsy prohibits users from
selling items that are falsely listed as being produced by
Indigenous people, because again, yeah,
I feel like there's a lot of ways people get
around that, Like there's a lot of weird especially like
in within like New Age kind of stuff, like a
lot of very coded words that, you know, people use

(01:01:04):
to kind of ignore the fact that they are just
culturally or like appropriating cultures that aren't their own or
that aren't you know, being made by the people that
it's claiming to come from and all that.

Speaker 3 (01:01:19):
But yeah, that is that is depressing.

Speaker 1 (01:01:22):
Yeah, it's and it's just like another way, just like
you said, another way that AI is being used to
further oppress people who are already oppressed and already face
so much, right, Like it's just such a grim use
of technology to continue to cut people out of their

(01:01:42):
own culture so that other people can profit off of it.

Speaker 4 (01:01:46):
Yeah, right, and kind of like what I was saying
with the other story too, where it's like, I
feel like a lot of this is coming from these
companies that don't understand that, like, the
value of art, part of it is the
actual process of making the art and the culture
that it is coming from, and like, art is culture,
art is a part of who we are. It's

(01:02:07):
so like, it's such a closed minded kind of, honestly,
it is a sad kind of view, like
I feel sad for these people if your only view
of art is like, oh, well, like but computer can
do it faster and cheaper, and like that's what people
want definitely, Like that's just that's just sad at a
certain point, like I truly like you must not have

(01:02:31):
any Like do you feel.

Speaker 3 (01:02:33):
Joy at all?

Speaker 4 (01:02:34):
Damn?

Speaker 3 (01:02:34):
Okay, I don't know.

Speaker 1 (01:02:36):
And like I mean, this was the thing that struck
me over the conversations around like screenwriters and actors and
AI over the Hollywood strikes. It's like, if you value
something so little, why do you want to be in
charge of it? Why do you want to be a
part of it? If you cannot see what makes this
stuff great and special and what brings people and makes

(01:02:57):
them feel connected to it, why do you even want
to Why do you even want it? Why do you
even want to be part of it? It's like you
can't see that, then what are you doing? You clearly
don't get it. It's like another way that like rich people.
We talked about this in a previous episode. I can't
remember which one. My brain is mush from being sick.
But rich people, they use their capital to

(01:03:18):
acquire things and then they hate or don't understand those things.
They buy things and then they hate those things. And
then ruin them because of that. It's just such a,
it was the Pitchfork thing, such a messed up dynamic.

Speaker 3 (01:03:28):
Yeah, the Pitchfork story, right, last week? Yeah, yeah,
because it absolutely was. Yeah, I like it is.

Speaker 4 (01:03:35):
So it's so weird that like the people that are
ruining the world and, like, the livelihoods of all of
these people are like the people that also it's like.

Speaker 3 (01:03:44):
What are you doing? Like seriously, like where do you how?
How is this how why you've chosen to live your life?
Like do you not? Yeah?

Speaker 4 (01:03:54):
Do you not derive joy from anything? If this is
how you're viewing the world?

Speaker 3 (01:03:58):
It is. It is really weird. It is. But you know,
line go up brain, you know, line go up.

Speaker 2 (01:04:06):
Yeah? And this is a real issue.

Speaker 1 (01:04:09):
I mean, Indigenous artists already have a hard time
competing with people who create fake art and then try
to pawn that art off as Indigenous, even without AI being
part of the conversation. A twenty twenty two Productivity Commission
report found that seventy five percent of Indigenous style goods
were created by non Indigenous people. Online art businesses were

(01:04:29):
particularly bad. One stock image site analyzed by the Commission had
eighty percent of its Indigenous style images authored by non
indigenous people. Similarly, sixty percent of listings on a print
on demand marketplace were also produced by non indigenous creatives.
Crikey spoke to this Indigenous artist Amy Allerton, who said
that this is really offensive because oftentimes the art will

(01:04:52):
just be like a random mishmash of designs, and it
will, like, mix or mess up very distinctive and
special styles from different indigenous nations and artists and just
sort of like mush them together. Allerton said, it's a
very colonial mindset that they are entitled to the entirety
of us. Indigenous people don't have the power for self

(01:05:13):
determination that we want. This adds to the weight of
all of that. It's like making me redundant. I think
about myself. If I were made redundant, that would be devastating.

Speaker 2 (01:05:23):
And yeah, I just.

Speaker 1 (01:05:24):
Really think that, like, does everything have to be a
soulless cash grab? Does everything have to be about that?

Speaker 2 (01:05:32):
Even art? It just that's not the world I want
to live.

Speaker 3 (01:05:34):
In, absolutely.

Speaker 1 (01:05:36):
But the world I do want to live in is
one where I get to round up the news
with you, Joey. So thank you so much for being
here. What do you got going on
that folks should know about? Maybe the answer is nothing,
but maybe, maybe you want to plug something.

Speaker 3 (01:05:51):
Thank you, Bridget.

Speaker 4 (01:05:53):
It was lovely as always, even as the dystopian chaos continues.
If you want to hear more things that I'm working on,
you should check out Afterlives: The Layleen Polanco Story,
hopefully some new projects along the way. I should be
on Stuff Mom Never Told You soon talking about more

(01:06:16):
TikTok chaos and this state of TikTok at the moment,
which will be a fun time.

Speaker 3 (01:06:23):
So TBD on that.

Speaker 4 (01:06:24):
But yeah, if you want to follow me as always,
you can follow me on Twitter or Instagram at
Pat not Pratt.

Speaker 3 (01:06:32):
It's p a t n o t p r
a t t.

Speaker 1 (01:06:35):
The first time I had you on the show, Joey,
I think I credited you as Joey Pratt and you
were like, it's actually Pat not Pratt.

Speaker 2 (01:06:44):
That's actually why it's my social.

Speaker 4 (01:06:47):
It has been my social since I was sixteen, and
it was because I've had, my entire life,

Speaker 3 (01:06:52):
People calling me.

Speaker 4 (01:06:55):
My last name Pratt, which is an understandable mistake, but
that is not in fact my last name.

Speaker 1 (01:07:02):
So Pat not Pratt. Thank you for being here, and
thanks to all of you for listening. If you want
more ad free content, check out our Patreon at patreon
dot com slash tangoti. See you on the Internet. If
you're looking for ways to support the show, check out
our merch store at tangoti dot com slash store. Got

(01:07:24):
a story about an interesting thing in tech, or just
want to say hi, you can reach us at hello
at tangoti dot com. You can also find transcripts for
today's episode.

Speaker 2 (01:07:31):
At tangoti dot com.

Speaker 1 (01:07:32):
There Are No Girls on the Internet was created by
me, Bridget Todd. It's a production of iHeartRadio and Unbossed Creative,
edited by Joey Pat. Jonathan Strickland is our executive producer.
Tari Harrison is our producer and sound engineer. Michael Almada
is our contributing producer. I'm your host, Bridget Todd. If
you want to help us grow, rate and review us
on Apple Podcasts. For more podcasts from iHeartRadio, check out

(01:07:53):
the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
