
July 30, 2025 28 mins

Bridget Todd talks with Garrison Davis about how TikTok creators are using AI generated videos to make viral racist skits with digital blackface.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Cool Zone Media. Welcome to It Could Happen Here, a
show about things falling apart, and today the thing falling
apart is the Internet. And today we have a special
guest episode with Bridget Todd. Hello, Bridget.

Speaker 2 (00:21):
So, Garrison, it's kind of funny that we are talking just a few
days after the Trump administration put out their Woke AI
executive order.

Speaker 1 (00:30):
Yes, I have not read this yet. I have to
for next week's Executive Disorder. I'm not looking forward to it.

Speaker 2 (00:36):
I like that the Cool Zone team kind of sections
off all the Trump federal nonsense so you don't have
to be mired in it all the goddamn time.

Speaker 1 (00:46):
I still kind of am. I just schedule it throughout
my week. I guess there's certain days where I have
to do it.

Speaker 3 (00:52):
Yeah, you gotta pepper it in. You gotta pepper it in.

Speaker 2 (00:54):
Well, yeah, not to give you a spoiler for when
you dive into it yourself, but it's all nonsense. Basically,
the Trump administration is saying that right now, the biggest
threat regarding AI is it being too woke and essentially
telling folks who make AI, tech leaders, essentially to be
more like Elon Musk and Grok and make sure that

Speaker 3 (01:16):
Your AI models.

Speaker 2 (01:17):
The only AI models that we will accept in this
country are the non woke ones, ones that don't incorporate
DEI. I would love to know more about what he thinks
that means, but that's a little preview for you.

Speaker 1 (01:28):
Fantastic, you know, seems like the most important issue facing
our nation right now.

Speaker 3 (01:32):
Definitely, definitely.

Speaker 2 (01:34):
And so it's funny that we're talking about AI because
I don't know if you're on TikTok, but there have
been these kind of shockingly racist AI generated videos all
over TikTok, to the point where I would say that
we are witnessing the revival of the minstrel show using
AI on social media. This is not a claim I

(01:56):
make lightly. That is how extreme some of this content is.

Speaker 1 (02:00):
I'm not on TikTok, but I think I've seen some
of this content permeate across platforms, certainly on like Instagram
Reels and even bits of X, the Everything App.

Speaker 3 (02:13):
I love that you call it that. That's the full name.

Speaker 2 (02:18):
So for folks who don't know, I want to ground
the conversation in what a minstrel show is. So the
minstrel show was an incredibly popular form of American theater
and entertainment in the nineteenth century, where mostly, but not all,
white performers would wear blackface makeup to make themselves
look like these exaggerated racist versions of black people and

(02:39):
essentially portray very racist stereotypes of black folks being lazy buffoons.
And a common trope in these skits was black people
trying and failing to gain American citizenship because at the time,
black Americans did not have full citizenship, and so a
big plotline would be like, Oh, we had to take
a test for citizenship, but we were too stupid to

(03:01):
figure it out, or we spaced the date and overslept
because we're very lazy. When these shows would depict black women,
we were often shown as what you might think of
as like a Sapphire caricature, which is rude, loud, malicious, stubborn,
and overbearing, kind of like the angry black woman trope
that you probably are familiar with in media today. So
these skits were incredibly popular entertainment, but they also served

(03:26):
the purpose of reaffirming political and social ideologies, and so
you know, the dominant way that people consumed media regarding
black people showed us as lazy, stupid, angry, loud, and
importantly not really able to conform to the dominant culture
of like mainstream, hardworking white Americans.

Speaker 3 (03:45):
That is obviously an incredibly.

Speaker 2 (03:46):
Powerful tool to uphold and reaffirm the idea that black
folks should not be given full citizenship, should not be
given full rights, cannot be you know, integrated into polite
white society. And it almost kind of became this for
their own good attitude that provided like a polite justification
for things like segregation. Well like, oh, well, you know,

(04:08):
I've seen in minstrel shows that black folks are very
lazy and stupid, so it's for their own
good that we treat them like shit in society, do
you feel me?

Speaker 3 (04:16):
Yeah, yeah, yeah.

Speaker 1 (04:17):
It's a sort of like infantilization exactly.

Speaker 2 (04:21):
And so even though the minstrel show did die out,
I would argue that we are kind of seeing a
little bit of a comeback using AI in the digital realm,
and just like the minstrel shows of

Speaker 3 (04:32):
Yesteryear were used to affirm

Speaker 2 (04:34):
Political and social ideologies under the guise of just being
entertainment or just being jokes or just being funny. I
really think it's not a coincidence that we're also seeing
the rise of digital blackface, where non black creators are
using AI to create these viral racist skits that are
steeped in black stereotypes, and that they're really taking off

(04:55):
all over social media today.

Speaker 1 (04:56):
That sounds not fun to hear about, but I'm excited
for you to explain it to me.

Speaker 2 (05:02):
Yes, So I will say, initially, the first iteration of
one of these videos that I saw was not really racist.
It was made by a black creator, I think, trying
to use AI to create sort of humorous skits. But
when that first video took off, people on TikTok started
using AI to create more and more extreme, more and

(05:22):
more racist iterations of these kinds of videos, which is
what we're seeing today. So I will play a little
snippet of an example for you.

Speaker 3 (05:30):
What's up, bitch?

Speaker 2 (05:31):
Is this Bigfoot one hand the baddest bitch in the woods?

Speaker 3 (05:33):
Part time cryptid, full time problem. Don't follow me if
you scared, please.

Speaker 2 (05:37):
So this is a TikTok that got over two million views,
and it basically uses AI to generate this stereotypical black
woman version of Bigfoot, and this account is so
popular that it has generated so many copycats, like this is
a format that has really hit with TikTok.

Speaker 3 (05:58):
There also is another kind of bucket of.

Speaker 2 (06:00):
These that people call slave talk, where it uses AI
to sort of reimagine enslaved people on plantations as if they
had social media and were doing vlogs. And so a
lot of those videos were taken down by TikTok, which
I think is good, but essentially it would reimagine these
AI generated enslaved people basically saying like, oh, well, yeah,

(06:23):
I do have to work out here in the cotton fields,
but at least I'm gonna get meals. At least I
have a roof over my head, essentially really affirming the
idea that, like, slavery wasn't that bad.

Speaker 3 (06:34):
One of the more heinous.

Speaker 2 (06:35):
Examples that I saw of these that was removed from
TikTok was a TikTok Shop sponsored video that showed an
AI generated enslaved person working in the fields wearing a
solar powered hat with a fan in it, and basically
he was like, Oh, this work in the field would
be so horrible if I did not have this hat.
And then there's a little link to the TikTok shop

(06:57):
and you can buy the actual hat, which is.

Speaker 3 (06:59):
Just some really dystopian awful shit.

Speaker 1 (07:02):
No, that is like quite literally it's like evocative of
like cyberpunk tropes that people I would assume would not
want to use due to fears of insensitivity. But it's
just on your phone like as like a real thing.

Speaker 3 (07:15):
Yeah, I completely agree, and I love that comparison.

Speaker 2 (07:18):
And I think, like I would imagine if I were
running a TikTok shop that using the AI generated image
of an enslaved person, I would think like, oh, well,
this is certainly not something that I would use to
like sell some cheap fan hat.

Speaker 3 (07:32):
But I mean, I think it is exactly what you're
saying that.

Speaker 2 (07:35):
I think that the extreme quality of these videos, people
are just like, well, it'll get views and then I'll
get more eyeballs on my TikTok shop.

Speaker 3 (07:44):
I don't think there's any kind.

Speaker 1 (07:45):
Sure. Yeah, no, it's a very gross way of
doing like outrage farming for engagement. I guess, like because
like surely they know that these are not going to
like go over easy. Like I think part of
this is generating some degree of like attention
based on it being offensive or extremely gross and knowing

(08:05):
that people will like comment things of that nature.

Speaker 3 (08:09):
Exactly.

Speaker 2 (08:10):
And it's funny that you mentioned that, because the AI
component of this is sort of what makes this novel
and new. But that kind of thing has been all
over our social media for the longest time. Sure, I
remember how big stuff like skit culture was on TikTok.
And I don't mean skits like Saturday Night Live or Portlandia.
I mean skits where they are trying to get you

(08:30):
to think this is somebody's cell phone footage of something
that happened, but really it's like, well that.

Speaker 3 (08:35):
Those are two actors.

Speaker 2 (08:36):
And there was a type of these skits that would
really take off on TikTok, where it was purporting to
be oh, this is a parent who is going off
on a trans teacher for trying to indoctrinate their kid,
and all the comments would be like good for them,
good for that mom, And then the screen flips and
it's like, oh, well, the woman you were just telling
me was the trans teacher, now she's the mom in

Speaker 3 (08:57):
The next video, yes, exactly.

Speaker 1 (09:01):
No, I like the ones that are set on airplanes
where they all use the same airplane set, Yes, and
they get into like fake fights on airplanes using the
same like five actors playing different roles.

Speaker 3 (09:12):
Yeah, and then if.

Speaker 2 (09:13):
You look carefully in the background, you start thinking, well,
airplanes don't have those strip LED lights that you can
buy on Amazon.

Speaker 3 (09:21):
Those are actually the TikTok lights, and the hallways are like
five feet wide?

Speaker 2 (09:26):
Yeah, exactly, And listen, I am not above getting taken
in by those kinds of skits. And I guess I
don't love the idea that someone would be dedicating energy
and brain space to getting upset about a set of
circumstances that never really happened.

Speaker 1 (09:42):
But it's the Internet. Come on, that's that's like, that's
half of the Internet. Yes, you know, I don't love it.

Speaker 2 (09:49):
But when the stakes so like when the stakes are
low and it's just like a random fight on an airplane, fine,
when the stakes are higher and it's like, this is
a skit meant to like attack or demonize trans people,
queer people, black people, that's where I'm like, well, what
are we really doing here?

Speaker 3 (10:14):
I think whether or not this.

Speaker 2 (10:16):
Kind of content, like when it's AI generated, we're looking
at things that never actually happened, even though these
circumstances and these situations never really happened, they still very
much affirm the worldview of the people who are consuming it, right,
And so if you are consuming a skit involving whether
it's human actors or AI generated black people, if that

(10:37):
skit reaffirms your worldview that these people cannot be trusted,
these people are bad in some way, it kind of
doesn't matter if it's real.

Speaker 3 (10:45):
Or not, you know what I'm saying. Yeah, yeah, totally.

Speaker 1 (10:48):
That's like the concept of like hyperreality, where you're trying
to like blend the Internet's exaggerated version of reality with
our physical lived existence, and how these things start combining
into each other to create this idea of reality
in our heads that's more real than it actually is,
to the point where we take things on the screen
to be more accurately reflective of what's going on in

(11:08):
the world than what we actually experience in our day
to day lives. And so much of that concept is
what drives like American like reactionary politics exactly.

Speaker 2 (11:17):
And when you actually go into the comments of these videos,
which in my opinion are very clearly AI generated, people
don't even comment...

Speaker 1 (11:25):
Mean, well, I mean that easy for you to say
someone who spends their time like researching what's going on
on the internet.

Speaker 3 (11:34):
I'm not sure if Mema and Pop are finding.

Speaker 1 (11:37):
These videos, they're gonna be like, well, this one's obviously
AI generated.

Speaker 2 (11:41):
No, And that's my point is like, I don't even
think they're thinking about it that way, and I don't
think they care that it's not real. In the comments
of these videos, it'll be a video, an AI generated
video of a black woman behaving in this very stereotypical
racist way, and the comments will say they're all like that,
and it kind of misses the point of like, well, there's.

Speaker 3 (12:00):
No they in this video because it's AI generated. This
is just a computer puppet. This isn't real. Like, yeah,
I completely agree.

Speaker 2 (12:08):
But I think when you see something online, whether it's
obviously AI generated or not, if it reaffirms your worldview,
it kind of doesn't matter.

Speaker 3 (12:16):
It's the same reason why when there's.

Speaker 2 (12:18):
Like four legged veterans in AI slop holding a sign
that says everyone forgot about me, wish me happy birthday.

Speaker 3 (12:26):
Three billion likes on Facebook.

Speaker 2 (12:30):
I mean, what do you think is going on there?
I find that so fascinating.

Speaker 1 (12:34):
Oh, I mean, I'm not a psychologist, but I don't know.
I think it isn't just the simple reaffirming of someone's
previously held view that people are very receptive to. And we
even see this with like you know, with like fake
news headlines, right, and people might point out that this
story isn't actually real. And when people are confronted
with this idea that they've been tricked by unreality,

(12:56):
they'll be like, no, maybe this one isn't real, but
it could be real. And that's what really matters is
that this feels true, not that it is true,
but the fact that I feel it resonating is actually
more important than any kind of physical trueness out in,
like, the flesh world. Like, honestly,

(13:16):
that matters far less than how it impacts how I
feel and how it reflects the world as I see it.

Speaker 2 (13:22):
So I did an episode of my podcast, There Are No
Girls on the Internet, all about the sort of weird
economy of AI generated disinformation, essentially fan fiction that came
out of the trial of Sean P.

Speaker 3 (13:35):
Diddy Combs. Oh, that sounds incredibly upsetting.

Speaker 2 (13:38):
It was so upsetting, and the reason I looked into
it is because I have to be honest and say,
one of these AI generated videos got me, right. It
was a video that claimed that the late musician Prince
was able to testify in Diddy's trial from beyond
the grave, and that they played a video that Prince
made warning everybody that Diddy is this bad guy.

Speaker 3 (13:59):
Right. I am probably the world's biggest Prince fan.

Speaker 2 (14:02):
I was like, Prince always... like, it got me,
and it

Speaker 3 (14:05):
Totally affirmed what I wanted to be true.

Speaker 2 (14:08):
But it was all a lie.

Speaker 1 (14:09):
It's compelling, it's trying to like, it's trying to impact
you emotionally, especially for people who like Prince, who
miss Prince. This could be emotionally compelling, and like,
that's that's what they're like intentionally going after. I think
that's that's why something like that could work so well.

Speaker 3 (14:24):
It got me.

Speaker 2 (14:25):
And when I looked into kind of how these videos
are cranked out on YouTube, so basically any celebrity that
you can imagine there is an AI generated video on
YouTube saying that they were somehow involved in the Diddy trial.
And what's so interesting is in the comments of these
videos that are again pretty obviously AI generated or not real,

(14:45):
and even the description of the YouTube account will say
this is just for entertainment. Nothing here is supposed to
be true. People won't read that part. Basically, if you've
ever had a bad feeling about a celebrity, which, who
hasn't, totally, see, there's a video for that
worldview that is like, well did you know they were
involved in the Diddy freak offs?

Speaker 3 (15:03):
And everybody's like, I knew it.

Speaker 1 (15:05):
That person always gave me the ick. Fine, I
knew it. I was smart enough to pick it up.
Not everyone else was smart enough, but I was. And
that's a whole other emotional feeling that's being
targeted by these like AI slop creators, where they're trying
to like affirm people's narcissism about their
ability to judge the moral character of strangers.

Speaker 2 (15:26):
That is so it because the people, the celebrities they choose,
it's people that maybe you would have like, I have
no real reason for this, but I hate Kevin Hart
and so in the videos. Don't even ask me why.
I don't even have a real reason. I just don't
like him. Well, he is short. He is short, there
you go, love to my short kings. One of the
reasons I don't like him. This is just me speculating,

(15:46):
like he just does a lot of ads and you
can't get on social media without his cryptocurrency ad, his
DraftKings ad.

Speaker 3 (15:52):
I just like hate seeing it. Short, yeah.

Speaker 2 (15:55):
In the AI generated video claiming that he was mixed
up in the Diddy trial, every comment's like, I knew it,
I always hated him. And that's affirming people like feeling
like they knew something that other people didn't see, and
they knew it early on.

Speaker 1 (16:08):
Well, And I think what's something that's similar to this
that's happening right now is there's a massive media campaign
right now against Pedro Pascal, with AI generated videos
of him like touching his female co stars, and these
videos have been digitally altered, and it's
in service of this big harassment campaign against someone
who's like very vocally pro-trans rights. There's other possible reasons

(16:32):
for why he's being targeted by these videos, but no, similarly,
it's trying to create this like ick around Pedro Pascal
using AI altered media, and it's gaining a lot
of traction right now, and it's something that people need
to be like very, very cautious of. But yeah, it's
trying to affirm whatever. Maybe you, for some reason, have
never liked Pedro Pascal. I can't imagine why. But if

(16:52):
you find a video like this talking about how
he's using a social anxiety diagnosis to inappropriately touch his.

Speaker 3 (16:59):
Co stars, like I knew it. I knew it.

Speaker 1 (17:01):
I never trusted Pedro Pascal, and I don't like that
he's pro trans rights. And you're like, there you go.
They've completely got you. They've been able to like automate
and monetize internet hate campaigns against people that you don't know.

Speaker 2 (17:15):
Garrison, literally, right before you and I got on this episode,
I saw a video on Reddit, and it's
a scene from an episode of Always Sunny where one
of the guys is like essentially lifting the female
lead up by her crotch, and the caption was, Pedro
Pascal when he feels anxiety next to a female
co star. And I remember thinking, like, this is

(17:36):
such a weird fucking video. But what corner of the
Internet have I wandered into?

Speaker 3 (17:40):
But I didn't... I did not know that there

Speaker 2 (17:42):
Are horses trying to make me get the ick about
Pedro Pascal Coincidentally, he is someone who speaks up for
LGBTQ wrights, you know, progressive causes.

Speaker 3 (17:52):
Of course.

Speaker 1 (17:53):
Yeah, no, it's a
huge thing sweeping the internet right now.

Speaker 2 (17:57):
And I think it really goes to show how
kind of easily we can be manipulated using digital content,
whether it's AI generated or AI manipulated or not. Like
our understandings of the sort of general temperature of what's
going on are so much more tenuous than we think,
and so much more easily manipulated than we realize.

Speaker 1 (18:17):
No, absolutely, no one is immune to propaganda. That is
a great way of putting it.

Speaker 2 (18:32):
I'm happy that you used the word propaganda, because that's
what I really do think these AI generated, essentially minstrel
show videos are. I think it's not a surprise that
we are seeing them the same way that back in
the day, minstrel shows were very popular at a time
when there was an active campaign of attacking black folks

(18:55):
and saying they weren't smart enough and did not deserve
full citizenship, did not deserve rights. All of that I
think we're basically seeing the same thing today. I think
the rise of popularity of this kind of content is
against the backdrop of a very real attack on marginalized
people from this administration. You know, there was just this
piece in ProPublica about how Trump and Musk, their

(19:17):
DOGE stuff really was an attack on black women specifically,
like black women with stable federal jobs totally, and that
these attacks essentially it was like you were able to
smear black women career civil servants as you know, they
were DEI hires, they were undeserving of these jobs, they
really just deserved to be fired. And you know, really

(19:38):
black women just became these easy targets for an administration
hostile to marginalized people. So if we have all of
that happening against the rise of this form of digital
media that is using AI to reaffirm these stereotypes about
black women that we aren't able to behave ourselves in
polite society, cannot figure out a way to solve conflicts
without resorting to violence, are loud and obnoxious. Then when

(20:00):
you hear about real life human black women getting pushed
out of their employment or attacked by this administration, you
might think, well, maybe it's for the best, because they're
not suited for that work anyway, because of the kind of
content that I have been consuming on TikTok. And I
think it just reaffirms this worldview that real
life human black folks are not self actualized human beings.

(20:21):
We're just a collection of tropes and stereotypes and caricatures.

Speaker 3 (20:25):
I don't know what to say there, but I agree, yes. And

Speaker 2 (20:29):
I do think there's a kind of platform accountability question in

Speaker 3 (20:34):
All this because, oh, most certainly, yeah. Like, the

Speaker 2 (20:37):
Reason why we're seeing the rise of these videos is
because of the recent introduction of Google's Veo 3.

Speaker 3 (20:44):
It came out about a.

Speaker 2 (20:44):
Month ago and it's Google's latest AI video generation model,
and essentially it's designed to create these realistic looking videos
from text prompts. And the thing that kind of makes
it a step above is that you can incorporate things
like synchronized audio, dialogue, sound effects, music. It has
really taken off with creators online who are using this
tool to create everything from

Speaker 3 (21:06):
These AI skits to AI influencers to AI mukbangs,

Speaker 2 (21:09):
You know, where people eat tons and tons of food.

Speaker 3 (21:12):
Oh, this is so upsetting it is.

Speaker 2 (21:15):
And then like another kind of offshoot of this is
you have people who use Veo 3 to make content
like this, and they get tons of views, and then
they're like, oh, if you want to learn how to
make this yourself, pay me and I'll teach you how
to do it too. So it's like there's always a
weird like MLM grift in there somewhere.

Speaker 1 (21:35):
That is the content creator classic: like a mid tier influencer
who's not like that good at what they do, but
is able to supplement their income by offering courses to
people to teach them how to make similarly subpar content.
And it's interesting that we've reached the full AI automation
aspect of this, right? This used to be a
big thing among like YouTubers. I was not aware that

(21:57):
this is now a thing among like AI TikTok influencers,
But that makes sense because this is like the easiest
thing to automate, So of course there's going to be
like an influx of people trying to make a quick
buck on racist AI slop.

Speaker 2 (22:11):
It makes me so sad. And I do think, I
mean, I guess I would be curious how Google
feels about the fact that like this is what their

Speaker 3 (22:22):
Tool is being used for.

Speaker 2 (22:23):
Right, I wonder like if leaders have a sense that
this is harmful, not just harmful to black women like
me who are depicted in this kind of content, but
harmful for the Internet as a whole. It makes the
Internet experience worse for everybody. And I guess
I would imagine that like Google probably doesn't care that
this is what their technology is being used for.

(22:43):
Like if I had a direct line to Sundar Pichai,
the head of Google, I would show him these clips
and say, like, is this what you had in mind
for Veo 3, or is this a misuse of this
tool that you just put out and unleashed on all
of us?

Speaker 1 (22:55):
Yeah, And are you going to dedicate some like millions
of dollars of research into stopping this from happening? No,
of course not, Like they're not going to build comprehensive
tools that prevent platform abuse like this, Like that's not
going to happen as long as people are using it,
and then people are hearing about it and it's spreading,
Like, that's what they want. If there happen to
be offensive use cases of it, if anything, that's good

(23:17):
because that drives engagement. It gets people to know about
the product.

Speaker 2 (23:19):
And I think that's another one of the reasons why
Trump's you know, executive orders on AI.

Speaker 3 (23:26):
That we saw earlier.

Speaker 2 (23:28):
I mean, like, I will be the first person to
admit that we have very deep problems when it comes
to AI. Anybody who listens to Better Offline knows this, Like,
this is not a secret.

Speaker 3 (23:39):
AI is often biased.

Speaker 2 (23:40):
AI is often wrong because it is trained on us
humans, the biased little fucks that we are, right, and
so that shouldn't be a surprise to anybody. I also
will say, like, some of the solutions of how we
fix that are complex and not super simple.

Speaker 3 (23:54):
But with Trump's executive order, he basically is.

Speaker 2 (23:56):
Signing an order saying all AI must be objective, it
must adhere to the objective truth.

Speaker 3 (24:03):
Of the United States. And it's like, well, who determines that?

Speaker 1 (24:05):
Who determines the objective truth of the United States?

Speaker 3 (24:10):
The President?

Speaker 2 (24:11):
I mean, if you ask Trump, yes, him, And I
guess that's the thing that pisses me off is that
there actually are complex issues and problems when it comes
to AI. But this executive order just is like, oh,
the problem is that it's woke. The solution is
me signing an executive order saying no woke in AI,
and rather than getting any kind of actual solution or

(24:34):
having the conversation, we just get fucking nonsense.

Speaker 1 (24:37):
You know, it is worrying on multiple levels, including the
fact that the president thinks he's the arbiter of objective
truth and thinks he can legislate that, or thinks
he can executive order that into being by either you know,
benefiting or punishing tech companies who follow his policies.

Speaker 2 (24:56):
Yeah, I mean a spoiler alert for that executive order.
That's exactly what he's saying. And you know you used
the word propaganda earlier, and that really is if there
was like a thesis statement of what I wanted to
say in this episode, is that that is exactly what I.

Speaker 3 (25:10):
Think is going on here.

Speaker 2 (25:11):
It really does remind me of minstrel shows because even
though minstrel shows back in the nineteenth century were this
popular form of entertainment, it also was an entire manufacturing
enterprise where people made very good money selling racist blackface
figurines as novelties and all of that. David Pilgrim, the
founder of the Jim Crow Museum of Racist Memorabilia at
Ferris State University in Michigan, put it like this: They

(25:34):
were everyday objects which portrayed black people as ugly, different,
and fun to laugh at. They were, in a word, propaganda.
And I think that's exactly what's going on here. Like
people like to think about racism as if it's just
this thing that hangs in the air, as opposed to
a system that specific people are personally and intentionally perpetuating
because they are cashing in on it.

Speaker 3 (25:54):
I don't see how Google letting.

Speaker 2 (25:55):
Creators use their tools to create content like this is
any different. Like, yeah, that is exactly what's going
on in my

Speaker 1 (26:02):
Book, that's flatly like that's just like one to one,
Like, you're using tech to create like unreal depictions of
racist caricatures, to please audiences, to reaffirm their
own biases, to reaffirm their own racism, and you're monetizing
it and you're automating it to create hashtag viral moments.
Like, it's the most explicit and like gross, blatant

(26:26):
form of this that I've like seen. Like, I think
Robert a few years ago reported on people using AI
to like make, you know, true crime videos,
like animating victims of crimes or
murder victims and talking about how they were
killed or something, which is very gross and very, very disgusting.
But this sort of like organized, like, racist video

(26:47):
propaganda stuff can lead to a lot more like actual,
like real world damage.

Speaker 2 (26:52):
I completely agree. I mean those true crime videos, I
remember that. Imagine if your kid was murdered and then no.

Speaker 3 (26:59):
It's so gross.

Speaker 2 (27:00):
Twenty years later someone is like, oh, I've made an
AI depiction of your murdered child telling their story.

Speaker 1 (27:05):
No, yeah, it's evil. But I think the damage
that can do is kind of limited. The damage
that this whole altered reality, where racism can get affirmed,
leads to, I think, a lot more actual, like, political
and personal consequences.

Speaker 3 (27:20):
Completely agree.

Speaker 2 (27:21):
And I also think just taking a step back in
the conversation about AI, we're all being told how the
proliferation of AI is going to be the lynchpin of
our economy. It's so important, it's going to change everything,
and then you actually look at some of these use
cases that are taking off, and it's like, well, was
this really worth all the fucking climate degradation to make
this racist AI version of a Bigfoot that looks like

(27:43):
a black woman?

Speaker 1 (27:44):
No more rainforest, but at least we get racist Bigfoot.
So, oh my god... Well, I think
that's a good place to end. Thank you so much
for letting me rant at you about this. I really
appreciate it.

Speaker 3 (27:56):
Where else can people find your work, Bridget? Well,

Speaker 2 (27:58):
You can listen to my podcast There Are No Girls
on the Internet. You can listen to my other podcast
with Mozilla Foundation about ethics in AI called IRL, and.

Speaker 3 (28:06):
You can find me on Instagram at BridgetMarieInDC. Fantastic.
It Could Happen Here is a production
of Cool Zone Media.

Speaker 1 (28:18):
For more podcasts from Cool Zone Media, visit our website
coolzonemedia dot com, or check us out on the iHeartRadio app,
Apple Podcasts, or wherever you listen to podcasts. You can
now find sources for It Could Happen Here listed directly
in episode descriptions.

Speaker 3 (28:32):
Thanks for listening.
