All Episodes

March 8, 2024 73 mins

What fake images of Trump with Black voters tell us about AI disinformation: https://www.washingtonpost.com/politics/2024/03/06/what-fake-images-trump-with-black-voters-tell-us-about-ai-disinformation/

A bill that could lead to a TikTok ban is gaining momentum in Congress. Here's what to know: https://www.cbsnews.com/news/tiktok-ban-congress-bill-bytedance-divest/

TikToker and disinfo expert Abbie Richards on why banning TikTok would harm marginalized communities: https://podcasts.apple.com/us/podcast/banning-tiktok-would-hurt-marginalized-communities/id1520715907?i=1000606352152

Elon Musk switched on X calling by default: Here’s how to switch it off: https://techcrunch.com/2024/03/04/elon-musk-x-twitter-calling-privacy-switch-off/

AOC’s Plan to End Deepfake Porn: https://www.rollingstone.com/politics/politics-news/aoc-deepfakes-defiance-act-1234979373/

CBS Sued by ‘SEAL Team’ Scribe Over Alleged Racial Quotas for Hiring Writers: https://www.hollywoodreporter.com/business/business-news/cbs-studios-paramount-reverse-discrimination-lawsuit-racial-quotas-1235842493/

Writers Sound Off About Litigious ‘SEAL Team’ Staffer Who Claims He Lost Gig Because He Was White & Male


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet, as a production
of iHeartRadio and Unbossed Creative. I'm Bridget Todd and this
is There Are No Girls on the Internet. Welcome to
There Are No Girls on the Internet, where we explore the
intersection of identity, social media, technology, and the Internet experience.

Speaker 2 (00:25):
And this is another installment of.

Speaker 1 (00:27):
Our weekly news roundup, where we summarize Internet news that
you might have missed.

Speaker 2 (00:32):
Joey, thank you so much for being here.

Speaker 3 (00:34):
Hey, Bridget, thanks for having me.

Speaker 1 (00:36):
Okay, so I have to start out with a question.
Do you have any tech related pet peeves? You might
have guessed this is like me asking you so that
I can gripe about mine, but I want to hear
yours first.

Speaker 3 (00:48):
Oh, Bridget.

Speaker 4 (00:51):
So many. Why do you think I'm on the show? So okay,
this is like a very small, in the grand scheme
of things, pet peeve. But lately I've been really really
annoyed with the Spotify algorithm. Uh, just because I don't
know if y'all have tried, like they have this new

(01:12):
not new, I guess it's been around for like a minute.
But if you type in like angry mix or like
happy mix, like it'll come up with like a collection
of songs so they have created for you, and it
just like hasn't been good. I've got some really weird
ones, I think especially, I don't know, I think I.

Speaker 3 (01:34):
Maybe it's also just like, like.

Speaker 4 (01:36):
I've listened to a very wide variety of music, so
I feel like a lot of times too, like a
lot of it will get like mixed up together when
I don't like it'll it'll label things in certain genres
that they aren't.

Speaker 3 (01:46):
Just because I've been listening to it a lot.

Speaker 2 (01:47):
I don't know.

Speaker 4 (01:48):
Anyways, I've been very annoyed at the Spotify algorithm lately.
And also, I've realized it has been very
hard to find like new music to listen to because
it just keeps recommending the same artists that I already
listened to, like when I'm looking for new music, which
you know is fun if I like, I'm just trying
to listen to songs that I like. But like, I think,
especially trying to find new stuff now just because of

(02:10):
this new like these mixes that are tailored specifically for you,
it makes it so much harder to find new stuff
which I don't like.

Speaker 3 (02:20):
Yeah, so and I don't know.

Speaker 4 (02:22):
There's so many issues with Spotify just as a company,
and like the way that like the streaming sort of
industry works that, I mean, it's annoying.

Speaker 3 (02:32):
It's very annoying.

Speaker 1 (02:33):
You have like activated something in me because I could
talk about this all day. So I think this is
just I haven't done any research into this because it's
just banter. But I think I know what's happening with
your playlist, which I think that Spotify is probably if
I had to say, deprioritizing human people who make curated

(02:54):
playlists and curated recommendations. Like I know that they used
to have like big teams of humans working at Spotify
that made playlists, and I know that like they are
using AI to do some of that now, so I
wonder if that's had a hand in why the playlist
aren't as good. I also think, like,
when you're looking for new music, you want

(03:15):
when you listen to a song, you want the
next algorithmically generated song to be like not the same
artist or a song that you've heard before, but
spiritually the same.

Speaker 2 (03:26):
Like I know exactly what you mean.

Speaker 1 (03:28):
It's hard to put into words, like I want it
to be like you're looking for something. It's it's very
hard to verbalize, but do you know what I mean?

Speaker 4 (03:35):
Yeah, it's just weird, I think, especially with they keep
doing these these ones that are based on emotion and
it's like you can't have an AI pick out. I
have a screenshot on my phone of one that was
like happy mix, and the artists that they use in
the picture, it's Mitski, and like, like I.

Speaker 3 (03:54):
Listened to a lot of Mitski. That's not necessarily
my happy music.

Speaker 5 (03:58):
I don't think many people would call her happy music,
but like the algorithm was like, oh, this song has like
an up-tempo beat. I don't know what's going
on. I totally, I totally think you're right though. I
think it's a lot of the AI algorithm sort
of stuff regardless of.

Speaker 6 (04:18):
Because I you know what, I have a lot of
friends that make great playlists.

Speaker 5 (04:22):
I think I make a pretty decent playlist. Like there's,
there's clearly something going on with a lot of.

Speaker 3 (04:28):
Like TikTok music too.

Speaker 2 (04:30):
Oh, I've noticed the same thing.

Speaker 1 (04:32):
Something else I've really taken umbrage with on Spotify is
how bad their podcast player is.

Speaker 2 (04:40):
Like I have Spotify Premium.

Speaker 1 (04:42):
I have to say, it is where I listen to
my podcasts, but I am almost in like a toxic
relationship with Spotify's podcast player. You'll try to rewind it
thirty seconds and it's like, oh, did you want to
start playing a totally different podcast?

Speaker 3 (04:56):
Oh, it's so weird.

Speaker 4 (04:58):
Yeah, there's certain shows where they, like, show the ads
differently, like how they, yeah, it's really annoying for,
you know, for a company that has, like, basically monopolized,
I guess a lot of people use Apple Music, like
other platforms, but they've really, like, taken over this market, like.

Speaker 3 (05:16):
They don't operate well.

Speaker 2 (05:18):
And I don't know.

Speaker 4 (05:18):
I keep reading things too, and I think this might
lead into what you're going to talk about, But I
keep reading things too about like so much of our
music and media and stuff that used to have like
analog copies, like now people are collecting. Because I was like,
I can't even think of, like I don't even think
about the option of leaving Spotify because I'm like I
have all of my music. I've had Spotify since I

(05:39):
was like like in high school, Like I have like
all of my music there. I don't have hard copies
of those you know songs that I like. I don't
want to lose all of that.

Speaker 1 (05:50):
Yeah, I have a love hate relationship with analog, So
I'm an analog person. Like because I am a woman
of a certain age, I was really into record collecting
like most like hip music people were in a certain era.
And let me tell you, I've had to I have,
So I have a massive record collection in my house.

(06:11):
I have had to schlep this collection to multiple apartments
when I moved, to multiple cities. And I started
collecting records before streaming was ever a thing, So I
never imagined that I would live in a world where
you could just get every single piece of music that
Devo ever put out at the touch of a fingertip.

(06:32):
So it's like, no, I need to have every vinyl
that they've ever made. That's the only way I'll ever
be able to hear it. And then you have to
schlep them from like New York to San Francisco, from
apartment to apartment.

Speaker 2 (06:42):
Like yeah, but that said.

Speaker 1 (06:44):
Like, as media companies and streaming companies delete stuff, like
I've been hearing more and more about like, oh you
bought this game, well guess what, Like we could actually
just delete it. It kind of feels like in twenty
twenty four, like what does it mean to own something
digitally if you never really own it in a real

(07:04):
sense if the company that you bought it from can
just like take it away. And I don't know, part
of me, especially as Netflix and Hulu like raise their prices,
part of me is like, might have to dust off
that old DVD player because I have a lot of
those too, and that actually it does lead to my
big tech pet peeves that I have like just been.

Speaker 2 (07:27):
Enraged by.

Speaker 1 (07:28):
I was visiting my parents and my mom has got
a new car, so I was driving her car and
it's brand new, but it's super slick, and her car
barely has any buttons in it, like physical pressable, clunky
old school buttons. Everything is done on a touchscreen mounted
on the dashboard pretty much everything. So I'm like driving

(07:49):
and I'm trying to use my hand to you know,
turn up the volume, turn the song, or put on
the heat or whatever, and I'm feeling around expecting there
to be like a touchable button.

Speaker 2 (08:00):
It's all on a touch screen.

Speaker 1 (08:02):
I don't know if you've driven a car like this,
I'm sure there are people listening who are like, oh,
I prefer the touchscreen, but.

Speaker 2 (08:07):
Whatever happened to a good old fashioned button?

Speaker 1 (08:09):
Like what is this assumption that touch screens are automatically
better than a good old fashion clunky button that you press.

Speaker 3 (08:19):
Oh yeah, no, I.

Speaker 4 (08:21):
My mom also has a car with this sort of technology.
And it's the like Tesla effect where everybody's picking up
because I think Tesla was the first one to have
the like just iPad screen in there, and it's awful,
like driving, it is so annoying and like trying to
deal with the control.

Speaker 3 (08:40):
Yeah no, I don't understand the appeal of it.

Speaker 1 (08:43):
So we actually might not be alone in this. Because
this is from Lifewire. The European New Car Assessment
Program, or Euro NCAP, gives safety ratings for cars,
and starting next January, cars will not be able to get
a five out of five rating unless they use buttons,
stalks, or dials for essential safety features. That's what I'm
talking about. I'm sure people listening are like, it's better on.

Speaker 2 (09:06):
A touch screen.

Speaker 1 (09:06):
No, it's not. Buttons were fine. What did buttons ever
do to anybody?

Speaker 3 (09:10):
Exactly? It's the same with everything. It's the same with
like iPhones, Like as soon as they got rid of
the this is my back in, you.

Speaker 7 (09:18):
Know, all the technology these days, but like I feel
like once they got rid of the home screen on
like iPhones or like the little home button home screen
button on iPhones, like that was it.

Speaker 2 (09:28):
That was the end, That was the end of everything.

Speaker 1 (09:31):
Everything went to shit, our whole, our whole culture. I
was just on the podcast, also on Outspoken, Black Fat
and Femme, and one of the questions they
asked was like, what's your, what was your favorite piece of technology?
And it sent me down this rabbit hole of how
much I loved BlackBerry. And part of it was that

(09:51):
it had the clickable button and this like clickable kind
of mouse thingy in the middle, like a circular button,
And I swear to you I would type messages on
my BlackBerry like I thought I was like working for
the Obama White House or something like.

Speaker 2 (10:06):
I took it so seriously.

Speaker 1 (10:08):
I thought I thought I was like, you know, sending
high pressure, high stakes emails.

Speaker 2 (10:13):
Mine.

Speaker 1 (10:14):
I was in my early twenties, so I definitely was
not sending anything of any import. But something about the
clickable button made me feel like I was really doing business.

Speaker 3 (10:22):
Yeah, there was thing.

Speaker 4 (10:23):
About blackberries too that I feel. I don't know at
least so I understand. I was a child when blackberries
were a thing, so I always associated it with like
it being my dad's like serious work phone that he
had to So I was like in my mind, like
blackberries are like this serious, like I am doing.

Speaker 3 (10:44):
I am going to work.

Speaker 4 (10:45):
You know what a child assumes an adults, Uh, you know,
job and careers like bridget.

Speaker 3 (10:51):
Have you seen the BlackBerry movie?

Speaker 2 (10:53):
But I sure have. I thought it.

Speaker 1 (10:56):
I was lucky enough to get sent a screener. Thank
you whoever made that happen. But I wait, did you
see it?

Speaker 3 (11:04):
I have not seen it.

Speaker 4 (11:05):
I do love, my god, Glenn Howerton, though, so
I, I gotta see it.

Speaker 1 (11:09):
This is okay. So I love Glenn Howerton, I love
Always Sunny. I love their podcast, or I listened to
it for a while. I kind of fell off on it.
But when I tell you that, like I believe he
could have been nominated for an Oscar for Best Actor
for his work in this movie, like the BlackBerry Movie?

Speaker 2 (11:26):
Is his Citizen Kane like he is? He is?

Speaker 6 (11:31):
He?

Speaker 1 (11:31):
I didn't I guess there are I don't know if
you watch Always Sunny, But there are times in it
when his character Dennis has a real like white hot intensity,
and I think and I never really yeah, I never
really thought he I never really put that together like, oh,
this is like very good acting. He is like a
very good actor. You gotta see it. Please watch it

(11:52):
and tell me what you.

Speaker 2 (11:53):
Think it is.

Speaker 4 (11:54):
It's definitely on my list. Yeah, that was my I
feel like that's a great character for him. I would
love to see him play more like somebody. Actually, I think, God,
there's something some.

Speaker 3 (12:05):
Tweet or something I saw, right.

Speaker 4 (12:07):
I am against the idea of these types of remakes,
by the way, just a disclaimer, but somebody like they
might be making an American Psycho remake or something. And
somebody immediately posted like a picture of him as like
the oh this guest.

Speaker 3 (12:19):
And I was like, that would be perfect, that would
be so funny.

Speaker 4 (12:22):
That is the one actor I actually would go see
again another needless remake of a movie, but.

Speaker 1 (12:28):
Oh my god, I would I would absolutely watch that.
I would absolutely watch that. Yes, cannot like ten out
of ten BlackBerry movie. Also, it really captures that time
very well, just the time of like you know when
iPhone came out and it was like that Steve Jobs
announcement up on the stage, you know, like just this

(12:49):
how we were all communicating that it really it really is.
It does a great send up of that era, and
I feel like this whole segment is kind of like
"ugh, kids these days, technology," which leads us into our
first topic very well, which is an update on what
is happening in the fight to potentially ban TikTok.

Speaker 2 (13:10):
Have you been following this?

Speaker 4 (13:12):
I found out about this probably the same way a
lot of people did, which is that I opened TikTok
and there was a very alarming infographic thing that
like immediately pops up being like save TikTok.

Speaker 3 (13:28):
But yeah, that's, that's about as far as I got.

Speaker 4 (13:30):
I'm not gonna lie. Honestly, I saw it and kind
of exited out of it and just kept scrolling.

Speaker 3 (13:33):
I was not in the mood to think about reality
at that point.

Speaker 1 (13:37):
But yes, you're like, I'm just trying to dissociate with
some TikToks.

Speaker 4 (13:41):
That was not why I opened TikTok, to find
out about the government collapsing.

Speaker 3 (13:45):
I don't know whatever is happening.

Speaker 1 (13:47):
So if you like Joey opened your TikTok app and
saw a big black and white banner that said stop
a TikTok shutdown. Congress is planning a total ban of TikTok.
Speak up now before your government strips one hundred and
seventy million Americans of their constitutional right to free expression.
This will damage millions of businesses, destroy the livelihoods of
countless creators across the country, and deny artists an audience.

(14:10):
let Congress know what TikTok means to you. So here's
what's going on. A bill that would force TikTok's Chinese owner,
ByteDance, to sell passed a big vote today.
House Energy and Commerce Committee voted unanimously to pass this bill,
meaning that TikTok is one step closer to essentially kind
of being banned in the United States. Now, to be clear,

(14:31):
this legislation would force ByteDance to sell TikTok to
an American owner. But because of the complicated nature of that,
like who would they sell to, how would that work? We
break all of this down in an episode with Abbie
Richards that we'll put in the show description.

Speaker 2 (14:44):
Basically, people are calling it.

Speaker 1 (14:47):
A ban even though it technically isn't because forcing TikTok
to sell to an American owner is like so complicated
that it may as well be a ban. This bill
in order to become a law would still have to pass
the House and the Senate, and some senators have already
expressed opposition to it. So if you open your TikTok
app like Joey and got that big alert urging you

(15:08):
to call your congress person, that is why basically TikTok
is kind of going on the offensive here. And even
though this is just one hurdle of several other hurdles
that would have to be satisfied for this to become
the law of the land, I think TikTok is.

Speaker 2 (15:22):
Really feeling the heat.

Speaker 1 (15:23):
I actually did not get one of these alerts when
I opened TikTok today.

Speaker 2 (15:28):
I wonder if it's because they know that I'm in Washington,
d C.

Speaker 1 (15:30):
And I don't have a voting member of Congress that
I can actually call. So if you don't live
in Washington, DC and you actually get to enjoy meaningful
congressional representation, well congratulations, call your congress person for me
and tell them that I like TikTok.

Speaker 4 (15:47):
I can't imagine having congressional representation. I do think, I
gotta say, I do think it is a little bit
funny because, and not to single out TikTok, obviously
this is like a universal issue, but TikTok has done
so much to repress activists, you know, particularly recently,
it is a little bit funny to see them now

(16:09):
utilizing their platform to.

Speaker 3 (16:11):
Be like, come on, call your congress people.

Speaker 2 (16:16):
We love free expression. What are you talking about?

Speaker 1 (16:20):
Yes, I mean, that's one of the things that is
so tough about this conversation is that I have a
lot of gripes with TikTok. I have a lot of
gripes about their moderation. I have a lot of gripes
about how content is promoted or not promoted, whether or
not certain topics are being suppressed. I'm not the only
person in hearings. Lawmakers have asked about this too, And
so I don't want to make it seem like I

(16:41):
am like yay TikTok. TikTok has no problems, but some
of the way that these lawmakers are framing TikTok I
really take issue with. And so it's an issue where
it's like it almost sounds like I am very pro TikTok,
even though I really like you, Joey, I know that
that TikTok has a lot of big problems, and I

(17:02):
have a lot of big concerns about how they're moderating
certain things, and about some of their policies. And
I do have to say it does sound like this
push to get folks to call their Congress people is working.
One House GOP staffer told Politico, quote, it's so,
so bad. Our phones have not stopped ringing. They're teenagers
and old people saying they spend their whole day on

(17:24):
the app and we can't take it away. Here's Representative
Jamaal Bowman from New York saying that banning TikTok would
silence the voices of younger Americans, particularly who are using
the app to organize for things like climate justice or
a full ceasefire.

Speaker 2 (17:40):
My colleagues are trying to ban TikTok, which is crazy.

Speaker 8 (17:43):
They're doing it because of the oppression, because of your organizing.
It's because of your good work. One million emails sent
to ban the Willow Project, four million emails sent to
demand a ceasefire. We have much more work to do as
we head into these elections and beyond. Again, they're
not trying to do anything to Facebook, even though China
gets most of this information from Facebook. They're going after

(18:05):
TikTok because they're going after young people.

Speaker 1 (18:12):
Let's take a quick break.

Speaker 2 (18:25):
And we're back.

Speaker 4 (18:28):
Yeah, yeah, I know, and like I mean to clarify
what I said before too, I think I think honestly
like for you know, obviously TikTok itself and I agree
with you, like TikTok has, it's done a lot of
messed up stuff, but at the same time, it is
this platform for particularly young people and uh, you know,

(18:49):
like he said in the video, I think, you know,
it's part of the reason TikTok gets targeted over
these other platforms is because it is a primarily younger audience.

Speaker 3 (19:00):
And yeah, we've seen all these like movements.

Speaker 4 (19:03):
You know, TikTok may not necessarily treat those movements well
and treat those, you know, kind of voices well
on their app, but it is a platform for a
lot of these more progressive movements that I think, especially
like you know, this past year, and that is something
that there are people that have stakes in silencing that.

Speaker 1 (19:25):
Yeah, I mean, I can't sort of turn off the
part of the conversation that does feel sort of like
handwringing about like what are the youth doing on their phones?
And like I am a huge proponent of you know,
safer digital experiences, particularly for young people.

Speaker 2 (19:41):
But people are.

Speaker 1 (19:45):
Describing the views and attitudes of young people in this
way that it's like, oh, well, it must be the
app that is making them feel this way. And it's
like it couldn't possibly be that young people are smart
and empathetic and they've looked at the facts.

Speaker 2 (19:59):
I'm like, that's their point of view.

Speaker 1 (20:00):
It's like it just seems it just seems very disingenuous
to make it seem like TikTok is the reason why
young people are you know, organizing for a full ceasefire
and things like that. Like it just it just feels
very belittling to their perspective and their point of view.
And I just I don't, I don't do you know,

(20:22):
do you know what I'm saying?

Speaker 2 (20:23):
I don't know.

Speaker 3 (20:23):
If I'm being clear. I totally agree.

Speaker 4 (20:25):
I mean, I think it's a kind of like tale
as old as time. There's always, you know, the latest
technology or like the latest thing. It's always you know this,
this thing is exposing our kids to this new idea.

Speaker 3 (20:38):
It's never like hey, there's just this idea out there.

Speaker 4 (20:40):
That is appealing to people because you know X, Y Z,
Like I don't know, but but again, yeah, like this
is the same thing with every kind of new technology,
every kind of new art form.

Speaker 3 (20:54):
Not to say that.

Speaker 4 (20:55):
TikTok is necessarily an art form, but I don't know,
like a new kind of platform or of like communication.
This is kind of the conversation that has had over
and over and over again, and like A, there's that
and then B I think they're genuinely I mean, I
don't want to say I don't know what is up,
because I do think I know what is up, and
it is it is so weird, like with the calls

(21:19):
for ceasefire, I think with like previously, when like the
Black Lives Matter protests were kind of in full swing,
like every single time, there's sort of this idea of
like oh this, well, like my kid couldn't actually think that,
or like general public couldn't think this because it is
against what the status quo is or against what is
like the most convenient kind of point of view, I don't

(21:42):
know like it is sort of part of the thing
that is like politically most convenient.

Speaker 3 (21:47):
That kind of argument is connected to so many things.

Speaker 4 (21:49):
I think that's where you get like what like Nancy
Pelosi tell like protesters to like go back to Russia
or whatever because they're like agents of the Kremlin, like
it was some crud. We're talking about Palestine, which is
a totally different, Like I don't know, like if this
it's this idea that people in general can't think for themselves,
and especially young people and especially progressive young people, And

(22:12):
I think that is not surprising, and I don't think
it's gonna stop necessarily, but yeah, it is, like clearly
that's the root of the problem, not the fact that.

Speaker 3 (22:19):
TikTok is like poisoning people's minds.

Speaker 2 (22:23):
Yet it's so insulting.

Speaker 1 (22:25):
We have talked about a TikTok ban on the show before,
and so I think folks probably know how I feel
about it. You know, just like any social media platform
in twenty twenty four, there are definitely valid concerns about TikTok.

Speaker 3 (22:38):
You know.

Speaker 1 (22:38):
Anybody who tells you that there are not security concerns
with TikTok just doesn't know the facts. There were reports
that TikTok accessed journalists' information in an attempt to identify
which employees were leaking information about the app. TikTok actually
admitted to doing this, according to an internal email, but
when asked about it directly, TikTok's CEO responded that, oh, well,

(23:00):
spy is not the right way to describe that, which,
like it did sound like spying to me. So, TikTok,

Speaker 2 (23:05):
I don't want to make it seem like.

Speaker 1 (23:06):
TikTok has like a squeaky clean record, because they do not. However,
this is not the kind of opposition that lawmakers are
levying against TikTok. They're saying that TikTok is a big
national security threat because it is a Chinese owned company
and the platform is being used to like steal American
data and feed it to the Chinese Communist Party.

Speaker 2 (23:26):
So not only is.

Speaker 1 (23:27):
There no smoking gun or no evidence of this being
the case, it's also just like a plainly xenophobic thing
to say, Like the way that lawmakers have spoken to
the head of TikTok makes it clear to me that
there is some obvious xenophobia going on, And so I
think that lawmakers have sensed that people want them to

(23:48):
crack down on big tech, and I think that lawmakers
want to look tough on China always, right, and so
going after TikTok as this singular boogeyman is a way
for them to kill two birds with one stone and
really do that. I also think that the reality here
is that the United States wants to remain the sort
of major, big player when it comes to big tech.

(24:09):
They don't like the idea of other countries, you know,
becoming major players in the big tech landscape and the
social media landscape. They want it to just be for, you know,
your Mark Zuckerbergs and your Elon Musks of the world,
and not your Shou Zi Chews of the world.

Speaker 2 (24:25):
And I think that's what it really comes down to.

Speaker 3 (24:27):
Yeah, definitely.

Speaker 4 (24:28):
I mean they've been very transparent about the fact that
it is more to do with China than it has
to do with like protecting people's data or whatever, which
like I don't know, yeah, like Facebook's probably doing just
as much harm, if not more. And uh, I mean
that that clip of the like Senate hearing where they just.

Speaker 3 (24:47):
Kept asking the guy if he was part of the
Chinese Communist Party.

Speaker 1 (24:50):
Like what's their connection to China? Right, but really, what's
your connection to China?

Speaker 2 (24:54):
Like it's so fucked up. The fact that people.

Speaker 4 (24:57):
Like still think this is about genuine safety concerns and
not just yeah xenophobia. Like after that, I think just
kind of goes to show how messed up a lot
of this is. But yeah, again, like it is sort
of the double-edged, or the sort of two-sided,
issue of like this, there's still all these issues with

(25:18):
this platform. They've still done a lot of messed up stuff.
But also like, clearly this is not coming from a
good place or from a genuine concern for people's safety.

Speaker 1 (25:29):
Because if it were coming from a genuine concern
all of our data privacy, we would have some sort
of meaningful data privacy legislation, which we in the United
States do not. It does not begin and end with
banning TikTok. All of our data is essentially on sale
for whoever wants it, including China. Only a few weeks ago,
Biden signed an executive order that limited the sale of

(25:52):
some of our data in some ways to quote countries
of concern, including China. But as Wired points out, this
executive order only means that US data brokers need to
take some steps and meet some security requirements during transfer,
of which were already required by law, and so like,
I have a hard time understanding why TikTok is this

(26:15):
massive threat of all of our data being given to China,
given that a foreign adversary like China would not even
need TikTok to get that data right now, you know,
as long as they follow certain specific guidelines, they can
totally legally buy that data from US based data brokers.
And so that to me is a huge problem. I

(26:35):
don't think that we should be making TikTok this big,
you know, boogeyman while not meaningfully addressing the actual root
cause of the issue, which is that we don't really
have the kind of data protections and privacy protections that
we need in this country. And so, just to sort
of wrap this up, I think it is still too
soon to say whether or not TikTok will actually be banned.

(26:56):
If I was a betting woman, I believe it could
be banned. Like, like, I think it's more likely than not,
but that's just my opinion. But even still, this bill
still needs to pass both the House and the Senate
and be signed into law by the President, in which case,
if it did become the law, ByteDance would still
have six months to sell before any kind of ban

(27:17):
takes effect.

Speaker 2 (27:18):
So we'll keep following this.

Speaker 3 (27:21):
Yeah, yeah, yeah, I guess we'll find out.

Speaker 4 (27:25):
I will say, I personally, I don't think it's gonna
end up being banned.

Speaker 2 (27:30):
I think, ooh tell me more.

Speaker 3 (27:32):
I just think.

Speaker 4 (27:36):
I just think it's, it's like been in the kind
of culture, I guess, too long. Like if this was
the first round of this happening, I was like, oh, yeah,
they're definitely gonna ban TikTok. I feel like by
this point, by like twenty twenty four, I feel like
it's too far gone.

Speaker 3 (27:52):
And also, like they said, you.

Speaker 4 (27:54):
Know, there's all these like teenagers and older people too,
I guess like calling in and complaining about it.

Speaker 3 (28:00):
I do think, I think at least if there is a ban, I don't think it's gonna last.

Speaker 4 (28:06):
I think it'll end, or there's gonna be some sort of way of going around it.

Speaker 1 (28:09):
But oh, for sure. That's something I really think about a lot, is like there are so many ways to get around this ban. I
would also be really curious how it will be enforced,
and you know, it has to be said that I
do think TikTok has really been instrumental in traditionally marginalized
communities really being able to have a bigger voice. You know,

(28:33):
our communities, you know, don't, unless you like know somebody on the masthead at the New York Times or whatever. Sometimes it's hard for our communities to really
have access to the kind of media power that other
groups have. And so I do think that this is
if it is banned, I think that marginalized creators and
marginalized communities would really suffer the most because we don't

(28:55):
always have those avenues and those channels that are ready
built to get our voices and stories out there. There
are so many things pertaining to marginalized communities that I
personally would not have known about if not for TikTok,
and so I'm concerned that that as a platform for
me staying informed about what's happening in other traditionally marginalized

(29:16):
communities other than my own. I'm concerned for what the
banning of that app means for me being able to
stay informed with what's going on.

Speaker 4 (29:24):
Definitely, yeah, I totally agree, And I mean, I feel
like part of the reason why right now this is
so concerning is just the fact that it's like Twitter's
gone to shit too, Like there's not necessarily like I
feel like TikTok was kind of where I turned when
Twitter stopped being like functional.

Speaker 3 (29:40):
So I don't know, I don't know what's gonna where
people are gonna go next.

Speaker 1 (29:45):
Yeah, I feel the exact same, speaking of Twitter being
kind of a garbage factory, Well, this week over on Twitter,
they rolled out video and voice calls.

Speaker 2 (29:55):
Y'all might recall that Elon Musk.

Speaker 1 (29:57):
Really wants Twitter to be like an everything app where
you do your banking, your phone calls, your emails, all
the things that you already do online now. But I
guess like you can do all of those things over
on Twitter, but I guess at great security risk to
yourselves and also shitty.

Speaker 2 (30:12):
I guess that's how he's envisioning it.

Speaker 4 (30:14):
Yeah, you know you want him to, like, access your bank account, but in a less secure and more annoying way.

Speaker 3 (30:23):
Appreciate it.

Speaker 1 (30:25):
I mean, like, finally a solution to this problem. So,
because we're talking about Elon Musk here, the rollout of
these video and voice calls has not been good and
it's been rolled out to everyone.

Speaker 2 (30:38):
By default.

Speaker 1 (30:39):
You do not get the option of opting in: if you have the Twitter app on your phone, I don't think it's on the browser yet, you have it.
So here's what you need to know about this rollout.
First of all, Twitter's new video and voice calls leaks
your IP address to anybody that you talk with, and
it's really complicated to limit who you can talk to.
Tech Crunch has a really good explainer which we'll link

(31:02):
to in the show notes, where they say that a
person's IP address is not like hugely sensitive necessarily, but
these online identifiers can be used to infer location and
can be linked to a person's online activity, which can
be dangerous for high risk users. So if you're an activist,
if you're a public figure, if you're a dissident, if
you're somebody who you know is engaged in sensitive conversations

(31:26):
or sensitive work, that could be concerning for you. Another
thing to know about this is that Twitter does not
mention encryption in any of their official help center pages
at all, so their calls probably are not end to
end encrypted, which potentially allows Twitter to listen in on
conversations, unlike end-to-end encrypted messaging apps such as Signal or WhatsApp.
Those prevent anybody other than the caller and the recipient

(31:49):
from listening in. And I do feel like, luckily we're
going to a place where that is like more of
the norm. And it's kind of interesting to me that
Twitter rolled this out without even referencing whether or not
calls are encrypted in any of their messaging. When tech
Crunch actually emailed Twitter to ask about this, like incredibly

(32:11):
basic privacy and security thing, they of course got the
standard Twitter reply, which is like, oh, we're busy now,
please check back later, which again to me, really speaks
to the carelessness with which this was rolled out, you know,
giving it to everybody by default, you know, not having
people opt in or even like meaningfully telling them what's

(32:32):
going on, and then not being able to answer some
of these basic security questions to me is like a
big red flag. So my advice to everybody is, unless
you have some particular need to be doing calls over Twitter,
you should turn this off.

Speaker 2 (32:47):
It's pretty easy to turn off.

Speaker 1 (32:48):
We will drop the tech Crunch piece with pretty clear
instructions with images on how to do this.

Speaker 2 (32:53):
In the show notes.

Speaker 1 (32:54):
But unless you've got some real pressing need to have
elon musk, like listening to your sensitive phone call, I
suggest that everybody.

Speaker 2 (33:02):
Turn this off. Let's take a quick break. And we're back.

Speaker 1 (33:24):
So, speaking of online shenanigans, I have to talk about
this recent report from the BBC breaking down a newish
disinformation trend that we're seeing ahead of the twenty twenty
four US presidential election, which is fake AI generated images
of Trump with black people. So I've actually seen a
bunch of these in the wild. The one that sticks

(33:47):
in my mind is an image of Trump posing with
a bunch of young black men. This was shared unironically by somebody that I follow on Facebook or like
I'm friends with on Facebook, and so like was not
shared in a like, look how stupid this is kind
of way, shared in a like yay Trump, this is
great kind of way. And so the caption of this
fake image was like Trump's motorcade was driving around Baltimore

(34:10):
when these kids asked him to stop and take a picture.
And it was seen by one point three million people.
And it's one of those images where like, if you
thought about it for ten seconds, like think about it critically,
for ten seconds. If Trump's motorcade was driving through Baltimore
and a bunch of random kids on the street were like, Hey, Trump,

(34:32):
stop your motorcade and get out of a car and
take a picture with us. Do you think that would happen? Like,
it's just like it's one of those things where it's
like the image is compelling. I will give them that,
But if you just thought about it, you would realize
this is this doesn't make any sense.

Speaker 2 (34:47):
It doesn't it doesn't hold water.

Speaker 3 (34:48):
Yeah, I don't think.

Speaker 4 (34:54):
Yeah, I would say I would not expect that to
be something that Trump would do realistically. I'm also like,
looking at this photo, I mean, it's the whole AI thing, where like I get how people could maybe mistake this for an actual image at first, but, like, the people in the picture look so airbrushed, it's so weird.

Speaker 1 (35:17):
Yes, their skin is like way too polished and smooth and kind of, kind of like shiny in that way that normal skin just doesn't look unless it's fake.

Speaker 3 (35:27):
Yeah.

Speaker 4 (35:27):
Yeah, one of these pictures, because I'm just seeing them
now for the first time, one of them genuinely does
like, it actually is a little freaky, like it kind of looks like a real photo. And I think
if I was like scrolling on like Instagram and I
saw it, I would assume it was a real photo.
But like, yeah, no, all of these photos, like AI photos always.
like AI photos always.

Speaker 3 (35:48):
Like have a weird kind of like inhuman quality to them.

Speaker 2 (35:52):
Yeah, I mean they do.

Speaker 1 (35:54):
Now it concerns me, like, like not even that long ago,
we were always like, oh, look at the hand, look
at the fingers. And now it's like it's figured out how to do the hands and the fingers, and
now we have to look at other tells.

Speaker 2 (36:06):
And so I'm with you.

Speaker 1 (36:08):
I do think a lot of these pictures like I'm
not above this, like I have been taken by AI
pictures before, even as somebody who thinks about disinformation and
AI a lot, Like I think it was the.

Speaker 2 (36:22):
Pope image the Pope. Yeah, that one got me.

Speaker 1 (36:25):
But it was exactly like you said. It was the morning, I didn't have my contact lenses in yet. I was just like drinking coffee
and scrolling social media on my iPad in the morning
while having breakfast, and like you, like of course, in
that instance, when you're just quickly scrolling and you're not
necessarily primed and poised to be having your critical thinking

(36:48):
hat on, you might see something quickly and be like, oh, okay, sure,
Like guess the Pope is wearing this Balenciaga puffer. Guess
Trump is hanging out in Baltimore with these kids on
the stoop, Like, guess it makes sense, and we should
have a media ecosystem where you don't have to have
your critical thinking cap on twenty four to seven whenever
you're on social media to determine what is true and

(37:09):
what is fake. Like it is wild to me that
we all have to be so on our toes all
the time, but I do think in this climate we
kind of do.

Speaker 4 (37:19):
Yeah, definitely. And yeah, again, like this first photo, I.

Speaker 3 (37:25):
That that if this one it's it's.

Speaker 4 (37:27):
Just like with the group of, like, the women, yeah, the women, yeah, the one with the, yeah, this first one, that one's freaky. That, I think, could trick me. Like, you can still tell even with that one, like the hands are a little bit messed up, but the way they have it, you can only really see like two people's, like one hand on two different people. So it's easier to kind of look

(37:51):
over, I think it's easier to kind of.

Speaker 3 (37:52):
Like not really catch on.

Speaker 4 (37:56):
Uh yeah, this is just a perpetual AI nightmare.

Speaker 1 (38:04):
But you know, so, as far as we know, these
images are not coming from anybody connected to Trump or
Trump's campaign, but the co founder of the organization Black
Voters Matter, which is a group that encourages black folks
to vote, says that the manipulated images were clearly pushing
a kind of strategic narrative designed to show Trump as

(38:25):
popular in the black community. And so we don't even
need to look to AI to see this everywhere, this
notion that black folks like Trump. I you know, try
to monitor what's happening in known disinformation spaces and in
far right extremist media. They say things they're so fucking racist,

(38:45):
but they say things like, oh, well, the Trump mugshot
means that black people are gonna love Trump, or like
the Trump sneakers, you know how black people love sneakers.
So, like, it's not just coming from these AI visual disinformation pieces, that message that black people are excited about Trump. I'm seeing it being pushed in

(39:05):
spaces that have nothing to do with AI, and it's very telling to me.

Speaker 2 (39:10):
And again, they're always so.

Speaker 1 (39:14):
Offensive the way, like like the way that they are.

Speaker 3 (39:17):
The sneakers one was crazy.

Speaker 4 (39:19):
Yeah, the sneakers one was so like I was like, oh, okay, we're.

Speaker 2 (39:23):
Doing that all right?

Speaker 1 (39:25):
Yeah, it is. And so, like, I guess, even though you don't need AI to make these kinds of offensive, you know, digital fakeries, it is concerning. So the Washington Post's tech vertical spoke to the person who made the image that you're talking about, the one that you said looked so real. He is Mark Kaye, a host of a

(39:49):
conservative radio show in Florida, who said that he created
the image using the AI image tool mid Journey to
illustrate a November twenty nine post about Trump's alleged growing
support among black voters. He said that he knew that
posts that have an image tend to perform better on
social media platforms, so he just made this image on

(40:09):
Midjourney, and he said it took him thirty seconds
to create, which is concerning that like, in less than
a minute somebody could make this like pretty convincing image.
The Washington Post also spoke to doctor Joan Donovan, who
made a really good point that coverage of this kind of thing is kind of tricky and can also
sort of feed into the problem because the people who

(40:30):
make these kinds of images want people to be talking
about them, right, and so the image itself might not
be newsworthy until it gets a kind of coverage that
makes it go viral. Doctor Donovan says, one of the
things we have to consider about campaigns like this is
how much they are stunting in terms of trying to
get media attention for something that otherwise would not merit headlines.

(40:51):
Some propagandists might recognize that it's possibly not newsworthy that
black people support Trump, but it is newsworthy that AI
generated photos of fake voters are circulating. And to doctor
Donovan's point, Mark Kaye actually said that his AI generated fake
Trump images did not go viral until they were covered
by BBC, and so you know, doctor Donovan might have

(41:14):
a point there.

Speaker 2 (41:14):
So doctor Donovan went further and spoke to.

Speaker 1 (41:17):
This idea that yes, like AI that gives people the
ability to quickly and effectively make this kind of visual
disinformation is really scary and, like, bad, but cheap fakes and other kinds of media manipulation also exist, and it's
also a big concern because you know, whereas these Trump

(41:37):
deep fakes now have labels on Facebook and Instagram that
say they are not real, cheap fakes and other kinds of low budget media manipulation might not trigger the same kind of misinformation detectors in the same way and
can potentially spread further.

Speaker 4 (41:54):
Yeah, definitely. And yeah, like, something that Trump has been very good at, like, you know, monopolizing, er, taking advantage of media coverage, and media coverage of, like, you know, outrageous or sort of out-there behavior.

Speaker 3 (42:10):
This kind of reminds me of like.

Speaker 4 (42:13):
Early on, I think this was more the twenty sixteen election, but like the twenty twenty election too. Like whenever there would be like some sort of group of like marginalized people that.

Speaker 3 (42:24):
Supported him. Like I remember there was a big, like, kind of.

Speaker 4 (42:27):
Article about this, like, Gays for Trump, like, thing, and it was.

Speaker 3 (42:30):
Like, Okay, it doesn't even like it's probably like.

Speaker 4 (42:33):
A, a tiny group, and then B, it's like the whole, like, sort of shock value. It's like, wow, this marginalized group that the Republican Party kind of hates, uh, is supporting their super right wing candidate. But yeah, like again, that's not necessarily newsworthy. It's sort of not showing to scale the actual kind of impact

(42:55):
of these, or not showing to scale, like, the actual support or numbers, or whatever that means, what that argument is. There's probably, you know, people of whatever marginalized group that support Trump or the Republicans for, like, whatever reason. That in and of itself is not necessarily newsworthy.

Speaker 3 (43:16):
But what it does is it kind of, like, distorts the wider story and.

Speaker 4 (43:23):
Yeah, it's just weird. It's like, yeah, everything about this election is just gonna be so insane.

Speaker 3 (43:34):
It already has been.

Speaker 2 (43:36):
But yeah, yeah, I.

Speaker 1 (43:37):
Think you make a really good point that I think
that's what doctor Donovan is saying, is that, you know,
we really need to be thinking a little deeper about
these stories and what makes them newsworthy. Because
a group of gay people loudly advocating like a small
group of gay folks advocating their support for Trump, it's

(43:58):
not like there's a deeper story about well, has Trump
actually you know, been good for this community, and what
are the actual numbers of folks who support Trump? Like
I do, I do wonder if it it like makes
it tempting to turn what otherwise would just be like
maybe a small stunt or something into a larger pattern.

Speaker 3 (44:20):
You know. It definitely, like yeah, it definitely.

Speaker 4 (44:25):
It feels like the issue is mainly just that, like, it's sort of changing what the actual meaning is, like, it's changing what the story is. It's implying something
about the way a certain group of people votes. I
feel like the whole kind of way of like tracking
votes based on like you know, particular identities that people

(44:46):
have too, Like that already is so outdated and like
we've already seen the kind of nuances of that, and
like you know, yeah, there's different factors in people's lives
that make them vote certain ways or do certain things
or believe certain things, and just kind of like grouping it into, like.

Speaker 3 (45:05):
I don't know this.

Speaker 4 (45:05):
Yeah, again, it feels like they're falling for what this kind of campaign wants them to, like what message they want them to circulate, which doesn't seem

Speaker 1 (45:14):
Good exactly. And that's exactly what disinformation specialist Nina Jankowicz,
who was also on the podcast before, said that we
don't actually need to be totally terrified because we do
have common sense, and common sense is still a really
good deep fake detector. And Nina says, you know you
should when you see images like this, you should ask yourself, like,

(45:37):
is Trump actually a great friend to the African American community?

Speaker 2 (45:41):
If something seems.

Speaker 1 (45:42):
Off about it, it is probably a deep fake, right,
And so just really going back to our common sense
of well, what do I know about this person? Does
this seem like something they would actually be doing? I
also would go further and say Donald Trump is a
known germaphobe. He's not usually photographed like that.

(46:03):
And in the one deep fake image, he's like hugging these
two women, and it's like he's not someone who is
generally photographed being that touchy feely with people, let alone
black women.

Speaker 2 (46:12):
So is that something that you would see right right? Right?

Speaker 3 (46:17):
Known racist? Like this is not I literally I that
can like the twenty.

Speaker 4 (46:23):
Six er I guess yeah, thought that would be twenty
six back in twenty sixteen, like because I was I'm
from Chicago originally, and there was like a rally that
he canceled in Chicago because he was afraid he was
going to, like, get shot. Like, it was because of all of his stuff, it was insane, like, this man. And I'm not saying,
I don't know who's to say what would have happened,

(46:44):
but it's like one of those things of like this
man is so like I don't think he is going
into predominantly black neighborhoods, uh, in Baltimore. Like, that just does not seem to be the kind of thing, that he, uh, would be putting himself in that situation, because of certain beliefs that he has.

Speaker 1 (47:02):
You know, you don't see Donald Trump stopping his motorcade
and chilling with the homies on the stoop in Baltimore.

Speaker 2 (47:09):
You don't see it.

Speaker 3 (47:10):
You know, anything could happen.

Speaker 1 (47:18):
More after a quick break, let's get right back into it. Well,
speaking of AI, you know, we talk on this show
a lot about AI deep fakes, specifically non consensual deep

(47:42):
fake images, and it really does feel like every week
there's like a new school where it's happening, or a
new celebrity who is being targeted. But now Representative Alexandria Ocasio-Cortez has a plan to stop it. AOC told
Rolling Stone that she's going to be leading the House
companion of the disrupt Explicit Forged Images and Non Consensual

(48:04):
Edits Act of twenty twenty four, also known as the
Defiance Act. It's a anacronym. That's why it's such a
mouthful to say. With a bipartisan group of representatives. So
this bill is her first move since being named to
the House of Representatives by partisan Task Force on AI. So,
how this legislation would work is that it amends the
current Violence Against Women Act so that people can sue

(48:26):
anybody who produces, distributes, or even receives deep fake, non
consensual pornographic images if they knew or recklessly disregarded that
the victim did not consent to those images. AOC says
that she developed the legislation working with survivors and survivors groups,
which to me is very important because oftentimes legislation that

(48:48):
doesn't really like center and listen to the people impacted
can end up having, you know, a different impact than
the intended impact, and like can just end up like
not making the lives of people who are targeted or
survivors better. More than twenty five organizations have endorsed this
bipartisan legislation, including the National Women's Law Center, the Sexual

(49:08):
Violence Prevention Association, the National Domestic Violence Hotline, and Ultraviolet,
where I used to work. AOC said that she started
working on this legislation the week after those viral Taylor
Swift AI generated deep fakes happened on Twitter. So again,
for as much as we talk about the harm that
deep fakes represent for women, girls, BIPOC people, other marginalized

(49:34):
people who, according to the UN and just common sense
and life experience, are at a heightened risk for experiencing
technology facilitated gender based violence, and given the way that
we know technology like AI deep fakes is already disrupting
democracy amazingly, there are no federal laws criminalizing it.

Speaker 2 (49:55):
Like every time I think about that, I just think.

Speaker 1 (49:56):
About how wild it is that it's just like kind
of allowed, Like some states have taken steps, but federally nothing.
And so this legislation, if it goes through, would be
the first federal law taking any kind of action against
non consensual AI generated deep fakes.

Speaker 3 (50:16):
Yeah, that really is crazy.

Speaker 4 (50:18):
I mean, like yeah, like this type of
deep fake, you know, is obviously kind of a new thing,
but like, I feel like this has kind of been
an issue for a while, and you would think at
least like there would be some sort of like libel thing.

Speaker 3 (50:35):
You could like put it under somehow. But yeah, I mean,
like I've said before, I think Swifties can change the world.

Speaker 4 (50:48):
Once again, I'm a little skeptical about this actually going through, but I believe in the power of fangirls and, like, teenagers that just have a deep obsession and parasocial relationship with Taylor Swift, amongst many other celebrities.

Speaker 1 (51:07):
Have you ever read that book Everything I Need I Get from You: How Fangirls Created the Internet as We Know It?

Speaker 3 (51:13):
I have not, but like I should.

Speaker 4 (51:15):
That sounds like that's totally up my alley, Caitlyn Tiffany,
Is it changed my whole basically, long story short, You're
right that fangirls created the Internet.

Speaker 2 (51:28):
They control the Internet, they control everything.

Speaker 4 (51:33):
Look, I mean, yeah, I've spent a lot of my formative years on Tumblr. And I was listening to, like, another show, and they were talking about that new movie that is coming out that's, like, based on Harry Styles.

Speaker 2 (51:52):
Oh yeah, I just watched the trailer for that a
moment ago.

Speaker 6 (51:55):
Yeah, because it was so funny.

Speaker 4 (51:59):
They were talking about this whole phenomenon of these like
fan fiction movies, fan fiction book to movie, and I'm like, yeah, no, this has been like a thing. Like, the fan fiction writers are like a pillar of the online community and now are making their way onto the big screen as well, which is an interesting phenomenon. But, uh, yeah, I

(52:21):
love that. I definitely will check that book out.

Speaker 1 (52:24):
Oh, speaking of the big screen, we have to talk
about this lawsuit.

Speaker 2 (52:30):
The most recent like jaded, jilted white guy lawsuit.

Speaker 4 (52:35):
So, that should be a new segment, like in addition to the Elon one now.

Speaker 3 (52:41):
White guy law suit.

Speaker 1 (52:42):
Alert! Like yes, er er, white guy lawsuit, new white guy lawsuit just dropped. Okay, we're adding that. That's gonna be a new segment.

Speaker 3 (52:53):
Love it.

Speaker 1 (52:54):
Okay, So since the Supreme Court struck down the race
portion of affirmative action, we've been covering all these different
legal challenges. We're gonna have our new segment, so keep an eye out for that. But all of these different
legal challenges have been rolling in from white folks who
were like salty, they didn't get something they wanted, and
they're being like, oh, it's because a woman or a
person of color unfairly took it, and that's racism against

(53:17):
me as a white person. And so this newest lawsuit
has been filed by a script coordinator on the show
Seal Team named Brian Beneker, who claims that, quote, heterosexual
white men need extra qualifications to be hired in Hollywood. Historically, yes, historically,
you know how it's always been heterosexual white men who

(53:38):
can't get a break in any industry, whether it's Hollywood,
you know, especially Hollywood.

Speaker 2 (53:45):
It's like historically, historically, you know.

Speaker 1 (53:48):
How, like it was white men who like weren't allowed
to be in movies, had to enter through the back
interest of nightclubs when they were performing all of that historically,
remember how that happened. Oh yeah, so let's let's pause
and imagine that this reality where it was straight white
men who needed to work like twice as hard to
prove themselves and not everybody else in Hollywood. So basically

(54:11):
he is being represented by America First Legal, which is
the legal initiative of the former Trump White House advisor
Stephen Miller. Stephen Miller, of all of the, like, Trump administration, one of my favorite outlets, Wonkette, always uses the phrase chuckle fucks, of all of them, he's the one that I sort of found the most personally odious. But

(54:33):
their thing is basically filing complaints with the Equal Employment
Opportunity Commission against major companies including Starbucks, McDonald's, Morgan Stanley
over their corporate diversity and hiring practices. I believe that
CBS is the first entertainment company that they have targeted
in this way, so his lawsuit is pretty ridiculous. Basically,

(54:53):
he argues that CBS has diversity quotas and that he
was promised like a writing job when one opened up
on the show, that instead they hired a bunch of
unqualified bimbos instead of him, a white man who deserved it.
That's basically the gist of the lawsuit. He says that
CBS's hiring practices have created a situation where heterosexual white

(55:15):
men need extra qualifications, including military experience or previous writing credits,
to be hired as staff writers compared to their non
white LGBTQ or female peers.

Speaker 6 (55:26):
Extra qualifications as in previous credits, like writing experience, you know,
a resume.

Speaker 3 (55:36):
I'm being discriminated against.

Speaker 4 (55:38):
Oh, so, okay, maybe this is just me specifically. I have a lot of friends that work in film.

Speaker 3 (55:43):
I have a.

Speaker 4 (55:44):
Background working in film, so maybe this is particularly like
pissing me off because of that.

Speaker 3 (55:50):
But like go to.

Speaker 4 (55:53):
Any film set, it is so like a historically it
is a deeply, deeply massogynistic, deeply raised this industry. And
still it is like the majority of the people you're
gonna see, they are gonna be why men, Like yes,
oh my god, especially in the writer's room.

Speaker 1 (56:11):
Yeah, I mean it just seems to me like what
he's really saying is like I want a system where
being a white, straight male means I get stuff and
that is never questioned. Like that's the system that I want,
Like, that's what I think we should have.
And just to be real for a second, like I
almost didn't include this story in the roundup, but it's
a little bit of a smoke screen because I really

(56:33):
wanted to read some of these comments that Deadline published
from women writers and writers of color about this, because
they are hilarious. They're a plus gold. So Deadline did
a roundup of people responding to this lawsuit. Here a
couple of my favorites. So a lot of them point
out that this person is a script coordinator and he

(56:53):
is demanding that he bypass, like, script supervisor and be made, straight up, like, just, I want to go right to the top, I demand to be made a producer. This
quote, that man suing CBS is also demanding they make him
a producer. If that's not peak whiteness, I don't know
what the fuck else is. Lomo. Please somebody go find
a writing sample of his that's from Kelly Terrell, which

(57:15):
I thought was really funny.

Speaker 3 (57:17):
Wait, really quick, I've never heard somebody say Lomo before.

Speaker 2 (57:20):
Oh is it l m ao?

Speaker 5 (57:22):
I just say l m a o, but maybe, oh, I don't know. Maybe that's a, is that a millennial thing?

Speaker 1 (57:28):
Just wait, I'm realizing I have always said it in
my head and I've never said it out loud, And
now I'm wondering if that's like not a millennial thing,
if it's just a Bridget Todd thing, that's just like
how I've always said it.

Speaker 2 (57:45):
Wait, I'm gonna listen to this.

Speaker 1 (57:46):
I mean, I say, wait, here's, okay. So I found an AskReddit: how do you pronounce lmao?

Speaker 2 (57:53):
Top comment: if you're one of those people who say.

Speaker 1 (57:56):
Lmao or lol out loud rather than laughing, that's how I think it's pronounced, please disown me.

Speaker 2 (58:03):
So we'll never know. We'll never know, you know what,
Keep this in.

Speaker 1 (58:08):
I want this as a digital record. People, get back to us.

Speaker 3 (58:11):
Do you say lamo? Do you say l m a O?
Is it a?

Speaker 2 (58:15):
Do you say something else?

Speaker 3 (58:17):
Yeah? Do you say something else? God? I but yeah, no.

Speaker 4 (58:21):
I think that's what I'm gonna start telling people for, like, career advice now: if you want to get a producer gig, you just gotta sue a major.

Speaker 2 (58:30):
Just sue them and demand it.

Speaker 3 (58:33):
My heart.

Speaker 2 (58:34):
Uh yes, I hope you're listening.

Speaker 3 (58:42):
Truly. Yeah, that is, that is peak whiteness. Uh, as a white person.

Speaker 1 (58:50):
This is my favorite, from Miles Warden: that Seal Team lawsuit reads like a series of Mad Libs, which is ironic because he's mad at libs. Oh, I was chuckling to myself about that one earlier today.

Speaker 3 (59:04):
Is seal teab Like, I've never heard of this show.

Speaker 2 (59:07):
I've never heard of it.

Speaker 1 (59:08):
I feel like it's the kind of show that like
somebody's dad probably watches, Like it's like I think it's
about like it's got David Boreanaz.

Speaker 4 (59:17):
It seems like it has a very very dedicated like
fandom on Tumblr somewhere, and then everybody else who watches it, yeah,
is like somebody's dad.

Speaker 1 (59:25):
It's like, oh, I'm sure this reminds me of like
there are so many shows when I'm flipping channels, and
part part of it is because I just came back
from visiting my parents and like they're watching stuff I've
never even heard of, Like we live in such different
like television watching silos. I guess like my dad has

(59:46):
gotten really into this show about a bodyguard. I think
it might just be called Bodyguard and it's all about
a guy who's a bodyguard. And it's like part of
me is like are these real shows? Like where are
you getting these are real?

Speaker 4 (59:59):
There's just like an endless list of I feel like
there also is like that's what I would saying. I
remember like I had a friend telling me about a
show that was very very similar to this, but I yeah,
I feel like there's just like an endless like list
of like military, police, firefighter, whatever.

Speaker 1 (01:00:17):
Oh my god, it is sort of like one of
those things where when did we decide that the main show,
the main occupations for television shows are gonna be firefighter,
police, maybe like doctor, nurse, like military person.

Speaker 3 (01:00:35):
There's too many police shows. Oh my god, there are way,
way too many police shows.

Speaker 2 (01:00:40):
I'm telling you.

Speaker 1 (01:00:41):
Come spend a weekend at the Todds House in Richmond, Virginia,
and you will, like it's all they watch, like The Shield,
that this, that, The Blue Line. It's like, I
had no idea.

Speaker 2 (01:00:53):
My parents were so into this, into this like law
enforcement universe. I had no idea. It was like new information
to me.

Speaker 1 (01:01:00):
So as hilarious as all of these these comments on
this lawsuit are, there were two that I wanted to
highlight that really made me think. One is from Britney
Van Horn, who says, deeply embarrassing on the script coordinator's part,
But everyone who uses diversity as a quote easy excuse
for not hiring or promoting people are complicit here too.

(01:01:21):
Diversity is way too often used as an excuse for
not hiring or promoting someone you had no intention to hire.
And that really rings true to me that you know,
this person who issuing CBS was a script coordinator for
like fifteen years, right, Like I can understand being annoyed
at never being promoted after like fifteen years, never getting

(01:01:42):
to be a producer. But and like if you're someone
who was a manager of this person, being like, oh, well,
they want a diverse writer's room, that's why you didn't
get hired.

Speaker 2 (01:01:52):
I could totally see people saying.

Speaker 1 (01:01:54):
That as just an easy way to avoid having to
have a conversation that's like, actually, your work isn't cutting it,
you can't hack it. We're not gonna promote you for
whatever reason. We're not interested in, you know, you having
a bigger role in what we're doing here. Like I
do think Britney's point is really interesting. And this one
from Friend of the Show Ish Danny Fernandez. She used

(01:02:15):
to have a podcast on iHeart. I don't know if
she does anymore, but a friend in my head, I
don't know, she's never been on
the show, But Danny call us, we love you, Danny tweeted.
UCLA releases a diversity report every single year. Anyone can
google it. White men are the majority of writers in Hollywood.
They make up the majority of writers room. This is

(01:02:36):
not an opinion. This is a fact. You lose jobs
to each other, to other white people, and that I
think is so key because Danny is right. The numbers,
as you said, Joey, are super clear. That it is
not white men who are underrepresented in the entertainment industry
is just not It's just a fact. And so you

(01:02:57):
cannot sue your way into a career. Nobody is owed
a career in entertainment. And you know who knows this
better than anyone. The black people and queer people and
trans people and women and other people of color who
are underrepresented in the field. The people who drop out
after ten years because they're like, yeah, I'm never gonna

(01:03:17):
move up. I have hit this ceiling of as high
as I can go and so I'm leaving. That happens
time and time again. And so being told that it
is people of color and other marginalized people who dominate
the industry, not only is it just like laughable, but
it just like flies in the face of the cold
hard facts.

Speaker 4 (01:03:37):
Absolutely, And I mean, I don't know what the deal
is with this case, but like I'm willing to guess
that most of the writers in the writer's room are
probably also white men. And yeah, it is like there's
so many different issues here. It is super super hard
to work in the entertainment industry. It's super hard to
work in the film industry.

Speaker 3 (01:03:58):
You know.

Speaker 4 (01:04:00):
I know that firsthand. I know that through friends experiences.
I think this sort of phenomenon and again, yeah, like
a lot of these these companies and these production houses
that have a stake in or that it kind of
and again, like to go to that point about like
diversity being an easy kind of excuse quote unquote. The
issue isn't the the you know, the people that are

(01:04:22):
being hired for quote unquote diversity, if that is actually
happening oftentimes it's not.

Speaker 3 (01:04:28):
The problem is.

Speaker 4 (01:04:28):
These studios that are like, look, I don't have to
have yeah, again, like I don't even really have to
be doing the work of trying to hire more people
from diverse backgrounds or people that aren't just white men.

Speaker 3 (01:04:39):
I can just say that diversity is the.

Speaker 4 (01:04:42):
Issue, and then I get out of here without really
like having to deal with all of the bigger issues
with like Hollywood and how it operates and but yeah,
it's it's yeah, nobody's owed a career these same people,
like yeah, it is like again it's coming from these
same people that are oftentimes very like pull yourself up
by your bootstraps kind of thing.

Speaker 3 (01:05:03):
And it's like, yeah, I know.

Speaker 4 (01:05:04):
This is the dark side of all of this, is
that a lot of times it doesn't really work out.
And yeah, yeah again again, all of this to say,
you go to the actual reality and the actual numbers,
like white cis men are still the dominant force in Hollywood,
like that that's a fact.

Speaker 1 (01:05:24):
Yes, As I was reading about this kind of to
your point, do you remember how there was a time
when folks like this would say things like facts don't
care about your feelings unless that feeling is I would
like a job please.

Speaker 2 (01:05:37):
I feel like I have been overlooked.

Speaker 3 (01:05:39):
Actually, facts don't care about your feelings.

Speaker 6 (01:05:41):
Like I'm saying it right, it's the same like all
of these idiots Ben Shapiro and whatnot, the people that
are like they're all failed screenwriters.

Speaker 4 (01:05:50):
Like it's these people that want to blame their personal
failings on quote unquote diversity and like marginalized people. And
then they turn around and they're like actually, you guys
are the ones doing that. You're the ones being snowflakes
and being like your feelings and not paying attention to facts.

Speaker 3 (01:06:08):
It's it's so weird.

Speaker 4 (01:06:11):
It's such a like victim complex that I don't know,
Like I don't know what kind of how removed from
reality slash just completely in your own head narcissists you
have to be to kind of view the world that way.

Speaker 2 (01:06:24):
But here we are, here, we are. Well, good luck
on your lawsuit. Yeah, I will say that when I
found out.

Speaker 1 (01:06:32):
That Ben Shapiro got his start as a screenwriter and
it didn't work out, so much click into place for me.

Speaker 4 (01:06:39):
It was like, oh, he's just mad that people didn't
like his stuff. Like so many of these people are
just mad that people don't think they're funny or interesting
or what. Elon Musk didn't try to be a screenwriter,
but so much of his like thing is just that
he is. He is so mad
that people don't find him funny. I all of Yeah,

(01:06:59):
I don't know. There, I think we just need to
full on like like no more white men making entertainment
period for like a couple of years.

Speaker 3 (01:07:10):
Like actually, obviously think we should do that.

Speaker 4 (01:07:13):
I know that's what they think is going on already,
but I think we actually need to do that, like
we need to reset.

Speaker 1 (01:07:18):
Oh my god, imagine if it really happened, like you
want to. You want to talk about like this guy
being a cry baby. Now, imagine if what he what
he is laying out is happening.

Speaker 2 (01:07:29):
Imagine if that happened for real.

Speaker 1 (01:07:31):
Just just just marinate on that for a moment, Listeners, the.

Speaker 4 (01:07:36):
Most insufferable man you know has lost his hobby, and
you know.

Speaker 3 (01:07:42):
I have had to deal with a lot of men
in film. It is terrible.

Speaker 2 (01:07:48):
Oh my god. We need to do a whole like.

Speaker 3 (01:07:51):
Just dish out all I want to.

Speaker 2 (01:07:54):
I want to.

Speaker 1 (01:07:54):
I want to hear it. You can, we can anonymize it,
but I want to hear it. I'm sure I can
only imagine for sure. So the last thing I want
to do before we end is folks may know someone
that we had on the show, one of our earliest guests.
I think it was maybe the first interview I did
for There Are No Girls on the Internet, with Shafiqah Hudson,

(01:08:15):
the creator of the hashtag and movement #YourSlipIsShowing,
passed away, and we did an episode where we put
her episode back on the feed and I shared some
of my thoughts and I just wanted to say I've
really been wrestling with her death and the loss of
her and what it means for the internet and social media.

(01:08:39):
It feels like the end of a period, like thinking
about it, and it was like, oh yeah, back when
Shafika was, you know, doing her thing on the internet,
it was a time when I was really active online
and it felt really fun and exciting, and it was
just really not just mourning her, but also mourning that
the death of that time online. And I got reached

(01:09:01):
out to by Penelope Green, who is a writer at
The New York Times who was doing an obituary on
Shafiqah Hudson. And Penelope was great. We had many conversations
about Shafiqah's life and her work, and she said that
the reason she was getting in touch with me was
because somebody, I think a listener of this podcast reached

(01:09:24):
out through I guess the New York Times has like
an obituary request form where you can flag people that
have recently passed that you think the New York Times
should write an obituary on, which led to Penelope Green
contacting me, which led to the obituary. So we'll put
a link to the obituary in the show notes. I
believe it will be in print in the New York Times.
If you are the person who reached out to Penelope

(01:09:47):
Green on behalf of Shafiqah Hudson to get her obituary
in the New York Times, I just want to say
thank you. You're probably not listening to this, but it
really meant a lot to me, and it really meant
a lot to people who knew Shafiqah and worked with
Shafiqah, and to everyone whose

Speaker 2 (01:10:05):
Lives Shafiqah touched in some way.

Speaker 1 (01:10:06):
And you know, one of my disappointments is that I
don't feel like Shafiqah's loss was mourned by the tech
community and the disinformation community as I think it deserves
to be mourned.

Speaker 2 (01:10:22):
And so.

Speaker 1 (01:10:24):
Taking part in this series of interviews with Penelope Green
from New York Times was really personally cathartic for me
and gave me a lot of closure on that. And
I think it's really important that Shafiqah Hudson's name and
legacy is memorialized in this way. So all of that
is to say, if you are that person, it really

(01:10:46):
meant a lot to me.

Speaker 2 (01:10:47):
Thank you.

Speaker 1 (01:10:48):
Thank you to Penelope Green for reaching out to me
and other people who knew and worked with Shafiqah, and
for writing such a comprehensive and beautiful tribute to somebody
who I just thought was fucking great and is no
longer here.

Speaker 2 (01:11:03):
And folks can.

Speaker 1 (01:11:04):
Read the obituary in the show notes, or you can
pick up a paper. I think it might be being
published today. It's Thursday now, but Friday when you hear this.
So yeah, I have a lot of people to thank
for what ended up being a moment of personal disappointment

(01:11:24):
that I get to now sort of process with something
that feels good. So thank you, and thank you Joey
for for being here and for listening.

Speaker 3 (01:11:34):
Oh my god. Of course happy to be here. And yeah,
totally echo what you said.

Speaker 4 (01:11:40):
You know, the work that Shafiqah did is just so
foundational to a lot of the stuff that we cover
on here, and I think, you know, it's really it's
really important to uplift those stories and to see you know,
the New York Times spotlighting this. So yeah, absolutely echo
with that. But yeah, of course always happy to be
talk about the chaos of the world.

Speaker 1 (01:12:04):
Uh with you same and thanks to all of you
for listening. I will see you on the Internet. If
you're looking for ways to support the show, check out
our merch store at tangoti dot com slash store. Got
a story about an interesting thing in tech, or just

(01:12:24):
want to say hi, you can reach us at hello
at tangoti dot com. You can also find transcripts for
today's episode at tangoti dot com. There Are No Girls
on the Internet was created by me, Bridget Todd. It's a
production of iHeartRadio and Unbossed Creative, edited by Joey Pat.
Jonathan Strickland is our executive producer. Tari Harrison is our
producer and sound engineer. Michael Almato is our contributing producer.

(01:12:45):
I'm your host, Bridget Todd.

Speaker 2 (01:12:46):
If you want to help us grow, rate and review
us on Apple Podcasts.

Speaker 1 (01:12:50):
For more podcasts from iHeartRadio, check out the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.