Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet is a production
of iHeartRadio and Unbossed Creative. I'm Bridget Todd and this
is There Are No Girls on the Internet. This is
another installment of our weekly roundup where we dig into
stories that you might have missed on the Internet so
you don't have to. And I am so thrilled to
(00:25):
welcome my guest co-host for our conversation today, Francesca Fiorentini,
journalist and host of The Bitchuation Room. Thank you so
much for being here.
Speaker 2 (00:34):
Hey, thanks for having me, Bridget. It's so good to
be back in a different era, but really the same era,
a continuation of the awfulness. I'm really happy to be here.
We need more shows like yours.
Speaker 1 (00:47):
We need more voices like yours. I'm happy to see
you everywhere. The Bitchuation Room is fantastic. Also, you've got
a live show coming up soon.
Speaker 2 (00:54):
Right, yes, yes, yes. If you're hearing this, next Friday
in Los Angeles, if you are in Los Angeles
or want to get to Los Angeles, yeah, Friday
at the Elysian Theater, May thirtieth. We're gonna fix LA,
Bridget. LA post-fires, pre-Olympics. It's increasingly unlivable here.
(01:15):
But Councilmember Eunisses Hernandez, John Iadarola of The Damage Report,
someone from the Rent Brigade, which has been blowing the
whistle on all the price gouging that realtors have been doing,
and Rachel Reyes of the LA Podcast, which is an
excellent podcast if people want some local
politics podcasts. But yeah, it'll be good. It'll be good.
Speaker 1 (01:34):
Sometimes I gotta laugh to keep from crying when you're
tuning into our political landscape these days. Indeed. So with that,
are you ready to talk about some stories that are
happening across the internet?
Speaker 2 (01:46):
So ready?
Speaker 1 (01:47):
Okay, So the first one is not even really a story.
It's more just I want people's takes. This was blowing
up in my kind of podcast or group chat. So
I was like, you know what, I'll bring it to
the podcast. Are you familiar with the podcaster Theo Von? Yes.
So for folks who don't know, Theo Von is a super
popular podcaster. He rose to fame on MTV's Road Rules
(02:09):
and, like, competition shows like The Challenge. He hosts a
super popular podcast, a podcast that, like, I could have
a million podcasts in my life and they would never
be as popular as his show is. I would say
that I personally believe that he is one of the
reasons why Trump was reelected. Trump kind of agrees with me,
like he shouted out Theo Von from his victory stage and
(02:30):
like thanked him by name. He is like a pretty
vocal supporter of Trump. He's had him on the podcast
and everything.
Speaker 2 (02:37):
But was he at the inauguration, Bridget? He was?
Speaker 1 (02:40):
He was. Oh, and what's funny is that, if you watch
that show The Righteous Gemstones, he kind of favors Keefe,
like he wears like a mullet, like, that's kind of
the thing. So, like, seeing him at the inauguration was like, huh,
that's interesting.
Speaker 2 (02:56):
But Keefe, Keefe is a good person.
Speaker 1 (02:59):
Yeah, Keefe's a good guy. I don't know that
I would say that about Theo. Well, that's sort of the question
that I'm bringing to the podcast. On his podcast this week,
he was talking about Gaza. Here's what he said on
the podcast. Quick heads up though that we did edit
this clip for length.
Speaker 3 (03:18):
You know, I wanted to say something. There's been something
that's just been kind of on my heart, and so
I feel like I should bring it up. There is
you know, we've had people on the podcast in the
past to talk about it, and there's just a there's
a conflict that's been happening in the Middle East. People
(03:39):
know about it between Israel and Palestine and some of
the areas over there, the Gaza area they talk about,
and uh, and I just think it's it feels to me,
I don't know if I it just it feels to
me like it's a genocide that's happening while we're alive
here in front of us, or in front of our lives,
(04:06):
and I don't Sometimes I feel like I should say something.
I'm not a geologist or geographer or anything like that,
you know, so I don't know a lot of it.
Some of it I do know, though, like I know
the basics of the issues over there. But for me,
it's just like how I feel like you see all
these photos of people, just children, women, people, body parts,
(04:35):
just people like putting their kids back together, and I
just can't believe that we're watching that and that more
isn't said about it. You notice, like, what are we doing?
Speaker 1 (04:47):
So I'll stop it there. When I saw this, my
friends, my, like, podcaster friends know that
I deeply dislike this person. I deeply dislike Theo Von.
And people were dropping this clip in the group chat,
being like, oh, it's so good that he said something.
You know, this clip got twenty million views, so they
(05:08):
say on X, And I guess for me, I fully
don't know what to do with this because, on the
one hand, I obviously support anybody who's using a platform
to speak out against genocide, but this guy is like
all in for Trump, very vocally, and so I have
a hard time kind of understanding how somebody can make
a statement like this and seemingly not have the tiniest
bit of, like, self-reflection about their part in it.
(05:32):
And so this rapper, Zack Fox, wasn't having it. He said,
don't let Theo Von fool you. He'd be blowing hookah
smoke and doing key bumps with Trump, his family and
the literal architects of the genocide he's fake crying about. Then
he posted a picture of Theo Von with Ivanka Trump and
Jared Kushner partying in Miami, from Ivanka's Instagram just a
few days ago, and said, what do you mean what
(05:53):
we are doing? Ask them, which I have to admit
is like a pretty funny burn.
Speaker 4 (05:58):
Yeah.
Speaker 1 (05:58):
But, like, literally a few days before recording this,
Theo was with Trump doing an opening set at a
military base in Qatar, and so like this actually turned
into a bit of an argument because I, on the
one hand, totally understand how some of my, like,
leftist podcaster friends were like, anybody who's using
their platform to speak up is doing a good thing.
(06:19):
If twenty million people are getting eyeballs on this and
he's making a different subset of the population think about this,
that is a good thing. I can understand that point
of view, but I just don't buy it.
Speaker 2 (06:31):
Yeah, I mean, first of all, I love the receipts
that were brought against him by that rapper. Like, I love it.
And you're like, oh, I'm waiting for this to
be really old. No, no, no, it's from like a
week ago, him sitting down with Jared Kushner and Ivanka Trump.
I mean Jared Kushner, who spearheaded in part the Abraham Accords,
(06:52):
which effectively completely cut out the existence of Palestinians from
any kind of peace process in the Middle East, which
is kind of, it's actually just an oxymoron to even
say what I just said, because there can be no
peace without you know, Palestinians involved, and so that I
think was really important. And also Jared Kushner stands to
(07:14):
gain should Trump's ethnic cleansing of Gaza succeed and the
Trump, you know, Riviera of the Middle East
comes to pass. Jared Kushner has openly said this
is beachfront property. I mean, this is truly on
the bones of hundreds of thousands of Palestinians. Again, official
(07:35):
death tolls stopped back when, you know, there was still any
kind of infrastructure to account for all the dead.
Speaker 5 (07:41):
But I really, I hate to come on this show
and be like, I really agree with the host, because
I want to make it spicy and interesting. But Bridget,
I super agree with you. I really come down
very, very similarly to you on all of this.
Speaker 2 (07:58):
And I will say two things about it. Like, I
do think that the issue of Palestine, the
issue of Gaza and the genocide that's happening, is so bipartisan,
it's so blanket, it's almost cultural in terms of how
much people are against this and in ways that I
(08:18):
think really supersede any party loyalty or any right left situation.
So I think that just has to be named.
Like, people who are against genocide are also people who
are like, I don't know, I don't vote, what
is a Congress?
Speaker 4 (08:30):
Like?
Speaker 2 (08:30):
There are people who are just like, you know, it's
nothing to do with party. And clearly, from Theo Von, dumbass,
talking his way through that, like, I'm not a geologist. Like, bro,
what are you talking about? Like, stalactites here, like,
what, geology? I mean, a geographer? Neither
of those things is actually relevant to this issue. I'm
not a geologist or a geographer. And then, you know,
(08:52):
saying what he said. So I feel like it's both:
it's good that he said this to his
massive audience, but you're absolutely right. If you
have an audience with Donald fucking Trump, which he does,
say something to his ass, say something to that instead.
It kind of feels like, I feel like he probably
(09:13):
got he's got comments on it, people talking about it
around him. He just wanted to address it, and then
he can kind of go on about his way and
not do anything: activate his community, talk about how Trump
is making it worse, talk about all of the genocidal
things, that the removal of Palestinians to, what, Libya is
the new proposal, receiving a fucking four-hundred-million-dollar
(09:37):
jet that now we know Trump solicited from Qatar, and,
you know, just. Yeah, it's pretty sickening.
And so, no, you know, I have to
come back to the, like, it's not for us, Bridget.
There's a little bit of a, like, it's not
for us. And so hopefully people who don't listen to
(09:59):
Theo Von listen to our shows, listen to you, listen to
this podcast and are like, we don't got to go
to war against these people. But I'm not trying to
take time out of my day to like, you know,
give them a bunch of flowers when they're very much
in bed with just fascists at this point.
Speaker 1 (10:15):
Yeah, I feel very similarly to you. So, like, this
is like my Roman Empire. I think
a lot of these, like, bro podcasters, but Theo in particular,
I think they are playing a character. I think that
he plays this kind of, like, aw-shucks stoner everyman, like,
I'm not a geologist or nothing, but this seems bad.
(10:36):
I don't think that he's a stupid person or an
uninformed person. I think that he's making a ton of
money from his association with the Trumps, with the Kushners.
I think he is seeing that, like the situation in
Gaza is a genocide and is quite bad, and as
you said, it's, like, not a left-right issue, it's,
like, a right-wrong issue. Like, I can see what's happening,
and I can see that it's very bad and very wrong.
(10:58):
I think that he's like, oh, shoot, I need to
speak to this, but I want to do it in
a way where I don't have to. I can still
sort of enable it. I can still sort of make
money from my participation with the people who are architecting it.
I don't want to have to, you know, I want
to get to perform sadness about it while also continuing
(11:18):
to basically say nothing about it. And so, yes, yeah,
it's hard for me to, I mean,
maybe I'm cynical, it's hard for me to take this
as a win when it's like, yeah, what do you
mean what are we doing? You have an audience with the
people who are doing this. Talk to them, don't talk
to me.
Speaker 2 (11:33):
Well, uh, you know, I have a lot of
thoughts. And when we started the show, I was like, I
gotta leave soon. But now I'm like, oh no, this is
an issue that I really, really want to talk a
lot about. It's interesting, it's ironic: Trump also said this
same line. He's sitting in front of Netanyahu in the
Oval Office, going, it's terrible, all the killing that's happening there.
(11:54):
And you're like, oh my god, buddy, right next to
you is the guy who's doing it. And of course
he knows right. But it's much easier to sort of
New York Times headline this whole thing, like, Palestinians died
on their own, there is no culprit. And then
you cover your ass because you said something, and then
you get to move on and you don't have to
actually talk about who is doing the thing, but the
(12:15):
other thing that happens here, and this is what I
think about the podcast bros. Because when you said podcast bros,
it reminded me of another interview this week that Bernie
Sanders did with Andrew Schulz on the Flagrant podcast, in
which Schulz tries to say, they call us podcast bros
the way they called your supporters Bernie bros. We're the same.
(12:37):
And I was like, no, no, no, you're not. And
I was very disappointed. You know, as someone who
likes Bernie Sanders, super disappointed by his performance there. I
did a whole breakdown on my show about it, if
you want, like, way too many thoughts of mine. However,
what's crazy about, like, the right-wing podcast comedian rise,
(12:57):
which, I think you're totally right, there is a character
there that they are all playing, and they're doing it
for money, you know. But I was having this
conversation. This is what is crazy for me. Like, every
time you ask a liberal, excuse me, a liberal comedian,
doesn't matter, a podcaster, whatever, John Oliver, Jon Stewart,
(13:18):
Bill Burr to name three of them, and you say, hey,
you seem to have like lots of thoughts about the
world and you know everything that's happening, and you know,
do you feel like your role is almost larger than
just a comic? It's like you're having, you know, an impact.
They all go like no, no, no, no, no, don't listen
to me. I'm just a comedian. I shouldn't be the
(13:40):
person leading this conversation. What do you want to talk
to me for? And I do agree with that in
a way, because what happens when we deify comedians as
sort of thought leaders is your Theo Vons and Andrew Schulzes,
and those two guys have no problem saying, oh,
I am an intellectual, a leader. I do speak to, like,
(14:01):
you know, the voice, the voice of the canceled,
you know, white male, like, that
is who I'm speaking for. And so you have this
moment where it's like, oh, the right is fully embracing
the line between comic and political and politics and the
so-called left, and really the left, but the liberals are
(14:22):
completely abdicating it, almost out of responsibility, but also leaving
this huge gulf that maybe we should be filling. I
don't know what your thoughts are on that.
Speaker 1 (14:33):
That's such, I had never put that together, but that's
such a good point, and it's very frustrating to watch,
because then the voices who get to be amplified and
grow are the ones who are spouting nonsense, and also
spouting nonsense in a way where, like,
they want to be intellectuals with it.
Like, I think that's the thing that gets me. Listen,
nobody appreciates the hustle and a grift more than I do,
(14:56):
Like get your money, do what you gotta do. But
they're like, don't, don't pull us into it and make
it be like, oh yeah, I'm an intellexual for thinking this.
If you want opinions on you know, trans identity, I'm
your guy, me, the comedian. It's like, no, what
are we doing?
Speaker 4 (15:12):
Right?
Speaker 2 (15:13):
And I think it's also the death of the expert.
I mean, as a show that talks about the Internet,
you know, the Internet in its most beautiful
periods and iterations is, like, okay, you're listening to
or seeing someone who you feel speaks in plain language,
is again personable and isn't necessarily an expert, but giving
(15:35):
you kind of their gut, you know, reaction, which can
be good or bad. But then we get into a
lot of trouble when it's like, yeah, we're gonna listen
to non-trans people talk about trans stuff. We're gonna
listen to non-Black people talk about, you know, issues
that affect the Black community. We're gonna, like, all that,
and you're just like, no, no, no, no. Well, now
you're literally elevating. And again, this is not to say
(15:57):
that identity gives you that edge, but we're not going
to listen to, I don't know, a fucking, you know,
ethnic studies scholar whose department is being obliterated through
different Trump executive orders? Like, those are the experts that
we should be listening to. Instead, it's like, the worst
of the Internet is just listening to dumb asses talk
(16:18):
about things they just learned about.
Speaker 4 (16:21):
Oof, you put that perfectly. Let's hit a quick break,
and we're right back.
Speaker 1 (16:44):
Okay, speaking of dumb asses, we have to talk about
this legislation, the Big Beautiful Bill. I know you've been
thinking about it and talking about it. Something that I
don't feel like is getting enough shine is the fact
that this Big Beautiful agenda bill has a rule baked
into it that, if passed, would prohibit states from enforcing any
law or regulation regulating AI for ten years. I mean,
(17:09):
that does not seem like a good thing to me.
The advocacy organization Demand Progress organized a bunch of academic
and, like, civil society organizations, one hundred and forty-one
of them, to sign a letter, including organizations like the
Georgetown Law Center on Privacy and Technology, the Southern Poverty
Law Center, and employee coalitions like Amazon Employees for Climate Justice,
basically being like, this is bad. The letter reads: this
(17:32):
moratorium would mean that even if a company deliberately designs
an algorithm that causes foreseeable harm, regardless of how intentional
or egregious the misconduct or how devastating the consequences, the
company making or using that bad tech would be unaccountable
to lawmakers and the public.
Speaker 2 (17:50):
I mean, it's beyond AI right, because you know, every
day we're finding out about some new shit that Meta
did deliberately. What was it, targeting young girls who, I forgot
Speaker 1 (18:03):
what the story was. It was girls who took
a selfie of themselves and then deleted it because they
were unhappy with it. They have technology that's like,
oh, self-esteem issues, huh, let's get them, and then
they will go.
Speaker 2 (18:16):
And then market to them with whatever, totally, creams. And yeah,
there was, like, some other study that was like, you know,
children are using, like, anti-aging cream. Like, oh
my god. Yeah, man, this is so bald-faced, like,
it is so clear what they're trying to do. And
this is when, kind of, like, the tech oligarchy
(18:40):
really is showing its whole ass. Like, this is, like, okay,
this is what you wanted. Trump is enacting it, and
the Republicans are enacting it for you. A ten year moratorium.
And of course, you know, we believe in states' rights
when it comes to what you want to do with
your own body, but we don't believe in states' rights
on literally anything else. We want to run roughshod over you.
(19:00):
You have a sanctuary city? You know, we want to
sweep up anybody in your sanctuary city. There's no protections
from ICE. And same with this AI stuff, which, you know,
being in California, where Gavin Newsom has been very kind
of, like, flirty with AI, like, oh, you know,
maybe it can be good and we have to study
it, and blah. It's like, you think those
studies have happened and, like, kept pace with the rollout
(19:24):
of AI.
Speaker 1 (19:24):
No.
Speaker 2 (19:24):
No, but it's almost like this is a perfect little
cover to be like, well, what can we do? We
can't regulate against this industry. Guess I must cash
on it.
Speaker 1 (19:36):
Yeah, that's exactly what's going on. And, I mean,
I'm glad that you used that example of
California, because, like, there are so many ways that
AI can be used to make our lives harder as people,
and we already know that AI can be
used to discriminate. So, like, Colorado passed a law
requiring tech companies to protect consumers from the risk of
(19:57):
algorithmic discrimination. And so, like, if AI is making
a decision about employment, they have to inform you that
that has happened, and they have to inform you that
you're interacting with an AI system that we know can
discriminate against you. Where I live in DC, there was
a whole study about the use of AI algorithmic rent
pricing tools, and so like, if your landlord senses that
(20:18):
you're desperate for a place to live and that you'll
pay double for that apartment, they can use AI to
do that. Wouldn't it be great if there was state-based
legislation to be like, actually, you can't do that?
Like, this is legislation that actually has real impact
in people's lives, and so having a ten year moratorium
on it, I think is really bad. And as you said,
(20:40):
like, what happened to states' rights?
Speaker 2 (20:43):
What's going to be left in ten years? Like, I'm sorry,
like, thinking about ten years down the road. I mean,
we are going to reach a breaking point much
earlier than ten years. But, like, AI has already taken
so many jobs. AI is already threatening, I mean, the
health sector, to say nothing of what we do, content creation,
hoovering up all of our voices, our art. I mean,
(21:06):
someone already signs away, if they post on social media,
you know, any copyright to that content. It's like,
what is there left? And these are all
systems that are being trained on our labor. I mean,
it is a massive labor issue while it simultaneously undermines
labor. So yeah, it's terrifying. But also, isn't
(21:28):
Grok amazing? I mean, the way that Grok knows about
white genocide, it's just fantastic.
Speaker 1 (21:36):
Oh my god, you could not have set me up
better for something I want to talk about in a minute.
But speaking of taking people's jobs, it's coming for the
jobs of women. In work traditionally done by women, we are
more vulnerable to AI taking our jobs than in work done
by men. This is according to a new report from
the United Nations International Labor Organization. So this report says
(21:57):
that nine point six percent of traditionally female jobs were
set to be transformed by AI, compared to only three
point five percent of jobs typically carried out by men.
And so, to be clear, the report stressed that human
involvement is still needed when talking about jobs and displacement.
They say: we stressed that such exposure does not imply
the immediate automation of an entire occupation, but rather
(22:19):
the potential for a large share of its current tasks
to be performed using this technology. And so, I don't know,
it really reminds me of what the CEO of that
app Duolingo said, where he was like, oh, in
the future, we won't even need teachers. AI is going
to be better at teaching everybody than any human could be.
By the way, we have no evidence of that being true.
(22:41):
And he went on to say, in the future, don't worry,
we're still going to need teachers because somebody is going
to have to provide childcare. And as a former educator,
I was really, like, that comment hits a lot
of my, like, oh yeah, triggers.
Speaker 2 (22:56):
They called you a childcare provider.
Speaker 1 (22:58):
Yeah, exactly. So for folks who don't know the first
iteration of my career, I was an educator. I spent
most of it in a classroom. And I think the
idea of deprofessionalizing and devaluing education, especially education being, like,
a field that is predominantly women, like, we're
overrepresented in that field. It
(23:20):
really, I'm just deeply offended by this. Like, I'm
offended by the idea that like, oh, AI will be
teaching young people in the future and human educators, people
who went to school, went to college, got degrees, They'll
be essentially like babysitters, gig work style babysitters. And that's
the future that we'll have.
Speaker 2 (23:37):
Right, And the reality is none of it is supported
by any research, any science, any data, any firsthand account.
It is just, we can make a buck, we want
to make money. And there's one aspect we haven't completely
privatized yet, and that is public education. We're going to
destroy it however possible, and AI is one of the ways.
(24:01):
I also think, like, super remote learning, despite what the
right said during COVID, which was like, you know, the
kids and their mental health, and no, no, no, bullshit,
because Betsy DeVos's family has been actively invested in moving
learning online solely. And again, literally, my brother is a
public school teacher in Oakland, and the pandemic was terrible.
(24:25):
Kids did not learn. Remote learning, you know, was
really difficult. It was because we were trying to keep
people safe. But that is no way to actually teach.
And that's with him being an actual teacher, a real person.
But the idea that we're going to just go to,
like, online learning done by AI, I mean, Bridget, it
(24:48):
doesn't really matter when all you want to do is
sort of produce little worker bees who can, you know,
never fight for themselves, or, you know, keep people in
sort of a constant, like, I don't know, third-grade education.
They can get their podcasts and whatnot, but, like, in
authoritarianism, fascism, there's no point in educating people in these systems.
Speaker 1 (25:08):
It really reminds me of this quote from Chris Gilliard
who says every future imagined by a tech company is
worse than its previous iteration. Like, I believe that
in my bones. And when I heard this guy talking
about the future of education, I was like, damn, that
is bleak.
Speaker 2 (25:24):
Yeah, I mean, they really, like, I think that AI
and crypto are just, I mean, when will the bubble burst?
That's the camp I'm in. I'm like, just burst already, God
damn it. But it's got to destroy first and then
maybe it will burst.
Speaker 1 (25:39):
Speaking of destroy, I have to talk to you about
this piece about how Elon Musk's AI facility is
essentially just, like, polluting a Black community. So Elon Musk,
he has this AI company called xAI. They built a
massive supercomputer in Memphis, Tennessee. He's been championing it with
all the sort of usual predictable fanfare, saying it's the
(26:00):
most powerful AI training system in the world. He's been
really selling it locally to this community in Memphis, saying
it's going to be a big source of jobs and money.
But residents in nearby Boxtown, which is a majority black,
economically disadvantaged community that has long endured industrial pollution, say
that this facility is just another threat to their health.
Now you probably know this. The secret of AI is
(26:23):
that it's, like, very power-hungry. So when you are
asking Grok about white genocide, that is using, or, like,
using AI to figure out what it would sound
like if, I don't know, Bugs Bunny read the Declaration
of Independence, or whatever you're using it for, you're using
(26:43):
a lot of power.
Speaker 2 (26:44):
Yes, and displacing a voice actor. Yeah, thank you, yes,
or really good impressionists, because I'm sure a
great impressionist could give you that rendition of Bugs Bunny
reading the Declaration of Independence.
Speaker 1 (26:56):
So Elon Musk's company currently has no air permits, appearing
to rely on a loophole for temporary turbines. And so
this facility is just, like, pumping out smog using gas-powered
turbines that are just, like, completely sickening this community,
and telling the people who live in this community, oh,
this is gonna be good for you, like,
this is for your own good. One resident said, our
(27:20):
health was never even considered. The safety of our community
was never ever considered. She lives three miles from this
facility and already suffers from a lung condition. And I'm
sad to say this is like not anything new for
this part of Memphis. It has seventeen other polluting facilities,
including an oil refinery, a steel plant, a gas-fired power plant. However,
none of those facilities are owned by Elon Musk, who
(27:42):
is obviously, like, in the Trump administration, an administration
that, as we already talked about, is, like, very invested in
defanging any kind of AI regulation.
Speaker 2 (27:51):
Yeah, I mean, what's really also disturbing about this is that,
as you mentioned, Memphis is already, you know, an environmentally,
uh, like, a toxic site because of corporate pollution,
and Donald Trump has completely stripped any attention, any funds,
(28:13):
any you know, any projects that are focused on cleanup
specifically because they mention things like environmental racism or disproportionately
affect Black and brown communities. And, you know, it's like,
because they do that, it's like, well, how come we
can't get our toxic sites cleaned up? It's, you
(28:34):
don't fucking have them, Like you don't have them, my guy.
And so your fragility, because this mentions race, again, it's deliberate,
it will get people killed. And so this is
why it matters to actually talk when you say environmental racism,
that is not just an empty fucking buzzword. It is real.
(28:56):
But yeah, I didn't know that, the turbines. Like, I would
very much like to see, I'm trying to look at
the images. But, like, as someone who's only one time
used AI, when they were like, we're gonna make yourself
look hot in different contexts, I remember that was, like,
I was like, okay. But that's the only time, like,
(29:17):
hand to God, never asked Grok, never ask. Although now,
if you Google something, you're sort of inherently forced
to do AI because it will populate an AI answer
for you. So there's no choice to be like, hey,
I'd like to not pollute Memphis or wherever you know
their data servers and centers are. But I do think
(29:38):
we have, like, where are the exposés on this?
You know, I hope this is just the beginning of
the reporting.
Speaker 1 (29:44):
Totally. And to your point, like, about what this looks like.
So Representative Pearson, who is a lawmaker there, he said
that it's pretty much what you're thinking. He said, it's
an actual gas plant in the middle of a neighborhood.
And you don't need any permitting to do this, something
that infringes drastically and significantly on our system of checks
and balances. Which, like, yeah, that wouldn't be, people who
(30:06):
like, complain about, like, well, when are they going to
clean up our sites? That wouldn't be happening in a
community that was not economically disadvantaged and majority Black. Like,
I'm sorry, this would not be happening in a wealthy,
predominantly white community. That's why it's happening there.
Speaker 2 (30:23):
No, of course not.
Speaker 1 (30:24):
And so I have to give it up to some
of the residents who were quoted in this CNN piece,
because the residents of this area of Memphis have been
fighting this kind of fight for a long time. They
spoke to KeShaun Pearson, a resident who said that they
see their area as a quote sacrifice zone where companies
put facilities that sicken residents. In twenty twenty one,
(30:44):
residents successfully fought off a crude oil pipeline that would
have crossed Boxtown and multiple other predominantly black communities in Memphis.
In twenty twenty three, they successfully campaigned to close a
medical sterilizing facility, which had been there since the seventies
and was pumping out toxic pollution linked to breast cancer.
So like, they have been fighting this fight for a
very long time. And something that I thought was really
(31:06):
interesting is that even though Memphis's Mayor Paul Young
is very supportive of this facility and says, like, oh,
it's gonna bring new jobs and economically revitalize the area,
the residents there are smart enough to be very skeptical
of that claim. They're like, oh, you know, even though
they're making it seem like there's going to be training
for like white collar jobs and like tech jobs, the
(31:26):
residents are like, it's my understanding that data centers do
not typically need large numbers of workers. And I think
that we're being sold to false promise and the only
jobs that might possibly manifest for us or like janitorial jobs.
Speaker 2 (31:38):
And he is right. Yeah, yeah, no, I think that's really important. And I also think it's a challenge, you know, on a local level, on a statewide level. We need better representatives, we need folks who are actually going to fight this. Sadly, not senators like Kirsten Gillibrand or, you know, Ruben Gallego, who this week voted in favor of
(32:01):
basically a cryptocurrency, you know, so-called, you know, uh, like, regulation bill, but it absolutely does not regulate crypto or stablecoins and allows Trump to continue his meme coins and everybody else's. So like again, it's all
kind of part of like the tech oligarchy getting their way.
(32:22):
The other thing that's important to know, and I'm not
sure if this is part of this story, is also
that in order to cool down the data centers that do the generative AI, you need, which is just so funny, like, generative, like, who the fuck is generating? A bunch of power and fresh water. You have to use fresh water to cool down the systems, not even saltwater,
(32:46):
so like they get super super hot because everyone's like
trying to like manufacture you know, the perfect AI girlfriend.
Uh, and then, you know, a bunch of fresh water gets used. Where do you think that fresh water comes from? Probably the same community. So, I mean, it's truly pillaging local communities like this one.
Speaker 1 (33:08):
Yeah. It really broke my heart, but was so accurate, to hear that resident describe his own community as a sacrifice zone that people just take and take and take from, and exploit and exploit and exploit. And they do so for these, you know, so-called innovations, and they're just supposed to be okay with it. They're
supposed to be okay with being deprived, with having
(33:31):
their resources taken and being sickened in the process. Or they're meant to be like, thank you for these opportunities to do this. It's really just a toxic dynamic, and it's
one that I think we really need to change. Like, it makes me angry that lawmakers who are meant to be representing the folks who put them in
(33:53):
office have just, like, abdicated their responsibility on this and
thrown up their hands and said, oh, maybe there'll be a janitorial job in it for you. Like, it's just very deeply insulting.
Speaker 4 (34:06):
More after a quick break, let's get right back into it.
Speaker 1 (34:24):
So Elon Musk was interviewed and he said that he's
going to spend a lot less money and time in politics.
He said, I think I've done enough. We'll see if
I have a reason to do political spending in the future.
I don't really see a reason. He's also talking about
politics a lot less. He used to post about politics nonstop,
but now it's mostly just about his companies. According to
the Post, in February, more than half of Musk's posts
(34:47):
and reposts on X were about Doge. About thirteen percent of his February posts mentioned Trump by name; nearly double that proportion named one of his companies. So far in May, those numbers have reversed. Now under twenty percent of his posts are about Doge or politics, and more than half
are about his business ventures. So yeah, this is a
pretty big shift. And I have heard people say, like, oh,
(35:09):
it's because his companies are taking a hit, he's super unpopular, what he's doing in government is unpopular, so he's retreating.
But I actually see it differently. I think what he's
actually saying is like, I've got what I wanted, so
now I can be done. I have, like you know,
grifted my grift. I have gotten rid of the voices
who would have regulated my companies. I have made some cushy,
(35:31):
lucrative contracts. I've got a little money in my pocket.
I think the government grift has paid off for him.
He has personally enriched himself and now he can back off.
That's what I think.
Speaker 2 (35:40):
Again, I really want to disagree with you here, Bridget, but yes, I think it is that. But I also do think that the amount of Tesla Takedown protests, and I mean hats off to the continued protests that are taking place at all different kinds of dealerships and whatnot, as well as the Wisconsin, you know,
judge seat loss after he poured however many millions and millions of dollars into that and got his ass handed to him by the people of Wisconsin, is amazing. Like,
those two things are huge. So even just on a PR level, he knows that he's not helpful to the MAGA brand. But I do think you're right
(36:25):
that behind the scenes, if you have gotten all of our IRS information, our Social Security data, you know, if you have now allowed for sharing of our private information between federal agencies, in addition to, according to whistleblowers,
(36:46):
possibly foreign governments, you've got a fuck ton of leverage and a fuck ton of ways to make even more money than you already have. So he might have a
little dip right now. You know, he obviously borrowed against Tesla stock to buy Twitter, and that stock's not doing very well. But give him some time, when, you know,
(37:09):
Big Balls and some other little Pepe meme can develop some new payment system for the federal workers who remain, and/or sell our data to a foreign government. I mean, we also know that he's strong
(37:29):
so like bullshit that he's going away, bullshit, he's taking
a break, He's just pressing pause while more evil is done.
Speaker 1 (37:39):
Yeah, the grift is truly global with him. I mean, we talked about this on the show before, but when he was doing his sort of woe-is-me tour, being like, I'm losing so much money, and all I'm trying to do is point out waste and fraud, like, I'm not a bad guy, he'd cry. In my mind,
(38:02):
he is, like, the ultimate welfare queen. He is actually the person who is scamming and personally enriching himself on our dime. He is taking from brokies who live in fucking one-bedroom apartments, like myself, to enrich himself. And he's the richest man on earth.
Speaker 2 (38:17):
Three children. He's, he's definitionally a welfare queen.
Speaker 1 (38:20):
To a T. And so it just is wild to me. I don't know. When he's talking about, like, retreating from politics, I don't think he's retreating at all. I think he can just do more behind the scenes than he can in front of the scenes, because I think that when he's publicly attached to stuff,
(38:41):
people don't like him. Like, yeah.
Speaker 2 (38:43):
He talks, and it's got the opposite effect. I mean, he should have never been in the public eye. They proved that. He also, you know, in Oval Office meetings,
you know, he would get into big fights with different
agency heads and you know, he was not a pleasant
person to be around. Dude was like sleeping on couches
in the White House and not showering and wearing the
same shit over and over again, like you know, all
(39:04):
for the American people, like no, all for massive security
breaches and, again, stealing our information. So here's my thing. If and when Democrats ever regain control, to me, the damage that Doge has done is probably
(39:26):
the biggest thing that needs remedy. And if there is
not accountability, if there is not a restoration of both
jobs, but also our data privacy, of, you know, security protocols. I mean, it's really sick, Bridget, because, you know, we're in a moment where the oligarchy is such, where the lines have been drawn.
(39:47):
A lot of tech people are just like, well, I'm in it for the money, fuck democracy. And all the
sort of veneer of, we are an innovative, future-looking, democratic, multicultural, da da da. All that's bullshit. It's been bullshit.
But I'm like, I know there are good people in
Silicon Valley and in tech. I know there are. They're
(40:08):
not in power, but I know there are, and I
know there are people who could work with the federal
government to help us. Shit. So this is me sort of, like, you know, future-tripping about, how do we get all of this back? How do we wrest this control back from these ghouls?
Speaker 1 (40:25):
Yeah, I firmly believe that. I mean, there was a
time where you had more thoughtful, ethical voices in tech who genuinely were interested in using technology to build a better future. And I think that, you know, a lot of those voices got sidelined, erased, pushed out. Those happen to be diverse voices as well, and so those people
are there, they exist. I think that people like Elon
(40:49):
Musk and other tech leaders have really been able to
flip the script and make it seem like, one, technology only gets built at a handful of companies by a handful of people, which is so not true, and two, that the only people who are doing anything, or the only voices worth listening to in tech, are the most unethical, not thoughtful, paranoid, like, delusional, not-with-it people.
(41:14):
It's like we have gotten mixed up and are being taken along for the ride by these horrible people who are telling us this is
of these horrible people who are telling us this is
the way it has to be. And I know, I
firmly believe that that is a lie. It does not
have to be this way. It was not always that way,
and they're banking on us, like, memory-holing the fact that it used to not be that way. There was a time where, when you thought
(41:35):
about technology, it was exciting. You had apps that you loved.
When you thought about what the internet was, it was
like lots of weird little sites that people were running
just because of the fun of it, or just because
they cared about something, or it was a community of
people gathering, you know, around one idea or one passion.
And I think that we have forgotten. We have allowed these tech leaders to memory-hole that and make us
(41:56):
forget what it used to be like. I don't want to get too nostalgic, because it definitely had problems back in the day as well, but it used to be better than this. It has been better than this, and it can be better than this. We deserve better than this.
Speaker 2 (42:10):
Yes. And actually, the height of freedom, I mean, they're open hypocrites, Elon and, you know, the rest of the Peter Thiels. They're all full of shit. I mean, they're all like, oh, free speech and free society, and it's like, yeah, but the old Internet was the freest we've ever been,
(42:30):
you know, like the Internet being run by ten companies,
tech being run by like three companies is not freedom
at all. So I think that's the other thing is
like we also have to reclaim this mantle of freedom
and free speech. And it's just wild to me that
And it's such a failure of the Democratic Party too,
(42:53):
that like we can I started to get really sick
to my stomach when I think about all the things
that we could have, would should have, But like Democratic
Party could have been the party of privacy and you
know ending, you know, surveillance, and of course Obama did
a lot of good things, but Mom also went after
Edward Stein. And it's like, no, we could have actually
(43:14):
planted a flag on that and been like, we protect
your data. There's nothing more important than protecting your data.
You have the freedom to you know, use the Internet
as you want, and it shouldn't be just a bunch
of conglomerates. And Biden's antitrust FTC, you know, started doing some of that. So it's like, God, it makes me so terrified, because this is one thing where
(43:35):
also Democrats still believe in that old culture of tech
and they use it again as like a shield to
still raise money from actually really villainous people.
Speaker 1 (43:48):
I think that Democrats get to, in ways that are unearned in my opinion, enjoy being seen as a little bit better on the issue, even though they're not really doing much on it. And if we could go back in time and have the Democratic Party, yeah, as you said, make privacy and internet freedom part and parcel of what it meant to
(44:10):
be a Democrat, imagine where we would be today. Things would be so different.
Speaker 2 (44:15):
Right. And actually, I do think the grift of, oh, I'm a forward-thinking, you know, pro-democracy, multiracial, multireligious, like, supports-DEI-programs company, I think that grift was kind of meant to pull the wool over some liberals' eyes, and
(44:36):
that, you know, the final form really is monopoly and control. And, you know, now Mark Zuckerberg has gotten rid of Sheryl Sandberg, which is just so funny and rich. I could write, like, a ten-episode podcast about the rise and fall of Lean In, you know, because it's like, nah, man,
(44:57):
the final form is, we're in control, we're the oligarchs, fuck you, you allowed us to get this big, ha ha ha. We just played with some, you know, lightweight DEI programs, which are good, again, but we don't actually care about real equity or real inclusion or, you know, actually making sure our
(45:21):
spaces are not hotbeds of racism and misogyny. So anyway.
Speaker 1 (45:28):
First of all, I have met Sheryl Sandberg in person, and I don't know if I've told this story on the podcast before, but I was, like, barely invited to a
book launch she did, and I will never forget it
was sort of like it was right after Trump got elected,
and it was like it was at a very different time,
and I remember it was all of these like journalists
(45:49):
and, like, women in media, like, rah rah women in media. And this was right when we were starting to learn more about Cambridge Analytica and the role that Facebook had in getting Trump elected. And Sheryl Sandberg is there doing a very kind of puffy Q and A about her new book about grief, Option B, and
(46:11):
somebody asked, she was like, oh, I've been reading about the way that Facebook has been perhaps illegally using all of our data to, you know, help get Trump elected and meddle in our elections. And I'll never forget, Sheryl Sandberg was like, today is about women's empowerment, let's stay focused. Like, that's how she answered the question. Today's about the women.
(46:32):
Let's let's let's keep our focus on that.
Speaker 2 (46:34):
I have so many words, but that is truly disgraceful.
Speaker 1 (46:39):
It was, And I have to say, you're it's almost
like you are a mind reader with the whole company's
kind of limply embracing DEI and then just using it
to gobble up more power. Because did you hear about
Verizon and DEI, So basically, in twenty twenty, Verizon claimed
to be, like, all in on DEI. They put out
(47:01):
statements after the death of George Floyd that were like, diversity and inclusion are our greatest superpowers, yada yada yada. Cut
to this week, Verizon sends a letter to the FCC
stating that it is ending all DEI policies. So no more DEI teams or roles, they're taking all the references to DEI out of their training materials, no more
workforce diversity goals, and they're taking it further because they're
(47:24):
basically like memory holding it. They scrubbed the fact that
Verizon was ever involved in any of this from their website.
When you go to the DEI pages online, it just
redirects to generic Verizon content, and.
Speaker 2 (47:37):
Oh my god.
Speaker 1 (47:37):
Verizon said that this was because of, like, Trump's executive orders and Supreme Court and federal mandates, but there's no Supreme Court ruling banning DEI or, you know, those kinds of goals. And certainly other companies, like Costco, when Trump was like, you need to stop DEI, they were like, kick rocks, we'll do what we want.
And the timing of this is very suspicious, because Verizon
(48:00):
announced this just a day before the FCC approved Verizon's twenty billion dollar acquisition of Frontier Communications. The FCC's approval emphasized that Verizon had gotten rid of their quote discriminatory DEI policies, and it really does seem like Verizon was like, fine, we'll abandon all of our commitments to DEI if we're able to get
(48:22):
this merger, and the FCC was like, bet, sure, great, dude. That's exactly what happened.
Speaker 2 (48:29):
It's not even subtle. That is exactly what happened. There's no mystery here. That is so terrifying and disgusting. I mean,
and you see the same thing with, you know, CBS, right? Paramount is trying to, what is it, merge with, like, Skynet? News one? I don't know where the fuck,
(48:49):
it's Sky-something. And they're trying to get their merger, and it's like, yeah, oh, we can just do this
by getting rid of all of these policies. These are
going to have massive impacts on workers. I mean, that's
I think the other thing that we need to remember
about the anti DEI backlash is it's it's really it's
a worker's fight. It's a fight on an attack on workers.
(49:11):
Everything we're seeing is an attack on workers, and we have to, again, remember, you know, that the working class is women of color, predominantly women of color. That is just what it is. The amount of Black Americans in federal jobs, what is it, like, twenty percent of the workforce or something? I forget the exact figure, but it's a huge percentage.
(49:32):
This is deliberate. It's an attack on workers. It's a
racist attack on workers.
Speaker 1 (49:36):
Absolutely.
Speaker 2 (49:38):
But yeah, that's so sick. God, I'm so glad Verizon has shit service in my neighborhood and we had to switch to AT&T. I'm sure AT&T is up to no good too, and I'm gonna be a T-Mobile bitch at some point. It's gonna have to happen.
Speaker 4 (49:56):
More after a quick break, let's get right back into it.
Speaker 1 (50:12):
Wait, I have to ask switching gears a little bit.
Did you hear about this fake AI generated booklist that
was published in the Chicago Sun Times.
Speaker 3 (50:22):
Oh?
Speaker 1 (50:22):
My gosh, okay, it's all I've been thinking about for
the whole week. So it's just like a very wild
story to me. Basically, some newspapers, like print newspapers, including
the Chicago Sun Times and at least one edition of
the Philadelphia Inquirer, published a syndicated summer booklist that was
like obviously AI generated that included fake books, like made
(50:43):
up books by famous authors. So it said that Percival Everett, who won the Pulitzer this year for fiction, was coming out with a book called The Rainmakers, supposedly set in quote the near-future American West, where artificially induced rain has become a luxury commodity. Bullshit, that book doesn't exist. He did not publish that book.
Speaker 2 (51:02):
Oh my god.
Speaker 1 (51:03):
They said that Chilean author Isabel Allende was putting out a book called Tidewater Dreams, and they described it as her first climate fiction novel. The first! The very first! These are not real books.
Speaker 2 (51:18):
In fact, Octavia Butler.
Speaker 1 (51:20):
She would beg to differ. In fact, of the fifteen books on this, like, must-read twenty twenty five summer fiction list, only five of the books were real. The rest of them were all made up.
Speaker 2 (51:32):
Yo, So they straight up outsourced their reading list to
AI and thought no one would notice.
Speaker 1 (51:37):
Yes. And, like, it would be bad enough if it was just an online thing, but this was in their print edition. People got pictures of themselves holding the print edition of this. So how did this come to be? I saw on Bluesky that somebody who worked at the Chicago Sun-Times was like, we're looking into how this got published. Which,
(51:57):
that also raises questions, like, you don't know how this got published? Very good question, definitely worth looking into.
So it gets a little stickier. According to Victor Lim, the marketing director of the Chicago Sun-Times' parent company, Chicago Public Media, the list was part of licensed content provided by King Features, a unit of the publisher Hearst Newspapers.
According to NPR, even though the piece has no byline,
(52:20):
a writer did claim responsibility for it and did say
that it was generated by AI. Obviously, he says, huge
mistake on my part and has nothing to do with
the Sun Times. They trust the content that they purchased
from me is accurate, and I've betrayed that trust. It
is on me one hundred percent.
Speaker 2 (52:36):
Wait, I love it. I love it. Reading a little bit more: this guy, the freelancer, says that while he does sometimes use AI to create content, he typically checks it before submitting it.
Speaker 1 (52:49):
Oh, so you, what, I can, what I can... Yo, Bridget.
Speaker 2 (52:54):
Like, I used to write when I lived in Argentina years and years ago, I mean really, like, ten years ago. One of the freelance gigs that got farmed out was just, like, writing travel guides for some online site about cities I had never been to. But I did my research and
(53:15):
looked at other guides and looked at the city explanations, and I didn't plagiarize, but I wrote my own version for this guide after reading a bunch of other guides. Now, that job right there, which paid maybe, I don't know, a hundred dollars, is definitely being farmed out to AI as we speak. But I
(53:38):
love the idea that, if I had that job now, my lazy ass couldn't even do my own fucking research on a listicle, on a fucking listicle of books. And ironically, they're
Speaker 1 (53:50):
Books you have to read.
Speaker 2 (53:53):
You don't even have to read the book, you just got to read the title.
Speaker 1 (53:56):
You just go to the books that have been released in.
Speaker 2 (53:59):
Twenty twenty five. Just go to publishers. I don't even follow books, but, like, I don't know, Penguin Random House. It's so wild to me that someone... And look, if he came clean and was like, listen, I got fifty dollars to do this article, suck it, Sun Times, I'm like, I'm with him. But come on,
(54:21):
my guy, how many people in Memphis can't breathe because you had to do this? How much fresh water was used so we could create a fake Isabel Allende book?
Speaker 1 (54:34):
And I mean, the commentary around that pretty much said exactly that. Kelly Jensen, who is a former
librarian and the editor of Book Riot, had a very
good point. She said, this is the future of book
recommendations when libraries are defunded and dismantled. Trained professionals are
removed in exchange for made up, inaccurate garbage. And it's
just like what you were saying, Like you couldn't even
(54:57):
scroll Amazon and be like, well, I haven't read this book, but it's a book you could buy, that exists.
Speaker 2 (55:01):
Exactly, exactly. It's real. Like, what was the prompt? Like, AI is also bad, you know what I'm saying? AI is not actually good at it. And even if it were good, and in some cases it is, I still contest, fuck AI. And I just, again, I
(55:22):
don't know, like, especially as writers, because what you're doing is you're actively undermining your goddamn job. I don't know.
I have a lot of thoughts, because I came from AJ+, you know, Al Jazeera Media, and we were spearheading a lot of this short-form content, you know, text on screen over images, right. NowThis News and
(55:44):
AJ+ were really big on Facebook. Now all of that stuff is just AI, I think, I'm assuming. I believe it can generate, like, a couple of images from a story, text on screen, and then done. So it's not like, oh, yeah, we shouldn't have created the Terminator, you know, or the T-1000,
(56:07):
but there is a little bit of, like, what happens with, I don't know, the sort of shorthand-ification and shittification of content and news in general. I'm having a little bit of an existential moment here.
Speaker 1 (56:22):
Yeah, I mean, I wish so deeply I could say I've never used AI, but I have been known to use ChatGPT if I have to write a rough email, like an email where I'm like, oh, I really don't want to write this, or I don't know what to say. Like a complicated email, right, where I have to convey a complicated set of feelings.
Speaker 2 (56:43):
Oh, okay, complicated feelings? Let's ask AI.
Speaker 1 (56:49):
I feel like that's probably the worst, the worst use.
Speaker 2 (56:52):
Tell me you're talking about a breakup without telling me.
Speaker 1 (56:56):
Yeah, right. I'll just say, if you're dreading writing a breakup email, it could help you. I'll put it that way. But I will say, having had a big think about it, I had a similar existential, like, spin-out about this, because I was like, don't I owe the people who are actually in my life the time to sit down and put
(57:17):
my thoughts together? Even if it's hard, even if it's cathartic, even if it's difficult, don't I owe them me, the human, actually sitting down and writing out how I feel before I send it, as opposed to just asking ChatGPT to do it for me?
I really had a, like, deep, deep, I don't want to say crisis, but, you know. And I think
(57:38):
when I hear people like Mark Zuckerberg talking about the future of friendship, how, you know, Facebook, he admits, is no longer for friends, and in the future we'll have AI friends on Facebook and that's who we'll talk to,
It's like, that's not a future that I want. Even
when human to human connection is difficult or hard or emotional,
that's the point of being alive. Like, I don't want
(57:58):
to outsource that. I don't want my friend to be AI.
Speaker 2 (58:02):
But also, that is the point when you are an unsuckable little shit, uh, you know, and you got into tech by ranking chicks rather than talking to them to their face. That's what happens. I mean, again, we're just creating... Like, the word tech bro has been
(58:25):
around forever, and it is a smarmy kid that cannot make eye contact with anybody, but also makes more money than you'll ever see in your life and dresses like shit.
Speaker 1 (58:36):
Yeah. And are these the people we want to take our cues for the future of friendship from? Like, do you want someone like Mark Zuckerberg in charge of using technology to design the future of what friendship will look like in your life? I would argue no. I would argue that, again, that iteration of the future is worse than the one that came before it.
Speaker 2 (58:56):
Okay, bridget question for you. What happened to the metaverse?
Is it still a verse?
Speaker 4 (59:02):
No?
Speaker 1 (59:03):
They kind of abandoned it. Yeah, they kind of abandoned the metaverse. They were all in on it. They even had a big announcement when their metaverse avatars finally got legs. They had a whole big announcement, they're like, we've got legs now. It took them a long time to crack the legs code. But no, it's
(59:23):
no longer really a thing.
Speaker 2 (59:25):
Okay, so it kind of went the way of the Google Glass, because I've been talking to people about, like, look, it is possible to shame people's product to shit.
Speaker 1 (59:36):
You know? Yeah, do you remember when Google, I mean, you live in California, so maybe you lived through this, when Google Glass first came out, all the stories were like, I went into a bar in San Francisco wearing Google Glass and I got thrown out. It was just story after story.
Speaker 2 (59:51):
It was crazy, that story. I don't actually think there were that many stories, but there was that one story where this woman goes into a bar with the Google Glass and then writes some posts as if she's the victim. But everyone was like, honestly, you sound so unfun, and I would have kicked you out too.
(01:00:14):
And the reverb of that story, of how, again, before we even had Karen language, how Karen it was, and how corny it was, and what a tech idiot and kind of just a loser you looked like, it was magical. It's almost, I don't know, mythical, that story,
(01:00:35):
because it really tamed the Google Glass wearers. And I don't know, what's the big, the big Oculus, or the Google one that people kind of walk around in? Oh, shoot, what is that? Oh, it's the new Apple VR. What is it, the Apple Vision Pro? Oh, yeah,
(01:00:55):
the Apple Vision Pro. Like, I don't know if people are actually wearing that out.
Speaker 1 (01:01:03):
So I have a theory that people don't want wearables. I've said this from day one. The only wearables I've actually ever seen anybody wear are the ones that essentially look like sunglasses. So I will say, Meta does have a collab with Ray-Ban. I would never wear it, but those are the only wearables I think people would actually wear,
(01:01:24):
ones that just look like regular sunglasses. I don't think people want big shit on their face. Like, imagine sharing a space with your partner or your roommate, and as opposed to holding your phone, you've got a big thing on your face. It's, like, too much shit on your face. I don't think people want that.
Speaker 2 (01:01:42):
That is my like, Yeah, there's too much.
Speaker 1 (01:01:45):
I don't even want to be around anymore.
Speaker 2 (01:01:47):
Yeah, I don't want to be around it anymore. There's too much shit on me. Words to live by.
Speaker 1 (01:01:50):
Really. Oh, Francesca, talking to you is such a delight. Oh my God, you just bring such an energy and a light to these stories. I do this every week, and you really bring a flair to this. I really appreciate it.
Speaker 2 (01:02:07):
Oh my God, likewise, thank you so much. These are great stories, and also stories I haven't even talked about this week, which is wild because I talk about everything. No, but that's why I think this show is so excellent, because you're like, you missed this, and everyone's over here, but Bridget is over here, and these stories are very important. But yeah, man,
(01:02:28):
thank you for having me.
Speaker 1 (01:02:29):
Where can folks listen to The Bituation Room?
Speaker 2 (01:02:32):
I was gonna segue into the plugs. You just teed me up instead. YouTube dot com slash Franifio, F R A N I F I O, for the live show Tuesdays, Wednesdays, Fridays, one pm Pacific, four pm Eastern, or The Bituation Room wherever you get your podcasts.
Speaker 1 (01:02:48):
And you can find me on social media. I'm on Instagram at Bridget Marie in DC, I'm on TikTok at Bridget Marie in DC, and I'm on YouTube at There Are No Girls on the Internet. Yeah, that's right, I just started it, so I have not memorized it. You'll be, like, my ninety-third follower. Subscriber? Early! Thanks so much for listening.
(01:03:09):
We will see you on the Internet. If you're looking for ways to support the show, check out our merch store at tangoti dot com slash store. Got a story about an interesting thing in tech, or just want to say hi? You can reach us at hello at tangoti dot com. You can also find transcripts for today's episode at tangoti dot com. There Are No Girls on the
(01:03:30):
Internet was created by me, Bridget Todd. It's a production of iHeartRadio and Unbossed Creative, edited by Joey Pat. Jonathan Strickland is our executive producer. Tari Harrison is our producer and sound engineer. Michael Amato is our contributing producer. I'm your host, Bridget Todd. If you want to help us grow, rate and review us
Speaker 4 (01:03:47):
On Apple Podcasts.
Speaker 1 (01:03:49):
For more podcasts from iHeartRadio, check out the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.