Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet is a production
of iHeartRadio and Unbossed Creative. I'm Bridget Todd, and this
is There Are No Girls on the Internet. Welcome to
another installment of our weekly news roundup, where we round
up all the stories at the intersection of tech, culture,
and social media that you might have missed so you
(00:25):
don't have to. And I am thrilled to be joined
once again by my producer Mike.
Speaker 2 (00:30):
Mike, thank you for being here.
Speaker 3 (00:31):
Bridget, thank you for having me.
Speaker 1 (00:33):
I wanted to first give a little bit of an
update on a story that you and I covered together
earlier this week about this wave of AI generated videos
that are flooding TikTok right now that feature black people,
typically black women, but some black men in these very
over the top exaggerated racist ways, to the point where
(00:54):
I argue they are essentially the new iteration of minstrel
shows for the digital age. Minstrel shows were an incredibly
popular form of entertainment in the nineteenth century where mostly
white performers would wear blackface to make fun of
black people and portray us as stupid and lazy. But
they weren't just entertainment. They were also a way to
affirm a political and social climate hostile to black folks
after slavery. This is essentially what these videos on TikTok
(01:17):
are doing, just using AI and we made that episode.
Speaker 2 (01:20):
We recorded it about a week ago on Saturday.
Speaker 1 (01:24):
It has not even been a full week, and already
I have seen more and more iterations of these videos
on TikTok. One of the hallmarks of these videos that
we talked about in that episode, it was really a
deep dive into where these AI generated videos are coming from,
the historical context of them, the technological context of how
they're being made, and what they say about our culture today.
(01:44):
One of the points that we made is that AI
generated content oftentimes gets more and more extreme, and so
in just a week's time, I feel like the version
of these videos we were talking about has already kind
of been cranked to eleven in terms of the extreme
qualities that I am seeing in this kind of content.
Speaker 3 (02:03):
Yeah, one hundred percent. After we recorded that episode, over
the weekend, I created a TikTok account, which is not
something that I'd had before, and my for you page
was just filled with those kinds of videos and the
ones that are there today. I looked just a couple
(02:24):
of hours ago before we got on the mic here.
They're so much more extreme than the ones that you
and I were concerned about just like four or five
days ago. So just more extreme. And they, like to
be clear, they started off extreme and now they're like
so over the top. I can't believe that I'm looking
(02:44):
at a mainstream social media platform like TikTok and not
some kind of like video-powered 4chan. No.
Speaker 1 (02:51):
Absolutely, So there's one in particular that I saw that
I want to talk about now because it honestly, not
only has the racism been turned up to eleven, but it's gotten
even more horrifying, if you can believe that. So one
of the types of these videos that we talked about
in that episode was what people are calling slave talk,
which kind of shows these AI generated enslaved people: imagine
(03:14):
if they had social media and they were vlogging their
experiences on the plantation. And so that is bad enough,
end of sentence.
Speaker 3 (03:23):
However, the premise is insane to begin with.
Speaker 2 (03:28):
It's noxious, it's terrible.
Speaker 3 (03:29):
Noxious is a better word.
Speaker 1 (03:31):
They have found a way to make it even worse
because I saw videos where someone was making these slave
talk Ai generated TikTok videos that showed an enslaved person,
you know, using AI to demonstrate what their time on
a plantation might have been like and having them say, oh,
slavery wasn't that bad. I don't actually mind it. Being
(03:52):
an enslaved person on a plantation isn't so bad.
Speaker 2 (03:55):
But they were.
Speaker 1 (03:56):
Utilizing TikTok shop to use these videos to sell a
solar powered sun hat that has fans built into it,
and so it was an enslaved person being like, oh,
my time on the plantation is hard, but this hat
has made it so much easier. Go to my TikTok
shop and my bio to buy this hat. Now, I
(04:18):
will say enough people complained about this video that TikTok.
Speaker 2 (04:21):
Took it down, so it's no longer available.
Speaker 1 (04:23):
They're no longer selling this hat using abhorrent AI
generated enslaved people fan fiction.
Speaker 2 (04:30):
However, this is what I mean.
Speaker 1 (04:32):
I feel like this really... even just in a few
days since we made that episode about this kind of content.
The fact that somebody would put this kind of content
on TikTok and then utilize this kind of content to
sell a sun hat really just says a lot. I mean,
I guess I gotta say I'm happy TikTok took it down.
But when I talked about the ways that I thought
(04:55):
that this kind of content really said something about how
extreme we are willing to go and how AI really
allows us to do that at scale, when I made
this episode, I didn't think it was gonna get this dark,
and it got so dark, so fast.
Speaker 3 (05:10):
So dark, so fast, And something about using the technology
to sell such a pedestrian item as like a solar
powered sun hat with a little fan on it is
like extra dark. Like, it's so almost benign, built
(05:33):
on such an absurd marketing premise that it's actually like
unimaginably dark.
Speaker 1 (05:42):
It's pretty bad, and there's just something so dystopian about it.
And you know, I made this point in our deep
dive about this kind of content that it really bothers
me that this is the use case for AI. We're
being told that this is the linchpin of our economy.
It's going to be so important, it's going to transform everything,
and the way that we are seeing it used is
just so small and gross, making what I would argue
(06:09):
is AI generated minstrel show content to sell cheap junk
on TikTok shop.
Speaker 2 (06:15):
I just... we're cooked. It's just bad. I will say.
Speaker 1 (06:20):
One pivot of this kind of content that I did
see is in the wake of the news that the
United States bombed Iran. So most of the content that
we talked about in that episode was about how black
women were being shown in this incredibly negative, racist, stereotypical
light where we were aggressive, violent, loud, ghetto, all of that.
(06:40):
But in the wake of the bombing of Iran, this
video went viral that showed what I would have thought
was an obviously AI generated black woman who is supposed
to be a soldier who is celebrating this bombing with
her other soldiers, and she's saying, you know, blessed, she's
saying what sounds like a Bible quote, like, blessed are the peacemakers?
Speaker 2 (07:04):
Blessed are the peacemakers.
Speaker 1 (07:08):
And so even though so much of the AI generated
content that is flooding TikTok that depicts black women now
is negative, what's also interesting to me is that I
don't think it's a coincidence that they chose an AI
generated black woman to be in this video. You know
that I think is pretty clearly AI generated propaganda to
make the United States like really excited about the fact
(07:31):
that we bombed Iran. And when I saw this video,
I thought, well, this is obviously AI. Like, no one's
gonna think this is a real person. I have to
kind of say, like, the comments of that video surprised
me in that the majority of the comments that I
(07:51):
saw did not seem to clock that it was AI,
and it just I mean, I'm not saying anything groundbreaking here,
but the ability of AI to generate propaganda at scale
it was worrisome. And I think especially using black women
as a symbol of that. I mean, I think whoever
(08:12):
created that video chose to have it centered on an
AI generated black woman because I do think there's a
narrative that people trust black women when it comes to
political happenings, you know, the phrase of like trust black women.
And so I think that it's interesting how black womanhood
is used as this easy shorthand when you want to
(08:33):
get people on your side of your propaganda video. But
also you could swipe up and you could see a
video of that same black woman being depicted in an
incredibly demeaning, disrespectful, racist way. And so it's like, I mean,
when do we get to have our humanity depicted. It's
either we're this mouthpiece for like jingoistic, rah-rah American sentiment,
(08:58):
or we're being compared to animals in racist AI generated content.
Speaker 2 (09:04):
There's no in between.
Speaker 1 (09:05):
No one is showing our actual humanity in any of
this content.
Speaker 2 (09:08):
I guess as we I'll.
Speaker 3 (09:09):
Say, like when we first recorded that episode about a
week ago, it was a problem that people were noticing
and talking about, like, hey, what's up with all these
strange videos? And it's only gotten worse. It is interesting,
and you know, commendable that TikTok is taking some of
(09:29):
these videos down, but you know that's not a solution
on the same scale as the people who are putting
these videos up, right, Like, if there are a thousand
people creating a thousand of these videos every hour and
putting them up, and only the most popular generate enough
(09:53):
reaction and calls to take them down that TikTok or
whatever platform actually takes action, that's not an effective solution. So
it's gonna be really interesting to see what happens with this,
Like it can't just keep going on and getting worse.
(10:14):
Something's got to break.
Speaker 1 (10:15):
When I was doing platform accountability work and like working
with the leaders at platforms like TikTok, who I used
to sit down with regularly, some of the folks who
worked there, about their content moderation policies. The
biggest frustration was that it was like playing whack-a-mole,
and there's no whack-a-mole strategy.
Speaker 2 (10:32):
Like I thought we were doing good work, and I'm.
Speaker 1 (10:34):
Proud of the work that we did, but you know,
one comes down, two come up in its place, and
so yeah, I think that it really demonstrates the need
for TikTok to do something meaningful if they want to
get a handle on this kind of content on their platform.
And we have already seen this in the last few days,
how much this content has taken off, become more extreme,
(10:55):
become more racist, and really ratcheted up what they're doing.
So I think it's great TikTok took this one video down,
but that's certainly not going to be enough to turn
the tides of this.
Speaker 3 (11:05):
Yeah, and we see that all over the place, and
not just this category of videos, but like health misinformation,
vaccine misinformation, political attacks, like whack-a-mole is not
an effective solution. Whack-a-mole is like one step removed from
(11:26):
just giving.
Speaker 1 (11:27):
Up, so we will put the link to our deep
dive into AI generated digital blackface, I guess minstrel content,
on TikTok in the show notes.
Speaker 2 (11:38):
Check it out. I'm pretty proud of that episode, so
if you haven't listened to it, please.
Speaker 4 (11:42):
Check it out. Let's take a quick break. And we're back.
Speaker 3 (12:03):
Now that we've got the easy breezy banter out of
the way, should we get into the news round up?
Speaker 1 (12:08):
Well, I mean, speaking of horrible, noxious things, we have
to talk about the absolutely enraging tragedy that happened last
week in Minnesota, where Minnesota House Speaker Melissa Hortman and
her husband Mark were gunned down in what authorities say
was a politically motivated killing. She will lie in state
(12:29):
at the Capitol Rotunda this week, a day ahead of
their funeral. I mean, reading this was sort of like
the saddest milestone, the saddest, most enraging milestone I've ever read.
Hortman will be the first woman and one of fewer
than twenty Minnesotans accorded the honor of lying in state
at the state Capitol rotunda, Which something about that really
(12:50):
horrified me. That in order to be the first woman
to have this honor, she has to be murdered in
an act of political violence.
Speaker 2 (12:59):
It makes me sick.
Speaker 1 (13:02):
And I know this happened a bit ago, but I'm
still as enraged about this as I
was when I heard about it, And I think one
of the reasons why I'm so enraged is how quickly
we moved on from this story. You know, I would
argue that a story in which a person committed what
is pretty obviously an act of like political violence and
(13:25):
terror on multiple elected officials, the fact that that was
like over and done within a few days just does
not sit right with me. And it really just reminded
me how much the mainstream media really just allows right
wing influencers to set the agenda.
Speaker 2 (13:41):
Once it
Speaker 1 (13:43):
Was clear the attacker was a Trump supporter, the right
wing media stopped talking about it because for a while
they were like, the shooter was a Democrat, he was
a leftist, da da, da da. Once they couldn't say
that anymore, they just stopped talking about it. And I'm
enraged at the way that the legacy media really just
followed suit. You know, I knew I wanted to talk
about this story today, so I was searching for some
(14:04):
of the news articles written about it, and most of
the most recent coverage I saw was either from local
Minnesota press or global.
Speaker 2 (14:13):
Press, and I thought, how shameful, Like, aren't we ashamed?
Speaker 3 (14:18):
Yeah, it is shameful, like you say. It really, I
think, demonstrates how thoroughly right wing influencers are just able
to control the media right now, this isn't a story
that serves them any longer. They thought it was for
a little while. Senator Mike Lee made some really regrettable
(14:39):
comments in the immediate aftermath of the killings. But after
it became clear that this was not a flattering story
for the right, they just stopped talking about it. And
once they stopped talking about it, apparently everybody else did too,
and the media did as well. It really feels like
(15:02):
a pretty major event we should be talking about. We
just we aren't used to political leaders being killed in America.
It's a very unusual and scary story, and the way
that the media has so quickly moved on makes it
seem normal and almost expected in ways that are wrong.
(15:28):
I think every American should be like really upset and
concerned and horrified.
Speaker 2 (15:37):
I mean, I think you said it.
Speaker 1 (15:38):
I think that's the point of why you move on:
to normalize it, to normalize political violence against people who
don't go along with the status quo. And I think
in some of the commentary that folks have said, you know,
Lee included, I think that's what they're
trying to do. They're trying to say, this is,
people should expect this. And you know, we've seen
(15:59):
other elected officials talk about how they were afraid to
speak up, afraid to go against Trump. I think this
is what they're afraid of, you know. And I think
the fact that the media is essentially doing that
dirty work for them by saying, you're right, we will
normalize it by not acting like it's a big deal,
and then it becomes not a big deal. I think
(16:20):
that you're right that this is a different kind of
thing than what we're used to seeing in the United States.
And the way that we have just so quickly moved
into like, oh, this is normal. Maybe it'll get three
days of coverage and we'll move on to something else,
really is telling. And it also sparked another thing that
we talk about on the show a lot, which is
just you know, I was really thrilled to see that
(16:42):
Mamdani won the election in New York, and I'm sad
to say that one of my first thoughts was, I
hope he has security. I hope he's safe.
Speaker 4 (16:51):
Like it.
Speaker 1 (16:53):
That's not the way it should be. We should not
be worried about the safety of elected officials and political
leaders in this way. And one of the reasons I
was interested in talking about this on the podcast is
because it really brings up an issue that we talk
quite a bit about, and that is just the way
that our data privacy, or lack thereof in this country
(17:13):
really does pose a specific threat to women and other
marginalized folks in politics, in government, in just like local
civic spaces, because how did the attacker find the lawmaker's address?
Super shady data broker sites that in the United States,
because of our lack of any kind of meaningful, functional
data privacy laws, allow for anybody's information to be easily
(17:37):
found online.
Speaker 2 (17:38):
If you're willing to pay for it.
Speaker 1 (17:39):
MSNBC published the items the attacker had on him during
the murders, and it included photos of a notepad that
they found in his car.
Speaker 2 (17:46):
With a long list of people search sites.
Speaker 1 (17:49):
Where anybody can basically use them to find the home
address of anybody in the US. So we know that
before he killed Hortman and her husband, he shot State
Senator John Hoffman and his wife, Yvette Hoffman, first. So
the gunman had notebooks in his car containing the names
of more than forty five Minnesota state and federal public officials,
including Hortman's name and her home address. Senator Ron Wyden
(18:11):
said in a statement, the accused Minneapolis assassin allegedly used
data brokers as a key part of his plot to
track down and murder Democratic lawmakers. Congress doesn't need any
more proof that people are being killed based on data
for sale to anyone with a credit card. Every single
American's safety is at risk until Congress cracks down on
this sleazy industry. And he is one hundred percent correct.
Speaker 3 (18:34):
Yeah, And I mean, this is the most egregious possible
example of the sort of harms that can happen when
everybody's personal information is just out on the internet available.
But there are so many other harms that are more
common than murder, like people have their identities stolen, people
(18:56):
get harassed, things that happen so much more routinely that
do not reach this level. It shouldn't take like highly
visible public figures getting murdered to call attention to this
(19:17):
very serious problem. And yet here we are.
Speaker 1 (19:21):
No, and this is not even the first time that
this kind of thing has happened. So back in twenty twenty,
this men's rights activist used information that he got from
a data broker site to find Esther Salas, a judge
who was appointed by Obama who had dismissed his lawsuit
challenging the men's only draft. He used that information that
he got from a data broker site to find and
break into her home and shoot and kill her child.
(19:43):
And So, if you're listening and you're thinking, well, certainly
my home address is not on one of these shady
ass sites. I have never put my address on the internet.
I am so sorry to be the one to tell
you this. Your address is probably on the internet, because
how does people's information get on these shady ass sites?
We did a whole episode about this with the founder
of an organization that helps women avoid doxing, which we'll
link to in the show notes. But if you've ever voted,
(20:06):
if you've ever had the utilities turned on in your home,
If you've ever paid a parking ticket, odds are your
information is available for purchase online. And here's the kicker,
it might have even been put there for sale by
your local government.
Speaker 2 (20:22):
It is a travesty.
Speaker 1 (20:23):
It is disgusting, it is disgraceful, and it is dangerous.
So the police found the list of eleven data brokers
in the SUV driven by the man who murdered the
Minnesota state representative and her husband, and the list naming
the data brokers also included notations about which sites were
free to use and how much information they require to
obtain detailed data about the individuals being searched. This is
(20:46):
according to an FBI affidavit. So yeah, he just basically
was able to go on to these shady people finder
data broker sites to find this information and it led
to two people being murdered. This is just so...
I mean, the lack of us having any kind of
meaningful data privacy laws in this country can lead to
people literally getting murdered.
Speaker 3 (21:08):
I think part of what makes this so maddening for
me is just the fact that it doesn't need to
be this way. As a society, we could choose to
put privacy protections in place, but we don't. We've allowed
tech companies and scammers who profit off of selling these
(21:29):
data brokerage services, We've allowed them to convince us that
it's already too late. We should just abandon the idea
of privacy entirely and embrace digital nihilism. And I hear
a lot of people of like my parents' generation, espouse
these like digital nihilist ideas like oh, my data is
out there, what does it matter? Blah blah blah, But
(21:50):
it does, you know, Like either that or the idea
that any attempts to protect privacy would be some version
of overbearing regulation. It's going to stifle industry. But that's
all nonsense, Like all of it. We don't have to
choose between privacy and innovation. Lawmakers and regulators could put
(22:16):
in place workable solutions to protect privacy, but for various reasons,
we just choose not to.
Speaker 2 (22:24):
We don't have to live like this, like we deserve better.
Speaker 3 (22:27):
Yeah, we do, we deserve better. And they're like in Europe,
they don't live like this. And people in California have
protections that the rest of us don't have, Like there
are models it's it's a choice to live like this.
Speaker 1 (22:43):
Melissa Hortman and her husband should still be alive. We
don't have to live like this. We shouldn't have to
live like this.
Speaker 4 (22:55):
Let's take a quick break. And we're back.
Speaker 1 (23:09):
Well, speaking of ways that we deserve better and ways
that we don't have to live like this.
Speaker 2 (23:15):
Did you see.
Speaker 1 (23:17):
RFK Junior's big plug about wearables.
Speaker 3 (23:20):
He plugs the stupidest shit. Everything that comes out of
his mouth is so stupid.
Speaker 2 (23:24):
But yes, okay. So, Health Secretary Robert F.
Speaker 1 (23:28):
Kennedy Junior announced one of the largest HHS campaigns in
history, his words, to encourage the use of wearables to
track health conditions. You might be asking, what's a wearable?
Things like the Oura ring, the Fitbit: rings, bands, watches,
and even clothes that use tech to track human vital signs.
It can track how many steps you take, your heart rate,
(23:50):
how many calories you've burned, all that kind of stuff.
So RFK Junior said that Americans buying wearables are one
of the keys to his plan to making America healthy again.
He said, we think that wearables are a key to
the MAHA agenda making America healthy again. My vision is
that every American is wearing a wearable within four years,
they can see what food is doing to their glucose levels,
(24:12):
their heart rates, and a number of other metrics as
they eat it. He also tweeted that wearables put the
power of health back in the hands of the American people.
This is horseshit. This is absolute horseshit. And I say
this as somebody who wears a wearable. I went through
quite a lot of trouble to find one that I
felt like was like the least smart, like the you know,
(24:33):
if you've got a smartphone, like the opposite, the dumbest
wearable I could find.
Speaker 2 (24:37):
But so I'm not anti wearable, but this idea that
he could gut public health infrastructure and then move that
responsibility onto individuals by saying it's our responsibility to buy
a consumer product that monitors our health is absolute horseshit
for so many reasons. First, philosophically, the idea that health
and optimizing one's health means giving hundreds of dollars to
(24:59):
a private company for them to be able to access
sensitive information about your body and your health is absurd.
I am not buying that.
Speaker 1 (25:06):
And so what these products sometimes monitor is so much
more than like your heartbeat and how much you sleep.
Kevin Johnson, the CEO of the security testing and consulting
company Secure Ideas, said, we are not just talking
about heartbeat. We're not just talking about your sleep schedule.
We're talking about your location. We're talking about most of
these apps tie into your contacts.
Speaker 2 (25:26):
Right.
Speaker 1 (25:26):
So the fact that RFK Junior is saying that we
should all be moving toward wearing wearables in four years
and that that is going to be the ideal way
for us to take charge of our health,
I completely reject the idea that giving more of my
private intimate information to private companies and paying for the
pleasure is me taking charge of my health. And physically,
(25:50):
wearables are notoriously not reliable. CNET did a study and
found that even good wearables, wearables that, like,
have a pretty good reputation, are often inaccurate. So
if you're trying to like casually measure your sleep, casually
measure your steps, casually measure how many calories you've burned
in a day, fine, But if you are relying on
(26:11):
a wearable in lieu of actual access to medical testing
or information to meaningfully monitor your health. No, bad, it
does not work that way. Wearables and the kind of
information they provide is simply not a substitute for a
robust public health infrastructure.
Speaker 3 (26:27):
Yeah, not even close. It's such a joke. I mean,
it falls apart in multiple ways. For one, his whole
thing is like make America healthy again, where he wants
to return us to some previous state of the glory
days of health. Were people wearing wearables during that previous heyday? No,
(26:49):
not at all. His big initiatives are like getting people
to not take vaccines, getting people to eat more beef
tallow and less seed oils, stuff like that. Like, none
of these things are gonna be immediately visible to wearables.
(27:13):
Like wearables are fine, they're great. You know people who
like them. I use one. It provides interesting information, but
it's not gonna be the like huge difference maker in
the health of the national population.
Speaker 1 (27:33):
Honestly, it's not even worth me going down that road. It's sort
of like, it's so ridiculous on its face that getting
into the specifics of how it won't work and why
it doesn't make sense, it's almost not even worth it.
Speaker 3 (27:45):
Yes, that is exactly right. That's, unfortunately, how we
have to treat everything that comes out of his idiot
mouth. Like, he just says such stupid stuff all the time.
At first, I thought he was just like a dumb person,
but I no longer think that. I think he knows
exactly what he's doing. I think he is performing the
(28:05):
like shapes of an informed health official, knowing full well
that he is doing so deceptively. His references don't back
up what he says. In the same meeting,
he'll talk about like demanding a gold standard for vaccines,
and then he'll turn around and pull up a bunch
(28:27):
of bullshit about wearables when like, there's no evidence that
you know, putting on wearables is going to improve
the overall health of some population. There's zero evidence for that.
He's just all over the place. And you're absolutely right
that like engaging with the things he says in
good faith is a losing battle. It's like more whack-a-mole.
(28:50):
Everything that comes out of his mouth is a goddamn
mole that needs to be whacked.
Speaker 1 (28:53):
So I absolutely agree with you that I don't think
he's just like a stupid person. I think, you know,
he knows exactly what he's doing, and this is all part
of a larger agenda. There is a very good episode
of one of my favorite podcasts, Conspirituality. Like,
I don't recommend other podcasts a ton on this show,
but like, I will put the episode in the show
notes because you need to hear it.
Speaker 2 (29:14):
It's it's so fascinating if you care about this stuff.
Speaker 1 (29:16):
But basically, they were saying that make America Healthy Again
is really all about shifting public health from a public
concern to a private concern that will be managed by
a network of loosely, slash if at all, regulated private companies,
whether it's bogus health testing companies or supplement companies and
(29:36):
now wearables. So I listened to that episode maybe a
month ago, and I was like, that makes so much sense.
That makes so much sense. And then lo and behold
today he's like, oh, you know, everybody should be buying wearables.
In four years time, every American should have a wearable.
That's really what that's all it takes to make America
healthy again. And oftentimes we'll have these make America Healthy
Again influencers getting a cut of that, because those
(30:00):
influencers are now legitimately in the administration. So then you
have situations like the administration's nominee for surgeon general,
doctor Casey Means, who co founded a glucose monitoring company
called Levels and sells a monitoring app as well as
other kind of bullshit wellness products.
Speaker 2 (30:14):
And so the whole idea is shifting the
Speaker 1 (30:18):
Burden and responsibility of health from the public sphere to
the private sphere: selling. Instead of there being, okay, like
robust access to public health, robust access to healthcare, all
of that, it's
Speaker 2 (30:31):
Like no, no, no, no: scam
Speaker 1 (30:33):
Testing that is loosely regulated, and supplements, and also we'll
make a little cut of that on the side.
Speaker 3 (30:39):
Yeah. And also these grifters have built their whole enterprise
on a foundation of like flooding the zone with scammy
information on social media and just overwhelming people with information
and like shiny statistics and stories that like feel one
(31:04):
way even though maybe they don't like actually connect to
a deeper impact in terms of health. And the thing
with wearables is that they produce a ton of information
and it's it truly is a way of taking that
scammy online universe of health misinformation that has so enriched
(31:28):
these scammy people like RFK Junior and the whole network
of his friends who sell supplements online. It takes that
whole scammy information ecosystem and moves it offline onto people's
(31:49):
bodies and into people's health. And it's bad. It's bad.
We shouldn't do that.
Speaker 1 (31:55):
And there's been so much research and reporting about the
fact that wearables, and again I'm not anti wearable, I
wear one myself to measure my physical activity and
my sleep, like I'm not anti wearable, but there's so
much research about the fact that simply having more access
to information about your vitals and your physicality does not
actually make you healthier, and in some ways that level
(32:16):
of surveillance actually might be perhaps counterintuitively, making you less healthy, right,
And so more information is not always the thing that
makes you healthier. And again, even if it were, wearables
are not a substitute for having access to healthcare, being
able to see a doctor, being able to get actual
medical tests from a doctor, not some sort of scam
(32:38):
testing company, actual public health infrastructure.
Speaker 2 (32:42):
And I just hate the way that.
Speaker 1 (32:43):
This has turned our health into just another thing these
people can grift off, another way to scam.
Speaker 3 (32:49):
Yeah, I'm not anti wearable either. I think wearables are
very valuable. And you know, if he were out there
talking about the importance of wearables as part of a
new initiative that's going to connect people and their bioinformatics
(33:11):
with healthcare providers who will monitor them and help them
make health decisions as part of some sort of cohesive
plan to improve people's health, that would be something I
would want to listen to. But there's none
of that follow-through in what he talks about. He's
(33:32):
just like, oh, people should have wearables. That's where it
begins and ends.
Speaker 2 (33:37):
Yeah, that's not a healthcare plan.
Speaker 3 (33:39):
No, it's not. That's a plan to sell stuff.
Speaker 1 (33:43):
Speaking of selling stuff, I have to talk about this
story about Mattel and open Ai. So are you ready
for an AI enabled Barbie Doll or an AI enabled
Hot Wheels car? Because Mattel, a toy maker behind Barbie's
and Hot Wheels, announced a partnership with open Ai that
would result in AI products marketed to kids. So to
be clear, we don't fully know what the product will
(34:06):
be just yet. They're keeping it pretty tight lipped, but
I already.
Speaker 2 (34:08):
Can tell you this. I hate it.
Speaker 1 (34:11):
One anonymous source told Axios that Mattel's plans for the
AI partnership are still in early stages, so we'll probably
know more soon. That source also said that the first
product would probably not be marketed to kids who are
under thirteen. You're probably thinking, Oh, they probably don't want
harmful AI impacting very young kids.
Speaker 2 (34:29):
Don't get too excited.
Speaker 1 (34:30):
They're probably capping it at kids thirteen and up because
of OpenAI's age restrictions on its API, which prohibit
users under the age of thirteen.
Speaker 3 (34:40):
Yeah. And also, just because the company says that they're
not going to explicitly market some product to kids, that
doesn't mean that kids still won't see that marketing and
still want that product and obtain and use that product.
For example, vapes are illegal to market to anybody
under the age of twenty one, and yet many high
(35:03):
schoolers still use them. The most recent national survey suggested
that eight percent of high schoolers are using them regularly. Right,
and so, just because something's not going to be marketed
to kids, that is no guarantee that kids won't use it,
not by a long shot.
Speaker 2 (35:23):
Yeah, exactly, so.
Speaker 1 (35:24):
Ars Technica spoke to Public Citizen's co-president Robert Weissman,
who really laid out just how potentially.
Speaker 2 (35:31):
Harmful this could be to kids.
Speaker 1 (35:32):
He said, Mattel should announce immediately that it will not
incorporate AI technology into children's toys. Children do not have
the cognitive capacity to distinguish fully between reality and play.
Mattel should not leverage its trust with parents to conduct
a reckless social experiment on our young children by selling
toys that incorporate AI. So, when asked about the specifics
(35:53):
of what this toy might be like, both Mattel and
open Ai were like, it's gonna be fine, trust us,
it's all good.
Speaker 2 (36:00):
Don't worry about it. We got it.
Speaker 3 (36:01):
Oh, why are we even talking about this, then? They
said it was gonna be fine.
Speaker 2 (36:04):
Yeah, they were like, we got this, don't even worry.
Speaker 3 (36:06):
All right, next story.
Speaker 1 (36:08):
They both put out statements where they really glossed over everything,
And I will say the statements kind of like said
the right words to signal that they're like, don't want
to harm kids with this product. What's also funny to
me is like, how nothing the statement from
OpenAI is. So in their statement they promise quote
to bring a new dimension of AI powered innovation and
(36:29):
magic to Mattel's iconic brands.
Speaker 2 (36:32):
Like, I'm sorry, what does that mean? Like, what does that mean? Like, break that down for me?
that mean? Like? Like break that down for me?
Speaker 1 (36:37):
It really is giving, Like it's Barbie but now she
harnesses the power of AI or like but now she's
got a new hat.
Speaker 2 (36:46):
You know.
Speaker 1 (36:46):
It just... it's like it says
a lot while saying nothing, I guess, is what I'm
trying to say.
Speaker 3 (36:51):
Yeah, so curious what sort of toys are gonna
roll off the Mattel assembly line, empowered with AI, whatever
that means.
Speaker 2 (37:01):
My god.
Speaker 1 (37:01):
When I was a kid, my brother had a Teddy Ruxpin,
which was the most terrifying item, probably of my entire childhood.
Speaker 2 (37:10):
I don't know if people have.
Speaker 1 (37:11):
Ever experienced teddy rubskin, but it was this nightmare bear
that I think. You would put a tape. He had
a tape cassette in his stomach, and you will put
a tape in it, and the tape would play, and
his mouth would move in this very unnatural, animatronic way,
and then it would play the tape as if he
was speaking, but you could tell he wasn't
speaking, well, because it's like a stuffed animal, but also
(37:34):
like it didn't sync up right, like it didn't look right.
And my brother loved this fucking thing and it was terrifying.
And I just hope for the sake of the next
generation that they're not making some sort of AI enabled
nightmare-style Teddy Ruxpin where the mouth moves but it's
fucking OpenAI ChatGPT. My god. Nightmares. Just
Speaker 3 (37:55):
Like feeding you sycophantic like narcissism fuel about how much
smarter you are than all the rest.
Speaker 1 (38:02):
Oh my god, I mean have you seen that? I mean,
I know you've seen it because we saw it together
the movie Megan. Yeah, so I love Megan and it
was probably the most fun I had at the.
Speaker 2 (38:13):
Theater of whatever year it came out. If you have
not seen Megan, it's about.
Speaker 1 (38:17):
A toy like like a doll that is for kids,
that is like AI enabled.
Speaker 3 (38:23):
But like it's basically this story. Yes, it's basically.
Speaker 1 (38:26):
This story, but also Megan is cunty. Like,
it's this story, but she serves cunt. Also,
they're making a sequel, by the way, oh, which I
will be like first in line to see.
Speaker 2 (38:42):
I'm sure. I'm sure it's gonna be terrible, but I'll
be first in line to see.
Speaker 1 (38:45):
But I mean, I'm glad you compared this,
because I think that the movie Megan
does present a version of like what this might be like,
because we know so much about the ways that AI
can be unsafe for adults even, let alone for kids.
Adam Dodge, the founder of a digital safety company that
prevents cyber abuse called EndTAB, pointed to a lawsuit
(39:07):
where a grieving mom alleged her son died by suicide
after interacting with hyperrealistic chatbots. He said, AI is unpredictable, sycophantic,
and addictive. I don't want to be posting a year
from now about how a Hot Wheels car encouraged self harm,
or that children are in committed romantic relationships with their
AI Barbies. Like, come on, a Hot Wheels that convinces
(39:29):
a kid to self harm?
Speaker 2 (39:31):
No, thank you.
Speaker 3 (39:32):
Yeah, it sounds ridiculous, but it's not outside the realm
of possibility, right because it's people who are on the
borderline of the danger zone already who are most susceptible
to being harmed by products like this. And I think
everybody should be concerned about privacy. Hopefully Mattel is gonna
(39:55):
build some good privacy protocols into their AI toy,
I guess, I don't know. I would think long and
hard about buying any of them for any kids in
my life. But even aside from privacy, I think those
other harms are even scarier. Like, encouraging suicide is obviously about
(40:20):
the worst, but LLMs are already causing mental problems for
a lot of people. You know. Last week on the
News Roundup, you and Ed Zitron talked about chatbots falsely
claiming to have therapy degrees and doling out therapeutic advice inappropriately.
Who knows what harm might come from that. But there
(40:41):
was another story that came out about the same time
last week about people just straight up losing their
minds on Reddit. The moderators of a pro AI subreddit
say that they've had to ban over one hundred people who
kept spamming the subreddit with claims about how they created
a new form of God, or a superintelligence. You know,
(41:03):
all that sycophantic behavior and flattery from the chatbots just
reflecting back what these people wanted to hear.
It broke their brains, and Mattel thinks that they can
safely put that stuff in the hands of children. I
don't know. I just don't buy it, Like yikes.
Speaker 1 (41:23):
Yeah, And I mean you make a good point, like
we don't even fully know how AI is impacting adults yet,
so why, like, encourage children
Speaker 2 (41:32):
To be mixed up with this?
Speaker 1 (41:35):
And I do think it just I mean, we've talked
about this on the show before, but just the sort
of philosophical idea of shouldn't some things be protected like
children and play, I think that's really sacred. And the
fact that open AI sees this as just another thing
to harness, another way to, you know, mine our kids,
(41:56):
exploit their privacy, probably use whatever they're able to glean
from those experiences with children to train their AI further
for their own benefit. It just feels very exploitative. And
I mean I would have thought for a society that
so often gets up in arms about protecting the children,
(42:17):
and nobody wants to protect kids more than me, but
that is often our rallying cry. When it comes
down to it, we fucking hate children. We don't want
to protect children. We will not do the bare minimum
to protect children. The bare minimum is don't let open
AI give them this technology that we already know harms
adults, and we don't even know the full scale
(42:39):
of that yet. That would be the very least we
could do, and we're not willing to do it. And
so yeah, I just I hate this. It reminds me
a lot of the way that we know that social
media platforms like Facebook knowingly harm kids, girls as young
as thirteen, and continue to knowingly do so because it
makes them a profit.
Speaker 2 (43:00):
You know.
Speaker 1 (43:00):
I wish that we had a country where tech companies
making more money was not as important as protecting our youth.
Speaker 2 (43:09):
But we don't live in that country.
Speaker 1 (43:10):
But we love to grandstand about protecting the kids when
it's convenient. When it's just the bare minimum, we do nothing.
Speaker 3 (43:22):
More.
Speaker 1 (43:22):
After a quick break, let's get right back into it. Okay,
so to be really clear up top, this is one
of those stories where I suspect I know what's going on,
(43:44):
but I don't have any.
Speaker 2 (43:45):
Hard proof yet.
Speaker 1 (43:46):
So really I'm just asking questions about what the fuck.
Speaker 2 (43:51):
Is going on here? You got me?
Speaker 1 (43:52):
Yeah, And that is the question of is Spotify pushing
AI generated musicians who do not actually exist, but acting
as if they do exist? I suspect, and lots
of the Internet suspects, the answer is yes. So we
have known that Spotify has had AI generated music on
their platform for a while.
Speaker 2 (44:14):
It hasn't gotten much traction.
Speaker 1 (44:15):
It's been kind of a quiet under the radar thing
because people don't really enjoy one hundred percent AI generated music. However,
now it might be that they are pushing this music
in some kind of shady ways while not disclosing that
it's entirely AI generated and these bands do not exist,
while kind of pretending like these bands are actually human.
(44:36):
So Paul Bender, who is the bassist for the Australian
band Hiatus Kaiyote, who I love, released a new solo
project called the Sweet Enoughs and Spotify pushed his new
song to all of his followers. He has lots
of followers on Spotify. So Spotify was like, Hey, this band,
this guy who's in a band that you like, has
a new solo project.
Speaker 2 (44:53):
Here's the song.
Speaker 1 (44:54):
Only one problem, this was not his song and in
fact he never authorized this. He said it was some
of the most insanely clunky, amateurish, bizarre pieces of audio
I have ever experienced. And then it happened again the
next time with a distorted mumble rap track that appeared
in his Spotify profile. That one, which he said, quote,
(45:16):
basically sounded like Crazy Frog-era Eurotrash. Gotta say that
kind of makes me want to like it, doesn't make
me not want to hear it.
Speaker 3 (45:26):
It's not a genre I'm familiar with, but I am
curious about.
Speaker 1 (45:30):
Crazy Frog-era Eurotrash is not a genre you're
familiar with.
Speaker 3 (45:33):
Yeah, am I missing out?
Speaker 1 (45:35):
I think you might be missing out. So in the
following days a fourth track surfaced. So he was convinced
these songs were AI generated and it made him realize
just how easy it was for platforms to do this,
to create AI generated music and then push it out
to his followers, lying saying that it was him. He
(45:55):
spoke to the Australian outlet ABC and said that he
realized how easy it was to create an AI generated
song and upload it to an artist's profile within ten minutes,
with zero hacking or authentication required. A track can be
cleared and approved to be released in five to seven days.
The whole process is effectively functioning on an honor based system,
which he says is incredibly inappropriate for the music industry
(46:17):
one of the slimiest, most parasitic places in the universe.
Speaker 3 (46:21):
Yeah, that's wild. I had no idea that. So, like
I could just upload a track and be like, oh,
this is the new Ariana Grande track.
Speaker 1 (46:32):
Yeah, you can upload a track of you like farting
into a microphone and being like, this is the new
Ariana, just dropped it. Spotify will be like, sure, okay, yeah,
that's according to Paul Bender, that's how it works. So
this was back in twenty twenty.
Speaker 4 (46:46):
Right.
Speaker 2 (46:46):
The story gets even weirder because.
Speaker 1 (46:48):
Now we might have another AI generated band being populated
via Spotify's curated listener playlists.
Speaker 2 (46:57):
The band is called the Velvet Sundown, which that name.
Speaker 1 (47:01):
Kind of made me, like, something about it,
the fact that it's like very close to the Velvet Underground,
like it just...
Speaker 3 (47:09):
Oh, I went down the rabbit hole, Like there's this,
Oh my god, great story about this. I don't want
to like scoop your thunder here.
Speaker 2 (47:16):
But so.
Speaker 3 (47:18):
It's the name comes from this video game from like
ten years ago, this like super uh like low profile
video game that was released on Steam. It's like a murder
mystery on a cruise ship and the tagline is something
like like you can't trust anyone and nothing is what
(47:40):
it seems, and it's like, what a perfect name for an AI band.
Speaker 2 (47:46):
This is some like taunting letters to the police, O. J.
Speaker 1 (47:51):
Simpson publishing a book called If I Did It nonsense, like,
this is some.
Speaker 2 (47:56):
Like do you know what I'm saying?
Speaker 3 (47:58):
Like, Yeah, listeners should really uh read that piece that
we're gonna link to about the Velvet Sundown. It is
truly an interesting, weird little rabbit hole.
Speaker 1 (48:10):
Okay, so I did not find that information about the
band's name, but I did do a deep dive. I
am confident in saying that, in my opinion, this band
is AI generated. There is just not a
stitch of evidence that this band exists outside of music
streaming platforms. They say the names of the individual musicians
(48:30):
wink wink.
Speaker 2 (48:31):
Who are in the band. Did a Google on them.
Speaker 1 (48:33):
There's not a stitch of evidence that any of them
has existed ever online. I know that I sound like
freaking Charlie in It's Always Sunny. There is no Pepe Silvia.
But I am confident in saying that, in my opinion,
there is no Velvet Sundown. This band doesn't exist. All
of the names that they have given to us are
made up. All the images that they give us are
(48:53):
AI generated. Like, well, we'll put it in the show notes.
But like when you look at this image, it's like,
this is an AI generated band, Like.
Speaker 2 (48:59):
These are not real people.
Speaker 1 (49:00):
Well, now, that doesn't necessarily mean the band is fake,
like, you bet, this could be their, like, visual style. But
this is an AI generated band. This band don't exist.
Like I'm I'm very confident in saying that. So, this
band has more than three hundred and twenty five thousand
monthly listeners on Spotify, and it's not only on Spotify,
it's also on Amazon Music, YouTube, and Deezer and other
streaming services. So questions about this band first gathered steam
(49:23):
on Reddit and then later TikTok, where people were wondering
why this random band with an obviously AI generated image
and zero footprint across all of social media was being
included on Spotify's curated playlists. So Velvet Sundown have one thousand,
five hundred and thirty three followers on Spotify, but three
hundred and twenty five thousand, three hundred and eighty eight
monthly listeners at the time of the publishing of this
(49:47):
like really good deep dive article on Music Ally. Their
bio on Spotify includes a very glowing quote from the
music magazine Billboard saying, quote they sound like the memory
of something you never lived and somehow make it feel real,
which a Google search suggest has never been published by Billboard. Ever,
(50:08):
that is... that is not a quote, it is
it is.
Speaker 2 (50:10):
Attributed to Billboard. Billboard has never published it.
Speaker 1 (50:14):
And also again like if you were making a fake band,
it's like, oh, they sound like a memory of something
that doesn't exist. Like, what are you trying to say?
Speaker 2 (50:20):
I mean?
Speaker 3 (50:22):
That is like the best description of AI I have
ever heard.
Speaker 2 (50:26):
It really is. It absolutely is.
Speaker 3 (50:28):
It feels like a memory of something that I never experienced.
Speaker 2 (50:31):
But somehow it feels real.
Speaker 3 (50:33):
It feels real, it feels right.
Speaker 1 (50:35):
So their music is on the streaming platform Deezer, which
is actually kind of helpful because that service has been
developing technology to identify AI generated music and tag it publicly. So,
according to Deezer, some tracks on this album may have
been created using AI.
Speaker 2 (50:50):
So for me, it's case.
Speaker 1 (50:51):
Closed, right? So we'll link to this Music Ally piece
that they did a very impressive deep dive and they
found that on Spotify. Essentially, these potentially AI generated bands
and songs are getting lots of play from being featured
on the Spotify playlists. If you use Spotify like I do,
you know, their playlists are a big part of the platform.
(51:11):
In my opinion, it's like the only thing that sets
it apart from other music streaming platforms. But essentially, this
fake band is being included on playlists that they have
really no business being on. For instance, a bunch of
their songs are on a Spotify playlist for the OC soundtrack.
Remember that Fox show Orange County, Oh My God, which,
(51:32):
by the way, I watched every episode. I
had the soundtrack. They put out a CD for that show.
That's how deep into the trenches I
Speaker 2 (51:39):
Was into this.
Speaker 1 (51:40):
So the OC was a show that was sort of
known for its music, and like big important moments of
the show would have like a song, and that song
would be like the hit song for the rest of
the week. So they have a playlist, it's like an
OC playlist which includes all of the famous musical moments
from the show like Phantom Planet, Imogen Heap, Jeff Buckley, Oasis,
and two tracks from the Velvet Sundown. So
(52:03):
that means that thirteen point three percent of the entire
Orange County playlist on Spotify is Velvet Sundown, a potentially
AI generated band that don't exist, which is also especially impressive.
Music Ally points out given that the show The OC
ran from two thousand and three to two thousand and seven,
while both of the Velvet Sundown's albums to date came out
(52:26):
in twenty twenty five.
Speaker 2 (52:27):
That's suspicious.
Speaker 3 (52:29):
It's one of the really interesting things that hit me
as I was like reading about this band, because like,
it's just a very interesting story, and it's nice that
it's not like fucking up kids or like destroying democracy,
So it feels like kind of a nice thing to
engage with. But I had never thought about the power
(52:51):
that the people who are creating these Spotify playlists have,
Like it's a lot of power to shape what music
people are listening to, and I guess I had just
never thought about it. I don't know, maybe I'm like
late to the party, but I kind of feel it's
like an under the radar thing.
Speaker 1 (53:11):
Oh, there is absolutely a lot of power in terms
of curating these playlists. I've heard from bands who will
randomly get a song added to a playlist and it's
like that song represents like more streams than they've ever
had in their career. Now, Spotify don't have the best
reputation when it comes to paying artists, so I don't
know, that probably translates to, like, here's
(53:31):
two dollars because they don't pay artists very well. But
if this whole saga with this AI generated band that
doesn't really exist is to say anything about it, I
think Spotify is probably trying to find a way that
to see if they can sidestep human artists altogether. Right,
Like one little pesky thing about humans is that we
(53:51):
really do prefer to be paid for our work.
Speaker 2 (53:54):
And so I think they're like, how can
Speaker 1 (53:56):
We cut out this whole humans liking to be paid
for their labor thing? And obviously Spotify would be doubling
down on AI because their CEO Daniel Ek just announced
a seven hundred and two million dollar investment in Helsing,
a German defense tech startup that develops military drones and
AI battlefield software. Which you know what I think when
(54:18):
I think like Spotify and like music and streaming.
Speaker 2 (54:22):
I think military drones and AI battlefield software? Sure, who doesn't.
Speaker 3 (54:26):
Why does everything have to be military drones?
Speaker 1 (54:30):
Yes, everything is military drones. Everything is AI and military.
It went from everything's computer to everything is AI and
military drones.
Speaker 3 (54:39):
God damn?
Speaker 2 (54:41):
Okay, well, will you indulge me? Can we talk
about Jeff Bezos' wedding?
Speaker 3 (54:49):
Do you think he has some drones? Oh?
Speaker 2 (54:52):
I haven't heard about any, but you know they're in
the mix somewhere.
Speaker 3 (54:54):
All right, Yeah, let's talk about Jeff Bezos' wedding. Is
it going really well? Does everybody love it?
Speaker 2 (55:00):
It's going?
Speaker 1 (55:01):
I mean, I'm loving gawking at what a train wreck
it's been. So Jeff Bezos is marrying his partner Lauren
Sanchez this week, which, by the way, for people who
watch Housewives, you like, when I first saw her picture,
I was like, damn, is that Mia Thornton from Housewives?
Look up a picture of Mia Thornton and look up
a picture of Lauren Sanchez. They could be sisters. They
(55:23):
look so much alike.
Speaker 3 (55:24):
Wow.
Speaker 1 (55:24):
So the wedding is scheduled for June twenty sixth through
June twenty eighth in Venice, Italy.
Speaker 2 (55:29):
So first of all, just.
Speaker 1 (55:31):
Have to applaud what attacking menagerie of fucked up rich people.
Speaker 2 (55:37):
Bullshit this whole thing has been.
Speaker 1 (55:39):
Like when you look through the way that Lauren Sanchez
and Jeff Bezos met, it's very scandalous. When you look
through who's coming to this wedding, Like, the whole thing
is just... I mean, it should really put to
bed the idea that wealthy people, like wealthy wealthy people, have taste,
have refinement, have poise, because this is the tackiest menagerie
have refinement, have poised, because this is the tackiest menagerie
I have ever seen. And I love a tacky menagerie. Like
I'm not even saying this as somebody who's looking down
on this, but even for me, I'm.
Speaker 2 (56:15):
Like, wow, these people are trash. And you know who
else agrees with me?
Speaker 1 (56:19):
The entire country of Italy, because the Italians are celebrating
these nuptials by.
Speaker 2 (56:24):
Welcoming them with waves of protests.
Speaker 3 (56:27):
Yeah, the Italians, a people known for rejecting all things tacky.
Speaker 1 (56:32):
I mean, do you know how tacky, I mean, as
an Italian? Do you know how tacky you have to
be for the Italians to be like, you need to
leave, it's too much, it's too gaudy, it's too much.
Speaker 3 (56:44):
Yeah, it's too gaudy, it's too much. Yeah, as an
Italian American, you have to really work to earn that.
Speaker 2 (56:51):
So activists with Greenpeace, No Space for Bezos, and
a UK based group called Everyone Hates Elon.
Speaker 3 (57:00):
It's a pretty good name for their group.
Speaker 1 (57:04):
They have all been making their displeasure known this week with a large banner unfurled in Saint Mark's Square reading, if you can rent Venice for your wedding, you can pay more tax. Which, like, fair point, like honestly, I would love to see somebody dispute that, like, freaking yeah, understatement of the century.
Speaker 3 (57:23):
Like what if instead of renting Venice for their wedding,
everybody in America got health insurance?
Speaker 1 (57:29):
Yeah, or like yeah, we eradicated poverty, we eradicated
childhood poverty. So they actually have already had one victory,
and that is forcing Bezos to change the venue for
the wedding.
Speaker 2 (57:38):
Reception.
Speaker 1 (57:38):
Organizers for No Space for Bezos told the BBC that
they had gotten the venue moved to the Venetian Arsenal
after threatening to fill the canals with inflatable crocodiles, flamingos,
ducks and unicorns so that none of their I don't know,
superyachts or whatever could get through.
Speaker 3 (57:53):
I mean, I have to say that does actually sound
kind of fun. Like maybe not the dream wedding that
Bezos and his bride had imagined, but like I hope
they still filled the canals with inflatable crocodiles, flamingos, ducks,
and unicorns. I think that might be nice.
Speaker 1 (58:13):
Ooh, I almost wonder if we should do a deep dive into Jeff Bezos and Lauren Sanchez's relationship, because it's, I'll just say this, for folks who, if you know, you know, it is so juicy. Like one of the juiciest bits of it is the fact that the text messages that they were sending to each other while Jeff Bezos was fully married were published by the
(58:37):
National Enquirer. And I mean, it's like one of my, like, it's a text that I have sent in jest many times. I love you, alive girl, is one of the texts that Jeffrey Bezos sent to Lauren Sanchez when they were like running around behind their partners' backs.
Speaker 3 (58:51):
Alive girl.
Speaker 2 (58:53):
Alive girl.
Speaker 1 (58:54):
That also was like, what, what's the alternative? Like what kind of girls were you running around
Speaker 2 (59:01):
with that were not alive? I have many questions.
Speaker 3 (59:07):
I guess it was a dig at his wife.
Speaker 1 (59:09):
I don't know, so, yeah, maybe we should do a
full deep dive into their relationship because it's fascinating to me.
Speaker 2 (59:16):
But in getting them to move.
Speaker 1 (59:18):
Where their wedding was going to be, the group No Space for Bezos said, we are very proud of this. We are nobodies, we have no money, nothing, and we're just citizens who started organizing and we managed to move one of the most powerful people in the world, all the billionaires, out of the city. Now, a Greenpeace organizer said that it wasn't so much about protesting these two specific people, but more what they represent. The richest live
(59:41):
in excess while others endure the consequences of a climate
emergency they did not create. And I have to say,
living in DC, we were going through a historic heat
wave and I did have this moment where I was
reading about this wedding and the huge environmental impact that it will definitely have, and just like
getting this note from my local government that was like, oh,
(01:00:04):
in a heat wave, the best temperature to set your
air conditioner at is seventy eight, and I was like, the hell you say, seventy eight my ass. And I just had this moment of like, why are brokies like me expected to sweat it up in our one bedroom apartments with our window units and our box fans while he's able to have this lavish, voluntary wedding display with
(01:00:26):
seemingly no regard to how it might impact the climate.
I will say I love how people are kind of
generally protesting Jeff Bezos. Like there are specific issues and specific groups that people are protesting about, like the environmental impact of this wedding, which is massive, Bezos's aerospace investments,
but also just like generally anti Bezos being in Italy
(01:00:49):
and anti Bezos in general, just like we don't like him.
I saw we were watching that video that was set
to the Bo Burnham.
Speaker 2 (01:00:55):
Song Jeffrey Jeffrey Bezos.
Speaker 1 (01:00:59):
And it was a video of them just unfurling like
a massive banner that said Bezos with a big red
X through it, just like we don't like him, Get
him out of here.
Speaker 3 (01:01:08):
Yeah, there's even like the names of their organizations that
you just read. It's funny how personal these protests are.
It's just like we don't like him personally, we don't
want him here, and I get it. I do hope
that once the honeymoon is over that some of this
energy gets channeled into like demanding change in the global
(01:01:32):
system that just keeps funneling an ever increasing proportion of
the world's wealth into the pockets of a handful of
oligarchs. Like I love seeing the signs and messaging
that focus on taxes for that reason, you know, like
the one that just says Bezos with a big X
over it. I get it. That is satisfying, But like
the ones that focus on taxes and climate feel like
(01:01:56):
they have the more enduring political message. Like forcing him to move his wedding to a more secure, yet still intensely opulent venue, it's a nice reminder that these people are not all powerful. But I think like a real victory for the people will be getting him to
(01:02:16):
pay his fair share of taxes.
Speaker 1 (01:02:18):
So you don't think the slogan Bezos colon, we just
don't like him, that's not compelling to you.
Speaker 3 (01:02:26):
It's a good start. I think it's a good start.
It gets the people going. But we can't stop there.
Speaker 1 (01:02:34):
Bezos colon, he rubs us the wrong way. So the guests at this wedding include people like Bill Gates, Oprah, of course Gayle, she's never gonna like turn down an event like this, climate activist Leonardo DiCaprio, which like, come on, Barbra Streisand, Eva Longoria, Robert Pattinson, and Orlando Bloom.
Speaker 2 (01:02:58):
Allow me a quick diversion on Orlando Bloom. So,
Orlando Bloom was.
Speaker 1 (01:03:02):
Very recently, up until recently, in a relationship with the
singer Katy Perry. They have a child together, and I
don't know who at the Daily Mail has it out
for Katy Perry so much, because this Jeffrey Bezos wedding
is really being used to highlight that her relationship with
Orlando Bloom is over, and that the reason why that
relationship is over is because of the fallout from her
(01:03:24):
panned spaceflight that she took with Jeffrey Bezos. It honestly
kind of sounds like Bezos is like ruining her life.
This is how Ola Magazine reported on it. Katy Perry's
recent journey to space may have only lasted eleven minutes,
but the aftermath has gone on for much longer in
her personal life. The pop star and her longtime fiancé
Orlando Bloom are reportedly facing a rough patch after an
(01:03:46):
explosive argument over her Blue Origin space flight.
Speaker 3 (01:03:50):
An explosive argument.
Speaker 1 (01:03:52):
So that spaceflight was poorly received, we'll say. And it
sounds like, according to these like gossip rags, maybe she
like ruined her marriage and now she's being publicly excluded
from like the rich asshole spectacle wedding of the year.
Like I know people that are down on Katy Perry
right now for whatever reason, but it genuinely sounds like
(01:04:13):
Jeff Bezos is ruining her life.
Speaker 3 (01:04:14):
Damn. So she wasn't even invited to the wedding. She
got to go to space but then it was like
you can't come to Venice.
Speaker 1 (01:04:21):
Yeah, I mean that would be another good deep dive
is that space flight.
Speaker 2 (01:04:24):
I did read an article where she was like, oh,
I wish that.
Speaker 1 (01:04:27):
I think our big mistake was letting the video footage
that we took from the space flight go public.
Speaker 2 (01:04:31):
And it's like, well, literally, what did you think they were taking video footage for? Like what, of course people were going to see it. Like what did you expect?
Speaker 3 (01:04:39):
Yeah, it wasn't like a scientific mission.
Speaker 1 (01:04:42):
So this wedding seems involved. Wired reports that exclusive private parties are planned at secret locations and on smaller islands of the lagoon, and it's an event that will leave its mark on Venice, including in terms of the environmental impact and the possible inconvenience it could create for the city's transit infrastructure. Guests will arrive on eighty private jets and travel aboard more than thirty already reserved water taxis, yachts
(01:05:03):
and gondolas. According to some official sources, flights from New York, Los Angeles, London, Paris and Dubai are planned, not to mention luxury yachts coming to Venice, with moorings already planned between different points.
Speaker 2 (01:05:15):
So there is an argument.
Speaker 1 (01:05:17):
That this wedding could potentially help Venice's local economy, conveniently enough. Can you guess where it has been reported that this wedding actually is a good thing because it's going to help support Venice's local economy?
Speaker 3 (01:05:31):
Uh. The Bezos Daily Newsletter?
Speaker 1 (01:05:36):
I mean essentially the Washington Post, which is owned by Bezos.
The Post declared that about eighty percent of the products
and services come from local Venetian suppliers. I don't know
if that's true, but it does kind of feel like
me saying there are no girls on the internet reports
that Bridget Todd is actually really nice to everybody all
the time and is super smart, you know what I mean,
Like convenient that the paper that you own says your
(01:05:58):
wedding that everyone hates is actually good.
Speaker 3 (01:06:01):
It's nice. That's a you know, they've just got their
own take.
Speaker 1 (01:06:05):
But in all the pictures of famous celebrities arriving at this wedding, I will say, like you can kind of see from the pictures that everybody is sort of like, oh, why am I here, like I don't know if I should be here. There's the picture of Tom Brady with his like hat really low, Oprah doesn't look too thrilled, like nobody looks thrilled to be going to this wedding.
And it does sort of make me happy that people
(01:06:28):
are really spotlighting the ways that you know, celebrity, even
celebrities that say the right thing, sometimes really just care
about other rich celebrities, like they don't really care about us.
They will go to Jeffrey Bezos's wedding, you know, they will probably be at a table with Ivanka Trump, who is definitely gonna be there. The Kardashians are going to
(01:06:50):
be there. Like, I don't know, I hope this is, if that picture of Ellen DeGeneres from the Oscars with all of these different celebrities, that like famous celebrity selfie, if that was the welcoming in of celebrity culture of the digital age, I hope the images of celebrities sort of shamefully attending this like spectacle, this like rich asshole
(01:07:12):
spectacle is the nail in the coffin of celebrity worship online, because I do think that like the tides are turning, because nobody wants to sit in their hot ass one bedroom apartment during a heat wave and watch Jeffrey Bezos yacht in and fly in every celebrity on the planet for his tacky ass wedding. Well, Mike, thank you
(01:07:33):
so much for being here. Be sure to follow us
around the web. I'm on Instagram at Bridget Marie DC.
I'm on TikTok at Bridget Marie and DC, and we're
trying to grow our YouTube presence. So if you like YouTube, check us out there at There Are No Girls on the Internet. Thanks so much for listening, and I will see you on the internet. If you're looking for ways
(01:08:00):
to support the show, check out our merch store at tangody dot com slash store. Got a story about an interesting thing in tech, or just want to say hi? You can reach us at Hello at tangody dot com.
You can also find transcripts for today's episode at tangody dot com. There Are No Girls on the Internet
was created by me, Bridget Todd. It's a production of iHeartRadio and Unbossed Creative, edited by Joey pat. Jonathan Strickland
(01:08:22):
is our executive producer. Tari Harrison is our producer and
sound engineer. Michael Almado is our contributing producer. I'm your host,
Bridget Todd. If you want to help us grow, rate
and review us.
Speaker 4 (01:08:31):
On Apple Podcasts.
Speaker 1 (01:08:33):
For more podcasts from iHeartRadio, check out the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts