Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet is a production
of iHeartRadio and Unbossed Creative. I'm Bridget Todd and this
is There Are No Girls on the Internet. Welcome to
There Are No Girls on the Internet, where we explore the
intersection of technology, social media, and identity. And this is
(00:24):
another installment of our weekly news roundup where we dig
into all the stories that you might have missed on
the Internet so you.
Speaker 2 (00:30):
Don't have to. It has been too long.
Speaker 1 (00:32):
I am so excited to welcome back today's guest co
host producer Joey Joey, thank you for being here.
Speaker 2 (00:38):
Hey, Bridget, thanks for having me on again. I have
to ask what is capturing your attention on the internet
right now? Anything? What do you got?
Speaker 3 (00:47):
That's a great question, Bridget. Actually, there's something I've really been wanting
to talk about, because this combines two of our favorite things.
See, I don't know what your personal relationship to
the Broadway musical slash book slash now two-part movie Wicked is,
but I have one, and I can see your face.
(01:09):
I have been obsessed. There's been so much that has
happened with that movie and their marketing campaign. I'm supposed to
see it tomorrow, so excited for that, but I've been
really obsessed with the amount of brand deals that they've
done recently.
Speaker 2 (01:25):
On TikTok.
Speaker 3 (01:26):
I got an ad for Crest, like, the toothpaste company,
that was Wicked-themed. Of course you did, of course,
that is right. And so I was curious, so I
did look this up, and it took me
a lot of different articles to go through to finally
find an actual number, because there are so many articles right now.
Speaker 2 (01:47):
That are like top ten, top fifty Wicked items to buy.
Speaker 3 (01:52):
According to the International Business Times UK, there are four hundred
and fifty brand deals for this one movie.
Speaker 2 (01:58):
This is record breaking. I was gonna say, that has
to be more than ever.
Speaker 3 (02:03):
Right, they're breaking the record from the first movie, which
was about four hundred.
Speaker 2 (02:08):
And it looks like before.
Speaker 3 (02:10):
That, like, the Spider-Man movies were the
most amount of money, which makes sense. But yeah, no,
I just, I was curious because, like, literally, again,
I was looking through these articles and there was,
like, dishwasher soap, and, like, there was a Swiffer collab.
Speaker 2 (02:29):
There were laundry pods.
Speaker 3 (02:30):
There was a mac and cheese one that would either
turn pink or green.
Speaker 1 (02:35):
I just like, oh, their products are so like weird
and random.
Speaker 3 (02:39):
Also, they're so weird. But, like, here's the thing,
I'm not gonna lie, again, I personally grew up with
this musical. Like, I'm a big Wicked fan. Me
and my sister are going to see it tomorrow. I have been
falling for some of these. There are a couple that
I'm like... I did buy the, like, Pillsbury cookies
that they had. I actually was going to get the
(03:01):
like, Halloween cookies they usually have, and they had sold out.
Speaker 2 (03:04):
This was for Halloween.
Speaker 3 (03:06):
They were out of them, but they had Wicked ones,
so I got those instead.
Speaker 2 (03:10):
They were fine.
Speaker 3 (03:11):
They tasted like the cookies usually do. But the one
that I keep seeing that I haven't gotten any ads for,
but I just keep seeing on TikTok is.
Speaker 2 (03:20):
The Chili's one.
Speaker 3 (03:22):
And this happened before I left for California, so I
was still in New York.
Speaker 2 (03:27):
I don't know if you know this, Bridget, there
is one.
Speaker 3 (03:29):
Chili's in all of New York, which, first of all,
I didn't know.
Speaker 2 (03:32):
Wait, where is it? It's not Queens, is it?
Speaker 3 (03:36):
So me and a couple of my friends
went and we got it. Like, it was a Saturday,
but we were like, we're gonna go early,
we're gonna go, like, five p.m. We get to
this one Chili's. Do you want to guess how long
the wait was?
Speaker 2 (03:52):
Oh my god, hell, I can't even.
Speaker 3 (03:54):
It was an hour and a half to get in.
Chili's. Just Chili's. Like, there was a family
in front of us in line that was also like,
what the hell? And I was like, yeah, we're
here to get the Wicked drinks. They were like, we
are too. So I was like, literally, this advertising
is working. And again, I haven't actually seen any Chili's
(04:16):
advertisements for Wicked specifically. I just keep seeing people making
TikToks about going to Chili's, like, with their drinks,
and, like, making little jokes about.
Speaker 2 (04:25):
The margaritas and whatever. I just wanted one of them.
Speaker 3 (04:28):
I didn't even... I don't even like margaritas. Like, I
just was like, I will go, I just
want the wand. There's this little Glinda wand you can
get. But yeah, I was like, I'm sorry, I'm
not waiting an hour and a half for Chili's. So
I have yet to get my Wicked drinks. Heartbroken. I can't.
Speaker 1 (04:46):
I'm so curious how this came to be, how this
Wicked Chili's collab came to be.
Speaker 3 (04:53):
That one at least kind of makes sense. There's so many aspects
of this. Like, I'm still like, how did the
Crest one happen?
Speaker 2 (04:59):
How did the Swiffer one happen?
Speaker 4 (05:01):
Like?
Speaker 2 (05:01):
What happened? First?
Speaker 3 (05:02):
Did these companies approach the Wicked marketing team, or did the
marketing team approach them?
Speaker 1 (05:06):
Like, what happened? I have a question for you about this.
So you self-identify as a theater kid.
Speaker 2 (05:13):
I do.
Speaker 4 (05:15):
I do.
Speaker 3 (05:15):
I'm a former theater kid. I haven't actually done theater
in a long time. I mean, once you're in,
you're in for life. If anybody wants to cast me in anything,
you know.
Speaker 1 (05:24):
Yeah, for your consideration, Joey, exactly. You are in LA.
Someone will discover you on the street. Okay. So
let me just make it very clear, all the caveats.
I love Ariana Grande, much respect. I love Cynthia Erivo,
she is my girl. Nobody has pipes like her. Nothing but
(05:45):
respect, a queen. I love Wicked. I saw it on Broadway.
I saw the first movie. Love it, no problems here.
I know it's a deep fan base,
so I.
Speaker 2 (05:56):
Also really like it.
Speaker 1 (05:58):
I do feel like when I watched the interviews between
Ariana and Cynthia, I feel like I'm missing something and
I don't understand what it is. And someone was like,
you just don't understand theater energy, and I feel like
I get theater energy.
Speaker 2 (06:15):
But what... am I way off base here?
Do you know what I'm talking about?
Speaker 4 (06:20):
Like?
Speaker 3 (06:20):
The thing is, you just have to go into it
knowing that they're both kind of insane. And no,
I agree with you, though. It is like something weird
has been happening this entire press tour. This was
happening, like, last year too, because
we're having the same round of discourse that.
Speaker 2 (06:37):
We had a year ago.
Speaker 3 (06:38):
So I'm like, I'm ready. Again, this is... you guys
can't see what I'm doing. The thing where
Cynthia was holding Ariana's finger, or the other way around,
I can't remember who was holding whose finger.
We'll say the, like, "I am in queer media"
interview moment. That was... if anybody follows me on Instagram,
my bio has been "I am in queer media" for
the past year.
Speaker 2 (06:58):
From that interview, I don't know. I really don't know.
Speaker 3 (07:01):
I wish I could get a glimpse into both of
their psyches. I'm the same way. I love Ariana Grande.
I've listened to her music for years now. I'm
not, like, a huge fan, like, she's not one of my top artists,
but, like, I'll always listen to her.
Speaker 2 (07:15):
She's fine.
Speaker 3 (07:17):
I really liked her new album that she put out,
that was really good. And same with Cynthia Erivo. Amazing voice,
you know. And I just was rewatching the
first movie before I left, and watching it, it was
sort of like, me and my friend were watching it,
we both kind of were talking about this too. And
my point was kind of like, you know what, here's
(07:38):
the thing. It is like the old Hollywood diva thing,
where I'm like, but they're so talented, like, they have
such amazing voices, that I'm kind of like, yeah, sure,
they can do whatever they want.
Speaker 2 (07:47):
Yeah, And I don't.
Speaker 3 (07:49):
Know. Cynthia Erivo, I just... the amount of, like, gifs
that we've gotten from.
Speaker 2 (07:55):
Her.
Speaker 3 (07:57):
So yeah, again, like, I think they both are just,
like, Broadway-type divas, and we haven't gotten
a lot of those who are this famous in a while,
so maybe that's, maybe that's what they meant by, like,
the theater kid thing.
Speaker 2 (08:11):
I don't know see it.
Speaker 3 (08:12):
I don't know, but it's fascinating to watch unravel in
front of everybody.
Speaker 1 (08:18):
I completely agree. I also think part of it is that...
so when I saw Wicked in the theater, I did
not know it was going to be a two-part
movie until... oh, really?
Speaker 2 (08:29):
Yeah, I was like, oh, that's a sequel. Okay.
Speaker 1 (08:32):
Also, when I was in the theater, I was like,
I swear I respect the craft of musical theater.
Speaker 2 (08:38):
I was in a.
Speaker 1 (08:39):
Showing where everybody around me was singing, and I was
the one person who was like, wouldn't it be cool
if we could hear them sing?
Speaker 3 (08:45):
Oh no, was it one where it was, like, specifically,
like, you're supposed to sing in this one?
Speaker 2 (08:50):
Or was it just a normal showing?
Speaker 1 (08:51):
So I think it was just a normal showing. It
was just a very... okay. And I was the one
grouch who was like, I.
Speaker 2 (08:59):
Agree with you. I agree with you.
Speaker 3 (09:00):
I think that's... look, listen, there
were plenty of screenings that were, like, sing-along, like,
come here. And yeah, like, again.
Speaker 2 (09:08):
I I went to one of.
Speaker 3 (09:10):
This is a whole other story, but when the movie
Cats came out, I went to one of the ones
where you were supposed to like yell at the screen.
Speaker 2 (09:17):
That was an experience. I don't know if I would.
Speaker 3 (09:20):
Repeat that, but you know, there are specific showings that
are meant for you to sing along, So why not just.
Speaker 2 (09:24):
Do it anywhere?
Speaker 3 (09:25):
I don't know, I agree with you. I hate it
when people talk over movies.
Speaker 2 (09:28):
Don't do that.
Speaker 1 (09:28):
I once almost came to blows at a screening of
The Menu because a woman was watching full-volume TikToks
Speaker 2 (09:35):
Right next to me, and I was like, we're gonna
have to physically fight in this theater. Oh my god, weird.
Speaker 3 (09:41):
You know what, Bridget, speaking of music, there's a song I've been seeing
on TikTok a lot lately that I think really embodies
this moment.
Speaker 2 (09:51):
Have you, just because this has been stuck in my
head all week?
Speaker 3 (09:55):
Have you seen the, like, We Are Charlie Kirk song
that's going around TikTok?
Speaker 4 (10:01):
I have not.
Speaker 1 (10:02):
You haven't? Okay. Because, I don't know, I haven't been
on TikTok, it's so.
Speaker 3 (10:07):
Here's the thing. I heard this song because people
were like, oh my god, this is insane.
I was like, this has to be AI. This sounds
like it's AI. I did just look it up. According
to Forbes, yes, it is AI. Bridget, I'm gonna send
you this song because, oh my gosh, this has been
stuck in my.
Speaker 2 (10:25):
Head all week. We are Charlie Kirk...
Speaker 4 (10:43):
Charlie Kirk.
Speaker 3 (10:48):
We are Charlie Kirk.
Speaker 2 (10:57):
Oh my god. Okay, that's enough. So.
Speaker 3 (11:00):
Oh my gosh, Yeah, that's all you need to listen to.
Speaker 2 (11:06):
That's the yeah.
Speaker 3 (11:08):
The "We are Charlie Kirk" chorus... I have heard
it way too many times over the past week. I
didn't need to hear it once. And yeah, now it's
been stuck in my head again. Apparently it is AI-generated.
I'm not surprised. It does kind of sound like it.
I've heard it compared to, like, Alex Warren, who
(11:33):
I've been told is a musician. I actually have not
heard any of his songs. I just know he's like
a Mormon music guy. Or, like, Imagine Dragons. I
want to do an investigative series at one point about,
like, the Mormon influence on the music industry.
Speaker 1 (11:47):
Oh yeah, Mormons are very hot right now. Yes,
it is a thing, you know what I'm talking about.
Speaker 3 (11:54):
Yeah, but even like looking back, Mormons are really hot
right now.
Speaker 2 (11:58):
I don't know what's going on with that.
Speaker 3 (11:59):
But even, like, look, I keep thinking, like,
Benson Boone kind of reminds me of Panic!
at the Disco, and I'm like, oh yeah, because they
were ex-Mormons. And, like, Imagine Dragons, also ex-Mormons.
There's too many, like, alt rock bands that are ex-Mormons.
Speaker 2 (12:15):
I didn't know Imagine Dragons are ex-Mormons. Imagine Dragons
are ex-Mormons.
Speaker 3 (12:18):
Yeah, apparently they left the Mormon Church because of their
stance on, like, gay people. So you know, I guess,
go Imagine Dragons, allies. Ally Imagine Dragons. Who fucking would
have thought? You know what, when I was like twelve,
(12:39):
Radioactive slapped. So I don't know, it still kind.
Speaker 2 (12:43):
Of slaps, actually. They, they.
Speaker 1 (12:45):
I know they're so cringey, but I would
be lying if I said I didn't have a
little bit of a soft spot for them, right.
Speaker 3 (12:52):
No, it's just, like, they're, like, a nostalgic band.
I think, yes, like, whenever I do hear
their music, I'm like, oh, like, this is kind of
fun though.
Speaker 4 (13:01):
Yeah.
Speaker 1 (13:01):
At the Democratic National Convention in, like, twenty-whatever,
the Clinton one, I straight up cried
while on drugs during a performance of the Black Eyed
Peas' Where.
Speaker 4 (13:14):
Is the Love?
Speaker 3 (13:15):
And I was like, Bridget, I am really on drugs.
Speaker 1 (13:20):
So I was just like, they're so right. Where
is the Love?
Speaker 2 (13:29):
Well, speaking of music.
Speaker 1 (13:31):
We should talk about what is going on with the
rapper Megan Thee Stallion, because she has been straight fighting
racialized and gendered mis- and disinformation online. She's been doing it for
kind of a while, since the criminal trial against rapper
Tory Lanez. That is when a lot of this stuff started.
This week, Megan Thee Stallion was in court trying to
(13:52):
fight AI-generated deepfakes of herself. So a little
bit of backstory for folks who don't remember. Canadian rapper
Tory Lanez got sentenced to ten years in prison after
he was found guilty of shooting rapper Megan Thee Stallion
in the foot back in July of twenty twenty while
they were all in a car leaving a party at
Kylie Jenner's house. So this trial, like, I followed it
(14:13):
pretty closely because it was one of those trials where
it just.
Speaker 2 (14:16):
Really was like a.
Speaker 1 (14:18):
Wave of gendered and racialized mis- and disinformation, like basically lies
to smear Meg. Honestly, it quite reminded me of the
Amber Heard Johnny Depp defamation trial. The thing that stood
out to me is how much like content creators online
and on social media blogs, YouTubers, podcasters, they played such
(14:41):
a role in spreading just flat out incorrect information about
this case in a way that really discredited Megan as
this like victim of violence, and it created this climate
where lies generated so much engagement that like, ultimately it
kind of didn't matter what the truth in quotes was
because there was so much smoke and noise and distortion,
(15:03):
which ultimately harms us all in the process. And so
that trial, if folks remember it, I feel like it
really calcified just flooding the zone as a tactic for
abusers to avoid accountability. Honestly, again, I really feel
like Johnny Depp just perfected this and now we're all
stuck with it. But all of these online content creators
(15:24):
were able to create a lot of smoke and distortion
and distrust and lies and confusion.
Speaker 3 (15:29):
You know.
Speaker 1 (15:30):
It was this cottage industry of like bloggers and like
so called body language experts and media personalities and TikTokers
who you know, we just live in this climate where
not everybody is going to recognize, oh, this random YouTuber
is not the same thing as NBC News or something.
Speaker 4 (15:47):
Right.
Speaker 3 (15:48):
The body language experts were my... I don't want
to say they were my favorite, because it was just one
of those things. I think I was a little
bit more present for the Johnny Depp Amber Heard trial, but
for that one specifically, I mean, like, and I know
that the Megan Thee Stallion one was kind of similar.
Speaker 2 (16:03):
Some of the stuff people were saying online was insane.
Speaker 3 (16:06):
Also, all of a sudden, it was like everybody had
a psychology degree, all of us.
Speaker 2 (16:10):
Everybody had a criminal psychology.
Speaker 3 (16:12):
Degree also. And this is me also kind of having
my armchair whatever moment. I took one forensics class in
high school because it was, like, the easiest science class
I could take. But I literally remember that they
told us, they were like, yeah, a lot of
this, like, body language, eye contact reading, whatever stuff is
(16:33):
fake, because there are so many other factors that can
come into it. Anyway.
Speaker 2 (16:37):
I don't know. It just was like one of
those things.
Speaker 3 (16:39):
It's not only just people not really knowing what they're
talking about and claiming to, because they're on TikTok and
they're loud about it, but also, like, the thing
they're claiming to know a lot about, is it really,
like, a science, or an exact science? It's kind
of just, like, an idea that sometimes works. Exactly.
Speaker 1 (16:58):
But when people are just fiending for any information that
scratches the itch of telling you this person who you
don't like is in the wrong, they'll watch it. And
so, like, anybody can claim to be a body
language expert and get on YouTube and say, because
so-and-so held their face like this, it means XYZ.
(17:19):
That kind of content, when you make it about somebody
who is the target of this kind of, like, mis- and
disinformation campaign, is.
Speaker 2 (17:25):
Always going to do numbers, right. It does like it
sort of like doesn't.
Speaker 1 (17:28):
Matter that it's complete bunk science, that it's,
like, made up, that anybody can say anything. It's essentially
like fan fiction that people project onto people, and there's
always going to be an audience for it.
Speaker 2 (17:40):
It really is.
Speaker 1 (17:41):
I mean, like, anybody can claim... we could start a
body language YouTube channel tomorrow and it would probably do
numbers, because people love it.
Speaker 3 (17:48):
It's funny that you said fan fiction specifically, because this
is making me realize that it feels very similar to right
now, with the other epic internet news story, or I guess
not an internet, just a politics news story of the week,
which is the alleged Bill Clinton Donald Trump affair, and
seeing some of the clips from that that people now
(18:09):
are posting, and being like, see, like, comparing clips from
the twenty sixteen presidential debates to, like, Challengers. Like, it
feels like it's that level of, like, investigation, where it's like,
this is a funny joke people are making in this situation,
but unfortunately this is a thing that people do use
and believe. Like, I've seen people do that with Joe Biden and Trump,
(18:32):
like, when you see them talking off camera, people, like,
try to be lip reading, here's what they're saying.
It's like a... it's a joke. Like, it's all stuff
that's fake. Anyways. Yes, Bridget, back to Megan Thee Stallion.
Speaker 1 (18:43):
Yeah, but it's so true, people
will and do project anything onto public figures,
and there's always going to be an audience for that.
And, like, during the whole Megan Thee Stallion Tory Lanez trial,
the real meat of it was that people online made
(19:05):
it sound like it was Meg against Tory, but in
reality it was Tory against the state of California.
Speaker 2 (19:10):
These were like state charges, and.
Speaker 1 (19:12):
So they were able to do this in a way
that put Meg, her reputation and her history and her
credibility, on trial, even though she was the one who
had been shot, right. And so it created this legacy
of all of these outright falsehoods about Meg, fueled by misogynoir
and just sexism and racism, including from Milagro Gramz, who
(19:35):
is a hip hop news personality on YouTube
who pushed this theory that Meg was never actually shot,
that she just stepped on glass, even though a surgeon
testified that they had found bullet fragments in her foot,
so she was definitely shot, no question about it.
Gramz even testified that she had been on a call
with Tory Lanez and his father, who wanted her to
(19:56):
essentially continue to use her YouTube channel and social media
presence to make up stories about Megan Thee Stallion
on social media. When NBC News talked to Gramz, she said, quote,
on my end, everything is not going to be something
that was intended to be a factual statement. It might
have a comedic effect. So when this happened, Meg vowed
(20:18):
to sue bloggers who continued to spread false information about
her during and after this trial. And now Meg is
taking this YouTuber, Milagro Gramz, to court.
Speaker 3 (20:27):
Good for her as she should.
Speaker 1 (20:29):
During the trial this week, when Meg took the stand,
per NBC, she accused Gramz, this YouTuber, of encouraging
her thousands of followers on X and Instagram to view
an unauthorized, sexually explicit deepfake video of Megan Thee
Stallion that had been circulating on social media. The testimony
(20:49):
was pretty emotional. Meg shared that, quote, I feel
like to this day, I feel a little.
Speaker 2 (20:55):
Defeated, often sobbing.
Speaker 1 (20:57):
When describing the AI-generated image depicting her likeness,
she said, quote, because no matter what, no matter if
the video was fake or not, Gramz wanted it to
be real. And so, you know, this person has basically
been making content about Megan Thee Stallion all through this trial,
(21:17):
and already the court has ordered this person to pay Megan
Thee Stallion's legal fees. Interestingly enough, the lawsuit does not
accuse Gramz of creating or posting the video, but suggests
that she quote willfully and maliciously promoted it to her followers,
pointing them to a post that had directly shared it.
And so, this is something that I find
(21:38):
interesting: whether or not a YouTuber can be
held accountable for spreading a fake video like this that
they themselves did not create. But I think it's, you know,
bigger than this one content creator. I think, like, Meg
is right for refusing to let this digital
machine, which we know is built on misogynoir,
(21:59):
sexism, degrading women, particularly women of color, letting this
machine be the thing that decides the truth. And I
think that she is really trying to show what happens,
and how easily conspiracy, profit, online engagement, all of that,
can spread quickly online and drown out the facts when
(22:20):
a black woman has been harmed. And so I'm curious
to see how this will shake out. But I
think that we have to, as a society, kind of
decide whether or not we're going to accept a world
where using AI to violate a black woman who was
already, like, publicly the victim of a shooting, like horrible abuse,
(22:42):
whether or not that's just okay fodder for content, right? Like,
I think that's really kind of what's at stake here.
Speaker 2 (22:48):
Yeah, absolutely agreed.
Speaker 3 (22:49):
I mean, Megan Thee Stallion is one of those celebrities that,
because she is so prominent in the public eye, a
lot of the discussion around women, and black women in particular,
like, she's gonna kind of end up becoming the face
of that, whether she wants to or not. I mean,
I don't think she wants to, because I can't see
why anybody would want to. Again, so much love for
(23:10):
her for, like, going through all of this and not
shutting up and not taking a step back and actually
fighting, like, for justice here.
Speaker 2 (23:21):
Nobody should have to go through that.
Speaker 3 (23:23):
I feel like every time I come on the show,
there's a story about, like, AI nude deepfakes or
porn deepfakes or something. And it is, like, something
I have to think about. It's something that
I think most, like, non-cis men have to think about.
I grew up with the rhetoric, you know, growing up,
of like, never take nude photos, never send nude photos,
(23:43):
it's gonna be really bad, they're gonna get leaked, they're
gonna be whatever. At this point, you don't need to
be the one doing it. Like, I think that's
bullshit to begin with, like, it's complicated, intimacy is
complicated, but also, like, we live
in an era where it's like, yeah, it doesn't matter
what you do, because somebody could easily leak, like,
a fake sex tape or fake nudes of you.
Speaker 2 (24:05):
So at this point, it's like, who cares anymore?
Speaker 3 (24:10):
But also, yeah, I mean, I think, like, good for
her, for bringing this suit, for
not just sort of letting it slide.
Speaker 2 (24:18):
I think.
Speaker 3 (24:19):
I mean, we've talked before about... I still stand
by, and maybe I'm a little bit psychic, I
did predict that Taylor Swift was going to be
somehow related to this whole AI news thing, at the top.
Speaker 2 (24:31):
Of twenty twenty four she did.
Speaker 3 (24:34):
She was. It picked up a lot. Bridget was there
in the episode. I have it dated in this Notes app,
I wrote, this is my twenty twenty four prediction, before it happened.
And then, yeah, like, I think Taylor Swift is another example
of, like, somebody who, regardless of how you feel about her,
the way we talk about her is very reflective of
like how we think about women, which has been discussed
(24:56):
on the show before. I think it is really,
like, it is really admirable to me and really worth
noting that, like, all of these famous women have actually
been kind of, like, taking these issues to court, because,
a, if it happens to them, it can happen to
any of us. Like, Megan Thee Stallion has way
(25:16):
more money and resources than I ever will have. But
also, it's like, if somebody's gonna have to be
the one to, like, stand
up and argue this in court, like, yeah, good for her.
Good for Megan Thee Stallion for, like, standing
her ground here.
Speaker 1 (25:28):
I completely agree, and it's one of the reasons I
always worry when I talk about this and people are thinking, oh,
this is like celebrity fluff. But to me it's not,
because if this is how they'll treat Megan Thee Stallion,
someone people love, someone with lots of resources, access, money,
da da da. If this is how they treat
Amber Heard, how will they treat a Bridget Todd?
(25:49):
How will they treat a Joey Pat? How will they
treat other people who do not have the resources and
the access? And so if someone with as many resources
as Meg or Amber Heard has will be treated
this way in court when they get their day in court.
Speaker 2 (26:07):
Does not bode well for any of us.
Speaker 1 (26:08):
And so I don't talk about it because it's a
celebrity fluff story. I talk about it because I think
it does illustrate something for the rest of us who
do not have the privilege and access of celebrities. And
if these celebrities could not escape this climate that is
so toxic and is so full of lies, none of
us will. And to your point about, you know, I
(26:32):
got the same guidance, don't take pictures of yourself.
Da da da da.
Speaker 1 (26:35):
When there was the iCloud photo hack in twenty fourteen,
so many prominent voices in tech echoed that. They said, well,
these starlets should have never taken those photos. Never mind
the fact that those photos were illegally hacked by a criminal,
and the hack of those photos was a sex crime.
We never really talked about it like that, but this
(26:56):
person committed a crime. Why are you blaming the victim for
taking those photos?
Speaker 2 (27:00):
I don't know. However, now, you know, ten years later, we
see how.
Speaker 1 (27:05):
Insufficient of a public conversation it was to just say, oh, well,
it's their fault for taking these photos. Because now you
don't have to take photos. Now anybody can just get
your yearbook photo and use AI to make sexualized deepfake
images of you. It doesn't matter if you're a
fourteen-year-old New Jersey student or Megan Thee Stallion.
And so that opportunity for a public conversation about
(27:26):
our tech climate when it comes to identity and gender,
we did not have the conversation that we needed to
have, because here we are ten years later and it's like, well,
it doesn't really matter if you choose to take
your clothes off or not, they're gonna use AI to undress.
Speaker 2 (27:38):
You anyway, exactly exactly.
Speaker 3 (27:40):
I mean, I do think the iCloud hacking moment,
I will say, I think that was like a radicalizing
feminist moment for, like, fifteen-year-old me. That
sort of, like, shattered something, because, yeah,
no, exactly. That was, like, the rhetoric in the news,
all like, oh my god, like, how dare they take
these pictures? And I was like, well, you're talking about
(28:02):
all these actresses that I look up to and whatever.
Speaker 4 (28:06):
I don't know.
Speaker 3 (28:06):
It is like we're living in a time where
people are trying to police sexuality, and particularly women's and,
like, non-cis-men's sexuality. And like.
Speaker 2 (28:16):
We do also, like... this is one aspect
of that. I do always love.
Speaker 3 (28:21):
To point to the Schitt's Creek episode where
Moira's, like, old nudes that she took got leaked, and
she's, like, she's talking to Alexis and
the... what's her name?
Speaker 2 (28:37):
Stevie?
Speaker 3 (28:37):
Is that the other girl in the show? And
she's like, take pictures now, while you're young and hot.
And I was like, that changed, that changed how I
thought about it. I was like, you're right, you're right.
Because, you know what, they're gonna get leaked no matter what.
Maybe we should all just be taking hot naked pictures
of ourselves. Like, I don't know. Maybe the solution to all
of this is we need to just, like, become more
chill with nudes being a thing.
Speaker 1 (29:01):
Moira, tech icon, ahead of her time. Catherine O'Hara,
if you want to come on the podcast, please.
Speaker 2 (29:08):
Oh my god, it would be, it would.
Speaker 3 (29:10):
Be anything else, anything anything.
Speaker 4 (29:19):
Let's take a quick break. And we're back.
Speaker 1 (29:37):
Well, speaking of AI, I am really riled up about
this potential AI executive order. So we talked a while
back about how Republicans tried, unsuccessfully, to
sneak a moratorium on any state-level AI
laws into their so-called Big Beautiful.
Speaker 3 (29:58):
Bill. Because we need more music like what I just
showed you, right? We... yeah, need more songs honoring Charlie Kirk's legacy.
Speaker 1 (30:04):
We can't curtail innovation. Okay, So the White House is
back on it. They are trying to make sure that
we get as many of these Charlie Kirk AI generated
songs as possible. They're preparing to issue an executive order
as soon as Friday that enlists the power of the
federal government to block states from regulating AI. This is
(30:25):
according to four people familiar with the matter and a
leaked draft order obtained by Politico. The draft document, confirmed
as authentic by three people familiar with the matter, would
launch several efforts to challenge state level AI laws, including
forming an AI Litigation task Force run by the Department
of Justice. Now, this is a real problem for a
(30:46):
lot of reasons, the first of which is that the
state level laws are a necessary step to get any
kind of federal legislation, which, let's be real, this administration
seems very keen to avoid.
Speaker 2 (30:57):
But if you have like a.
Speaker 1 (30:58):
Patchwork of state-level laws, eventually that kind
of thing, historically can lead to some sort of meaningful
federal legislation.
Speaker 2 (31:06):
And it would be one thing
Speaker 1 (31:08):
If the federal government was trying to prevent state laws
that address AI harms because they wanted like a more
comprehensive federal approach. Maybe, However, that is not what this is. Instead,
this is the Trump administration wanting to make sure that
no one, not states nor the federal government, is doing
anything that might possibly impact the profits of AI companies.
(31:28):
They're essentially all in on AI companies as the future
of the American economy, so they don't want them regulated
in any way by anyone.
Speaker 2 (31:37):
Here's how Politico describes it.
Speaker 1 (31:38):
The executive order would be a different approach than the
previous attempt because it is deploying the muscle of several
federal agencies to quash state AI regulations. Government lawyers would
be directed to challenge state laws on the grounds that
they unconstitutionally regulate interstate commerce, are preempted by existing federal regulations,
or otherwise at the Attorney General's discretion. So this task
(32:00):
force would consult with administration officials from the White House,
including Special Advisor for AI and Crypto, which is a
role that is currently occupied by investor, comma, hack,
All-In tech podcaster David Sacks. He would be in
charge of determining which state AI laws would be worth challenging,
according to this document.
Speaker 2 (32:21):
So if you're wondering what does all of this mean,
why do I care?
Speaker 1 (32:24):
Think about it like this right now, several states have
some kind of state level legislation on the books to
prevent the thing that we were just talking about with
the Megan Thee Stallion story, AI-generated deepfakes and their
spread: Pennsylvania, Washington, Michigan, Tennessee, California.
Speaker 2 (32:39):
This executive order would.
Speaker 1 (32:41):
Threaten those kinds of laws or legislation that tells job
seekers that they have a right to be notified when
AI is being used in the job application screening process,
or legislation that outright prevents it from being used in
the job application process, because we know AI is biased.
States like New York, Colorado, Illinois, California have laws
pertaining to that right now, or how just this week
(33:03):
Portland passed legislation banning AI rent price fixing, which, by
the way, just this week, North Carolina Attorney General Jeff
Jackson reached a seven million dollar settlement with North Carolina's
largest landlord because that company was using AI to illegally
coordinate rent prices. So we currently right now have all
kinds of state level AI legislation that dictates how institutions
(33:30):
can use AI in ways that are meant to help
protect citizens and consumers. So if you zoom out and
imagine what a landscape looks like if this executive order
goes through and is meaningful.
Speaker 2 (33:41):
You know, executive orders are not laws. But the Supreme
Court and Congress seem very keen to just let Trump
do whatever he wants. So we'll see.
Speaker 1 (33:47):
But instead of states kind of experimenting and responding to
the real harm happening in their communities and building in
some kind of guardrail against it that Congress has, thus far,
refused to build, the federal government is stepping in and
saying wait, no, stand down, wait for us, and wait
for what. Because there is no current federal AI law
(34:10):
on the table, it's just a vacuum. And in that vacuum,
the White House is basically stepping in not to protect
the citizens, Americans, consumers, but to protect corporations. And I
think especially we're probably going to see this be the
kind of thing where it is corporations who are willing
to play ball and let's keep it real, pay bribes
(34:31):
to the Trump administration that will be able to continue
to operate without this friction. It's just such a shocking, blatant.
Speaker 2 (34:39):
Disregard for what is good for your average American.
Speaker 3 (34:42):
You know, this is like going back to like the
celebrity culture element of it in our last story. I
do wonder because I don't know if Bridget you saw
the Kim Kardashian video this week of her crying because
she failed the California bar exam because she claimed it
was because of ChatGPT, because she was using ChatGPT,
(35:03):
can Kim Kardashian pull the fuck up and please
become, like... if she becomes our, like, pro-AI-regulation champion,
if she becomes the champion for that,
I will forgive her for everything else she has done
in her career.
Speaker 2 (35:25):
She could do it.
Speaker 3 (35:26):
She could do it.
Speaker 1 (35:27):
And I think it's I mean, this is a little
bit above my pay grade, but I have to imagine
it's related to you know, AI essentially propping up our
economy right now. And I think the government is very invested,
no pun intended, but appreciated in the idea that this
AI bubble is not going to burst. And so I
(35:47):
think being like, oh, what if we said that nobody
can do anything to curtail this technology, that might help
foster this idea that it's not going to burst?
Speaker 2 (35:56):
And yeah, just I.
Speaker 1 (35:58):
Hate how much of our democracy and our economy has
just turned into fucking like casino gambling and robbing Peter
to pay Paul shell games.
Speaker 2 (36:10):
Just like I think this.
Speaker 1 (36:11):
I think something about this executive order just really makes
it clear how much it is just setting things up
for bribes and scams, and the American people, we're the
ones who are losing ultimately, and it just pisses me
off that there are still people who will see this,
people who will never, like, they'll never see that White House ballroom, whatever, whatever,
(36:32):
who are like, yeah, it's good that AI companies can
illegally collude to raise my rent using AI. It's like,
in what way? You're a renter, babe.
Speaker 2 (36:42):
In what like? I don't see how your average.
Speaker 1 (36:44):
Everyday American can see this and be like, oh, yeah,
this is gonna be great for me. It's better
if they can use AI to raise my rent. I
love paying more in rent.
Speaker 2 (36:50):
Who doesn't I love paying more in rent?
Speaker 3 (36:53):
Personally, I want my life to be more expensive and
with a like shittier result.
Speaker 2 (37:00):
I don't know it is. It is really weird.
Speaker 3 (37:02):
Yeah, the whole conversation around AI is just like it's
gotten to the point where, again, yeah, we have like
Kim Kardashian of all people coming out here and being like,
oh my god, ChatGPT is the reason I failed
my law exam, whatever. I'm like, why am I out
here agreeing with Kim Kardashian? Like, you're getting people from
all, like, levels of society.
Speaker 2 (37:22):
You're getting people that work.
Speaker 3 (37:24):
White collar jobs talking about how they're frustrated with AI.
You're getting you know, people that are getting screwed over
across the board because their rent is going up because
of AI, or because you know, it's harder to just
google something and find an answer because of AI. Like
I don't know, it is it is. It is a
really strange time to to just be like, we all
(37:46):
acknowledge this as bad, we're just not doing anything about it.
Speaker 4 (37:50):
Hear, hear. Let's take a quick break. And we're back.
Speaker 1 (38:13):
So I do have to throw a little bit of
a trigger warning on top of this one because it
is a very tragic story. In Minnesota, a young welder
named Amberchech, who was only twenty was murdered by her
fellow employee at their equipment and Systems design and fabrication
factory work site. She was bludgeoned to death by her
male co worker, David DeLong, who was forty years old,
(38:37):
at the Advanced Process Technologies manufacturing facility in Minnesota, which
specializes in services to the food and dairy industry. So
it is an absolutely heartbreaking story. Authorities said that
this guy had been probably planning on murdering Amber for
a while, because he said that he quote just didn't
like her. County attorney Brian Lutz said that DeLong felt
(39:00):
that Amber had quote given him a bad look and
that he was upset about that. Now, mind you, Amber
was twenty years old and this man was forty, so
someone half his age. He was upset, according to him,
by just a quote bad look that somebody half his
age gave him. So he was reportedly charged with second
(39:24):
degree murder, but Wright County Attorney Brian Lutz told KARE,
the Minnesota local news affiliate, in a statement that he
plans to evaluate the case for a possible charge of
first-degree premeditated murder, which requires convening a grand jury,
which, again, I'm no lawyer, but for somebody who officials say
had been planning this murder for a while. I don't
(39:44):
understand why that wouldn't be first-degree murder. But
I'm no attorney, and so I wanted to talk about
this both as just a terrible tragedy, but also to
spotlight the roles of women in the trades like Amber.
You probably already know that women are very underrepresented in
trade work like welding and electrical work. According to the
(40:06):
Institute for Women's Policy Research, tradeswomen were only four point
three percent of all construction trade workers as of twenty
twenty four, and in certain trades it's even lower. Three
point two percent of plumbers and pipefitters and two point
nine percent of electricians are women. So unsurprisingly, women face
challenges in the trades like discrimination and a hostile workplace culture.
(40:31):
There is this viral post going around Reddit that I
saw claiming that Amber had actually made several complaints about
this person to HR, But I could not actually verify
that myself or find a source for that claim other
than the post itself. But honestly, it would not surprise me.
In a statement about Amber's death, the National Association of
Women in Construction president Rudi Brown said, we must confront
(40:54):
the truth that too many tradeswomen have endured hostility, intimidation, harassment,
and threats on job sites where warning signs were visible
but unaddressed. This tragedy is not an anomaly. It is
part of a disturbing pattern that we as an industry
can no longer deny and will no longer tolerate. And
so it is just something that I think, you know,
(41:14):
we often talk about how people shouldn't go to college
and should go into the trades, and I'm very supportive
of the trades. I have often thought about going back
and becoming an electrician. If I ever stopped podcasting about technology,
I would probably be an electrician.
Speaker 3 (41:28):
Yeah, this is just such a horrifying story on its own,
But yeah, I think, like to the point you were
making about if we're saying yes, more people should go
to trade school, or encouraging people to go to trade school,
like when you say people, women are half of people,
So it's like, if you want people to do that,
you need to make it safe for everybody.
Speaker 2 (41:50):
Exactly.
Speaker 1 (41:51):
Yes, And I don't even really know how to talk
about this because I don't want to speculate on stuff
I don't know. But you know, in looking at information
about this case, Amber was young, twenty and she had
like short hair and all these pictures, and part of
me wonders if that is part of what's going on
(42:14):
here that people aren't really reporting.
Speaker 2 (42:15):
On that one.
Speaker 1 (42:17):
I think that the fact that this guy was forty
and she was twenty, there was a huge age discrepancy there,
and so like I think that gen Z in the workplace,
I think that like that might be part of something
that made her vulnerable in this workplace that I that
like should have been addressed and she should have been
supported on.
Speaker 2 (42:37):
And then two part of me wonders if this guy.
Speaker 1 (42:40):
Perceived her, whether or not she was, as like
queer or LGBTQ in some way because she had quite
short hair. And so I can't imagine, dude, you know
what I'm saying, Like I don't want to speculate, but like.
Speaker 3 (42:52):
No, I agree, I mean I think even beyond like
not speculating, I'll say, as somebody who, like... I have pretty
short hair, like, Bridget, you can see me, my
hair is out, like, this is the longest it's been.
This is usually, like, I'm probably gonna get a haircut
in like
Speaker 2 (43:06):
A month like this is.
Speaker 3 (43:07):
But like I can tell that when my hair
is even just a little bit longer, where, like,
it's closer to like a bob, I get
treated so differently. Like when my hair is like as
short as I usually cut it, that is a very
different experience from when my hair is a little bit longer,
and maybe I'm dressed a little bit more feminine. It's
still uncomfortable. It's still sort of demeaning, the way
(43:27):
that I'm treated, but it's like safer.
Speaker 2 (43:30):
In a weird way. Like I know that I'm like,
I don't know.
Speaker 3 (43:33):
I've like had this conversation before about like if I'm
going into like a space that I don't know if
it's going to be super, like, queer friendly or whatever.
If I if my hair is longer, when that's happening,
I'm usually like, Okay, yeah, I can get away with it.
If my hair is a little shorter, I'll usually try to,
like, femme it up a little bit too, you know.
But yeah, no, I wouldn't be surprised if that also
was part of this.
Speaker 2 (43:54):
Uh yeah, no. To go back
Speaker 3 (43:56):
To the fact that he was forty and she was
twenty is crazy, and I mean I think beyond that,
I'm sure every.
Speaker 2 (44:06):
As somebody who is.
Speaker 3 (44:07):
An AFAB person who has worked in male dominated
fields like my whole life. I mean, it's, uh yeah,
you will be.
Speaker 2 (44:19):
You will have an.
Speaker 3 (44:21):
Uncomfortable experience with a man that's twice your age, and
unfortunately that's kind of just a given, right?
Speaker 2 (44:26):
It shouldn't be. It shouldn't be.
Speaker 3 (44:28):
I'm not saying that's okay, I'm not saying that should
be the norm, but like I don't know, having
worked in media, having previously worked in film, working in podcasting now,
like, I've had plenty of experiences of, like, men
twice my age that have been really weird and uncomfortable,
and you know, thank god, nothing really bad has happened
(44:48):
to me personally. But it's like, this is a possibility,
this is something that you have to think about.
Speaker 2 (44:52):
It's true, and it shouldn't be true.
Speaker 1 (44:55):
And in the case of Amber, the National Association of
Women in Construction and Building Trades unions are all calling
for like industry support and increase protections for women who
work in key construction site jobs. And you know, I
just think like it's just a reminder that with all
the talk of going into the trades, which like I
(45:16):
love the trades, I will never shit on the trades.
Speaker 2 (45:19):
However, the work, as you said, the work has to
be safe for everybody. The work has to be safe,
whether you're queer, whether you're a young woman, whether you're
a Gen Z. It has to be safe.
Speaker 1 (45:28):
You have to be able to show up and feel
like you're going to be protected. And right now
it doesn't sound like we're creating that kind of a
culture that is going to allow for everybody to show up.
And you know, if there are any trades women listening,
like it is tough, it's tough work.
Speaker 2 (45:44):
The people who do.
Speaker 1 (45:44):
This work, they work hard, and they are a workforce
that is very, very important. And too often, marginalized
people in that workforce especially are doing this critical work
without the protections and support that they need.
Speaker 2 (45:58):
And so I guess I'm heartened.
Speaker 1 (46:00):
As tragic as this is, I'm heartened to see these
unions and these organizations trying to at least use this
tragedy to get a little bit of justice and like,
use this, use Amber's legacy to really shine a spotlight
on underrepresented folks who are in the trades, because it's
important, and it genuinely is life or death.
Speaker 4 (46:24):
More after a quick break. Let's get right back into it.
Speaker 1 (46:42):
Well, switching gears a little bit. This is not the
kind of story that we usually talk about, but I
was just sort of very intrigued by it because it
has come to my attention that parents out here might
be getting scammed because I truly had no idea until
very recently how much money is involved in youth sports.
(47:03):
In researching this segment, I came across multiple people who
use this term the youth sports industrial complex, because if
your kid plays hockey, especially hockey, but other sports too, basketball, cheer,
really anything competitive, you are opening your fucking wallet because
this shit is very expensive.
Speaker 2 (47:20):
Oh yeah, did you know about this kind of business?
Speaker 3 (47:22):
So I was a competitive dancer growing up, and
that is a whole other world of, like, dance
industrial complex kind of thing. But I, uh, I'm sure.
And I never was a sports person. I never really
played competitive sports growing up. But seeing my friends that did,
or friends' siblings that did, uh, I'm not.
Speaker 2 (47:45):
Surprised by all this.
Speaker 3 (47:47):
And again, yeah, I was in the dance world, which
was very similar, as much as... although they always
would tell us, whenever it would be like, oh,
this is a sport, they would be like, no, we're artists.
This is an art. We're not... it's an art form.
Speaker 2 (48:02):
So I don't know.
Speaker 3 (48:03):
I guess like artists can scam you to I don't know.
Speaker 2 (48:06):
Yeah, yeah, it's.
Speaker 3 (48:07):
All a scam. We were special. We were artists. We weren't like,
we weren't jocks.
Speaker 2 (48:12):
That's so funny.
Speaker 1 (48:13):
I was gonna say, is competitive dance a sport?
But I guess that it's an artistry. Well,
according to this new report from the Lever, private equity
is taking over youth sports and as part of that takeover,
companies are even preventing parents from using their own cell
phones to record their own kids' games so that they
can squeeze even more money out of these families.
Speaker 3 (48:36):
This is you know what, This feels like a full
circle moment, because the, like, finance people that are
working for these private equity firms that are moving out
to those suburbs, that are having kids, that are
having their kids take part in these really competitive
sports things or whatever. Great, now
you can't record your kid playing soccer because of the
(48:56):
very thing that you, I don't know... you're using the
machine that you were also a part of. I'm generalizing here.
I'm sure not all of these people are part
of this. But I have an image in
my head of like the kind of couple that is
taking their kid to soccer practice, and this is happening,
and I feel like we're going in a loop here.
Speaker 2 (49:16):
Yes, the machine. Behold the machine.
Speaker 1 (49:20):
So as this forty-billion-dollar youth sports industry
is coming under private equity control, corporate-owned facilities and leagues,
from hockey rinks to cheer arenas, have started banning parents
from filming.
Speaker 2 (49:33):
Their own children. What are they supposed to do instead?
Speaker 1 (49:35):
Well, instead, parents are required to subscribe to exclusive, company
owned streaming services that can cost more per month than
actual streaming of actual professional sports. These private equity firms
also lock in exclusive contracts that block any competing video
services from being used. So when I first read this story,
(49:57):
I was like, these people have never met Melvin Todd Senior.
In our town, my dad was like known for coming
onto the field during our games. Like, I have a very
particular memory, from my brother's game, of him coming onto the
field because, like, I think my brother was being unsportsmanlike
or something. He was known in our town for
like coming onto the field during our games. So I
(50:18):
was like, oh, this would never fly with Melvin Todd Senior.
So I'm thinking, like, Okay, how are they gonna enforce this?
What are they really gonna do if parents are just like,
fuck you, I'm gonna record anyway.
Speaker 2 (50:27):
Well, according to the Lever, they.
Speaker 1 (50:29):
Will confiscate parents' phones if they record anyway. Parents can
be blacklisted from attending their kids games and in some
cases the kids might face penalties if their parent is
like fuck you, I'm gonna record on my phone anyway.
This actually happened to a sitting US senator, Senator Chris
Murphy of Connecticut. He said, quote, I was told this
(50:51):
past weekend that if I had live streamed my child's
hockey game, my kid's team would be penalized and lose
a place in the standings. Why because a private equity
company has bought up the rinks.
Speaker 2 (51:01):
Can you imagine? This is giving...
Speaker 3 (51:03):
Like, Black Mirror episode, which I know is like every
episode of this podcast, but like, for real, this is giving.
Speaker 2 (51:10):
Black Mirror episode. Also, like, I'm sorry, I just want
Speaker 3 (51:14):
To go back to the fact that, like, parents' phones
can get confiscated. Like, yeah, it feels like, are
we putting the parents in detention? Like, what's going on? I
feel like the state of the world right now is
we are all stuck in like a bad eighties movie,
and like the villains of the bad eighties movie are
(51:35):
all of the people in charge. It's giving like the
Breakfast Club, like the teacher from The Breakfast Club,
being like... like, that is who we're all under
right now, being like no, you can't have your phones out,
you can't be filming this, you can't be celebrating your kids.
We need to stream this on.
Speaker 2 (51:54):
Youth Soccer Plus or whatever. Yeah.
Speaker 1 (52:00):
Yes, this story about the youth sports really shows what's
at stake here. Like, do you want this private equity
company to ban you as a parent from taking a
video on your phone of your kid's youth sports? That
you already pay out the ass for them to be
able to do, not to mention you probably have to
drive them halfway across town.
Speaker 2 (52:22):
Don't even get me started.
Speaker 1 (52:23):
On top of that, do you want to have to
be forced to pay for black Bear Sports Group TV
package just to watch your kid play hockey?
Speaker 2 (52:32):
I don't think so so.
Speaker 1 (52:34):
Black Bear Sports Group is the largest private equity youth
hockey organization in the United States. And basically, when asked
by the lever about like this policy, why parents can't
just record their own kids, they said it was because
of quote significant safety risks like kids being filmed without
their consent. But when the Lever pressed them and was like, okay, well,
why would parents be punished for recording their own kids
(52:58):
if you're worried about the safety risk of kids being
filmed without their consent.
Speaker 2 (53:01):
Black Bear was like, uh, we don't really have an answer
to that. No, no answer. They didn't have anything to say.
Speaker 1 (53:06):
So the black Bear streaming package service costs between twenty
five dollars and fifty dollars a month, depending on the package,
and they're also on top of that adding new fees
like a fifty dollars registration and insurance charge, on top
of the normal fees that parents already pay for USA
Hockey membership and the already sky high cost of the equipment.
(53:28):
It's the kind of price creep that makes hockey, which
is already a very expensive and exclusionary sport, even less
accessible for regular families who might want to participate.
Speaker 3 (53:39):
I have a question about so if they're worried about
kids getting quote unquote filmed without their consent or whatever.
But also, this is a streaming service you can pay for.
What's stopping, like, randos from watching them? Like, I almost
feel like having a parent film your game and then
(54:01):
post it on their Instagram, their private Instagram with like ninety followers.
That seems more private to me than just having it
out there and anybody who has fifty bucks can watch it.
Speaker 1 (54:14):
Oh, you think that sounds more private than black Bear TV,
the streaming service that Black Bear recently launched.
Speaker 2 (54:22):
Just my opinion, I don't know, it's really out there.
So basically, through Black Bear TV.
Speaker 1 (54:29):
Black Bear, this company that runs most of the youth
hockey in the United States, is going to be the exclusive
streaming service for every league game. So basically they're like
literally monetizing children as content while also extracting more money
from the parents who just want to watch their kids
play the sports that they're paying tons and tons of
(54:49):
money to sign their kid up for. And it is
not just hockey. Varsity Brands, which is the dominant company
in competitive cheer, faced a massive antitrust lawsuit accusing it
of locking out competitors and forcing teams to buy only Varsity uniforms, equipment,
and tournament access. The company, which no surprise to anybody,
was also tangled up in a high profile sex abuse scandal,
(55:12):
ended up settling a lawsuit for eighty two million dollars
in twenty twenty four. Similar to Black Bear, the
hockey company, Varsity also owns Varsity TV, the exclusive streaming
platform for cheerleading competitions. And if folks watched that docuseries
on Netflix, Cheer: they shut the Netflix crew out because
the venue was theirs and they were like, no, we
(55:34):
have the exclusive filming rights here. So it's just this
grotesque situation where these private equity companies are just finding
every little coin, every little dollar they can extract from
these families who just want to have their kid play
basketball or cheer or do hockey.
Speaker 2 (55:52):
Yeah, this is insane. Also, it's kids' sports.
Speaker 3 (55:59):
I am thinking back now too because I do remember so,
Like again, I was a competitive dancer. I do remember,
so they wouldn't let parents film when we would
have our big, like, end-of-the-year showcase, where
we would have, like, every recital. They wouldn't let
parents film there, but they would, like, sell DVDs that
were, like, professionally shot, which, like... that was one
(56:20):
thing though, it was one thing every year, and it
was like these dance studios selling these DVDs and the
money was going back to them. It wasn't like some
big private equity company.
Speaker 2 (56:34):
That is weird.
Speaker 3 (56:35):
This is a weird, bleak, dark story.
Speaker 1 (56:37):
It is, and I think anything involving kids... because it's
not just sports, it's youth sports, right? Like, yeah,
when something is involving your kids, it's emotional. It's a highly
monetizable environment where parents probably feel pressure to pay anything
to support their kids.
Speaker 2 (56:54):
You know, you've.
Speaker 1 (56:55):
Got minors being used as content for corporations' revenue streams.
It's just it's not just that these companies are squeezing
these families financially, which they are. They are also reshaping
youth activities and youth sports into these paywalled digital products
and content, one penalty and subscription at a time. And
(57:16):
I just think, like, we have to have spaces that
are just for joy.
Speaker 2 (57:21):
And youth and community and childhood.
Speaker 1 (57:23):
If those spaces just get turned into another place of
copyright policing and predatory monetization and extraction, then what do
we have, Like what is all of this for If
you can't even get a grainy video of your kid
doing something cool in hockey and post it on your
Facebook for your mother in law to.
Speaker 2 (57:41):
Be like, oh, they look so great, what is this for?
Speaker 3 (57:43):
Right?
Speaker 1 (57:43):
Like, not every space needs to be extracted for every
single resource in this way, especially when it involves kids.
Speaker 2 (57:52):
It's just it's you're so right, It's just like a
very bleak story how did.
Speaker 3 (57:56):
We get from like, again, I'm not a sports person,
so this is not... I'm not going to give
the most detailed description of this, but I remember
hearing the debate about like should college athletes be paid
because of the fact that we're broadcasting these games,
they are, like,
Speaker 2 (58:12):
They're big money makers.
Speaker 3 (58:13):
People are coming in to pay and see these games
or whatever. How do we get from that conversation of like,
should these like again, also still at this point usually
like sometimes teenagers, sometimes twenty twenty one year olds still
pretty young?
Speaker 2 (58:25):
Should they be paid for this?
Speaker 3 (58:26):
Like labor they're putting in that can be very dangerous,
by the way, how do we get from that to
your five year old playing soccer that is streamable content?
Speaker 2 (58:40):
Yes?
Speaker 3 (58:42):
Okay, sure. Should we... like, if you're gonna do that,
then start paying the kids that you're making into content. I
don't know, it's... so yeah, this is such a...
I can't. The only thing I can
keep going back to was just being like, this is
such a bleak outlook. Like, so, what is happening
to our world?
Speaker 2 (59:02):
Very bleak stuff?
Speaker 1 (59:04):
Okay, Well, the company that runs the youth hockey in
the States is Black Bear. And speaking of bears, hey,
don't buy this AI bondage teddy bear for your kid
for the holidays, or actually, I don't know, maybe do
depending on your kid. I don't know what your kid's into.
If you live in DC like I do, and you,
like are the kind of person who celebrates the holidays
(59:25):
at our local leather and kink bar, the District Eagle, maybe
this is a good choice for you.
Speaker 3 (59:31):
I was gonna say when I saw the headline and
that or that the outline you sent me, I saw
the words AI fetish bear, and I thought this was
going in a very different direction.
Speaker 2 (59:42):
I wish.
Speaker 1 (59:45):
Okay, So here's what's going on. Researchers at the Public
Interest Research Group found that an AI powered teddy bear
from the children's toymaker FoloToy was giving kids instructions
on how to do things like light matches, engage in
knife play, and even information about sexual fetishes.
Speaker 3 (01:00:07):
Wait, wait, wait. Engage in knife play separate from the
sexual fetishes, or related to the fetish? Really? Okay. Yeah,
it was like.
Speaker 1 (01:00:15):
First of all, nice to meet you, Timmy, go grab
a knife.
Speaker 2 (01:00:20):
Second of all, what are your kinks?
Speaker 3 (01:00:22):
All?
Speaker 2 (01:00:23):
Right? Back in my day.
Speaker 3 (01:00:25):
Back in my day, you would learn all
this from being on Tumblr at too young of an
age, or on Omegle or whatever.
Speaker 2 (01:00:33):
Uh but uh whoa just wow? All right? Sure?
Speaker 1 (01:00:40):
Yeah, so this awesome bear was using OpenAI's ChatGPT-4
model. But after the Public Interest Research Group
tested three different AI toys, they found that these toys
can quickly veer into pretty not-safe-for-kids territory, giving
advice on sex positions, fetishes, where to find knives
(01:01:00):
in the kitchen, and how to start fires with matches. OpenAI
then pulled the plug on FoloToy, suspending the
company's access to its AI models. An OpenAI spokesperson told
the research group, I can confirm that we've suspended this
developer for violating our policies.
Speaker 2 (01:01:15):
I'm sorry, this toy, is this M3GAN?
Speaker 4 (01:01:18):
Is this?
Speaker 2 (01:01:19):
Oh my god?
Speaker 1 (01:01:19):
Ah, related. I watched M3GAN 2.0 on the plane. So
I saw M3GAN one in the theater and I fucking
loved it. I finally saw M3GAN 2.0 on the plane
to Barcelona. Have you seen it?
Speaker 2 (01:01:28):
I actually haven't seen either of them. Oh my god,
I need to watch them. I've seen many of the memes.
Speaker 3 (01:01:35):
I've seen many, many, many, like, memes and Halloween costumes, so.
Speaker 1 (01:01:39):
My gosh, M3GAN one I absolutely loved. M3GAN 2.0, I
will say, I liked it. I kept saying, so Mike, producer
Mike, was sitting next to me on the plane to Barcelona,
and I kept being like, this is a
movie for Tangoti listeners.
Speaker 2 (01:01:54):
And that's okay. Sometimes you.
Speaker 4 (01:01:59):
Should say, I.
Speaker 2 (01:01:59):
Would be very curious for your thoughts once you've watched it.
Speaker 3 (01:02:01):
I'll watch it sometime soon and let you know my thoughts,
for sure, because I am curious also how this is,
because you know, I love, like, Ex Machina and like.
Speaker 2 (01:02:12):
I love that genre of movie, so I'm very curious. Also,
Allison Williams, so.
Speaker 1 (01:02:19):
You know, yeah, she's really good. I think this is her
best movie since Get Out.
Speaker 3 (01:02:25):
Oh, absolutely. I actually, so the first time I saw
her in something was Get Out. So going back and
watching, like, my experience watching Girls was having
seen Allison Williams in Get Out, and Adam Driver in Star Wars.
Actually, I think those were the main two. But, like,
watching it, it was like weird to be like, no,
I know you for the role that, like, because of
this show you got cast in, and I can see why.
(01:02:49):
But this isn't right. I feel like I should. Yeah.
Speaker 1 (01:02:52):
Yeah, that's so funny to be like, I know, I don't
know you from Girls.
Speaker 2 (01:02:56):
I know, I know you from Star Wars.
Speaker 3 (01:02:58):
Yeah, come on, yeah, yeah. Anyways, back to this bear,
teddy bears teaching kids about knife play. Yeah, knife play
bear, actually voiced by Adam Driver.
Speaker 2 (01:03:14):
Oh my god.
Speaker 1 (01:03:14):
No, Now that's a toy that would be the hottest
toy of the holiday season.
Speaker 2 (01:03:19):
Yeah. I don't know if kids would be buying that,
but I would buy it. I would need that.
Speaker 1 (01:03:25):
So OpenAI was like, we can't have these bears
telling kids to, like, do fetishes and play with knives.
So they were like, we kicked this company off of
our platform. But it seems like that move might put
more pressure on OpenAI to police how their products
are used, especially as it is trying to partner with
(01:03:45):
other kids' toy companies like Mattel, which it started a
partnership with, I think, earlier this year. But here's really
the kicker. Other ChatGPT-powered toys are still very
much available on Amazon. So if you're looking for a
memorable and potentially horrifying holiday gift, well, Amazon has you covered.
Speaker 2 (01:04:05):
You know what.
Speaker 3 (01:04:06):
Actually, I was just talking to, all right, our friend
who was trying to get me to watch the movie Chucky,
and I was like, I was terrified of Chucky when
I was a kid. I refuse. It's like, I,
because I'm somebody who was scared of horror
movies when I was a kid, and now I'm making
my way through all of them. But I'm like, that's
one of the ones that I absolutely refuse. My cousins
were obsessed with it and they used to, like, terrorize
(01:04:26):
me with it, and I was like, it's
so scary. I was like, this, this, this sounds like
my worst nightmare when I was, like, eight years old.
Speaker 2 (01:04:34):
Yeah, just the weird grossness of it all.
Speaker 3 (01:04:37):
I'm like, I child me would be absolutely terrified by
any of this.
Speaker 2 (01:04:43):
This is nightmare fuel. Like, come on. I also
had a thing with Chucky.
Speaker 1 (01:04:49):
My brother was obsessed with Chucky, and I was terrified
of Chucky.
Speaker 2 (01:04:54):
I was really good, yeah, and I.
Speaker 1 (01:04:57):
Like horror movies and I could not handle Chucky. Really
anything where it's like you might be too young. Did
you ever watch the puppet Master movies, anything where it's
like little dolls or little toys come to life.
Speaker 2 (01:05:09):
I don't like that. That's too scary. No, yeah, I was.
Speaker 3 (01:05:12):
I did recently. This might have been like a year ago,
actually not recently.
Speaker 2 (01:05:16):
Whatever.
Speaker 3 (01:05:17):
A year ago, I was making my way through all
the Conjuring movies and I skipped Annabelle because I was like,
I can't. Anyway, anytime it's like a doll.
Speaker 2 (01:05:28):
No, I just know just I don't want to do.
I don't want to think about that. That's too much
for me. I don't know what it is about having.
Speaker 3 (01:05:35):
dolls come to life that is too scary for me.
Speaker 2 (01:05:38):
It creeps me out.
Speaker 4 (01:05:39):
Uh huh.
Speaker 1 (01:05:41):
Well, if you're looking for a holiday gift for Joey
or myself, maybe pick something else. We don't like the
dolls that could potentially come to life. I'm okay with
the fetish talk for adults, not necessarily in a bear
for kids, but let's just steer clear of the creepy,
fucked up doll genre altogether.
Speaker 3 (01:06:03):
I'm trying to think of, like, a way there
could be, like, a positive, like, any way I can
think of, like, a fetish AI thing for adults, that
also doesn't end well, though, too. So maybe we should
just keep AI out of this, you know? Yeah, crazy,
crazy concept. But have you ever seen any horror movie?
Speaker 1 (01:06:22):
I think the takeaway is, let's just keep AI out
of this. Have you ever seen Terminator?
Speaker 2 (01:06:28):
Oh my god?
Speaker 3 (01:06:29):
Literally, 2001: A Space Odyssey. I was about to
say Space Oddity and then I was like, that's the
song, not to be confused. Yeah, yeah. He was,
he was trying to argue with me, because my dad
uses ChatGPT, and I was like, I remember being,
like, sixteen, you're making me watch this movie, and me
being like, this is really long.
Speaker 2 (01:06:50):
I don't like it.
Speaker 3 (01:06:51):
But I'm like, did you not take away the message
from the movie that you made me watch?
Speaker 2 (01:06:58):
I don't know.
Speaker 1 (01:06:59):
Sounds like somebody needs to go back and rewatch it
and get a crash course in what HAL is really
capable of.
Speaker 2 (01:07:05):
Maybe HAL and M3GAN can do a team-up, you know.
Oh, now that I will watch.
Speaker 3 (01:07:09):
Here we go. See, I'm in, I'm in LA for
one day and I'm like, next blockbuster: HAL versus M3GAN.
Speaker 2 (01:07:16):
Yes, for your consideration.
Speaker 1 (01:07:19):
Joey, Thank you, Joey, thank you so much for running
down these stories for me. Where can people keep an
eye on you when you become Hollywood's next big thing?
Speaker 2 (01:07:34):
Where can folks keep an eye on you?
Speaker 3 (01:07:36):
Yes, well, of course, look for my future film HAL
Versus M3GAN, coming to theaters near you, or,
actually, to Joey Plus, my streaming service, my streaming stories.
You have to give me fifty dollars a month to
watch these amazing movies that I'm definitely making.
Speaker 2 (01:07:59):
That's pretty good, I think. Yeah, I might be
onto something here, let's see.
Speaker 3 (01:08:05):
But you can follow me on Instagram at patt not pratt.
That is p a t t, n o t, p r a t t.
That's also my Twitter handle and my Bluesky and
various other platforms. But I really only use Instagram currently.
And yeah, you can also listen to some of my
work on the show Stuff Mom Never Told You. I
(01:08:26):
recently did an episode talking about the Mamdani win
in New York City. Bridget, I think you get a
mention in that, for being the first, I got a
bunch of different texts
Speaker 2 (01:08:36):
from people, about what we were talking about. However, it
was sort of like people
Speaker 3 (01:08:39):
outside of New York were celebrating this year, and I
was like, the first text I got was just Bridget in
all caps sending me MAMDANI. I was like, yeah, he does.
Speaker 2 (01:08:52):
Be and excited.
Speaker 3 (01:08:54):
It's okay, me too. I watched that election come in
at a, at a gay bar near me, and that
was a beautiful moment. That was a really, really great moment.
But if you want to hear about that, yeah, listen
to that Stuff Mom Never Told You episode. I also work
on the show Outlaws with T.S. Madison and the show Afterlives,
(01:09:15):
which is out wherever you get your podcasts, a Signal
Award winner, and speaking of Signal Awards, also the show
Black Fat Femme, which is also on the Outspoken Network.
Speaker 1 (01:09:25):
So yeah, well, thank you for being here, Joey, and
thanks to all of you for listening. I will see
you on the Internet. Got a story about an interesting
thing in tech, or just want to say hi? You
can reach us at hello at tangoti dot com. You
can also find transcripts for today's episode at tangoti dot com.
There Are No Girls on the Internet was created by
(01:09:46):
me, Bridget Todd. It's a production of iHeartRadio and Unbossed
Creative. Jonathan Strickland is our executive producer. Tari Harrison is
our producer and sound engineer. Michael Almada is our contributing producer.
Edited by Joey Patt. I'm your host, Bridget Todd. If you
want to help us grow, rate and review
Speaker 4 (01:10:02):
Us on Apple Podcasts.
Speaker 1 (01:10:04):
For more podcasts from iHeartRadio, check out the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.