Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet is a production
of iHeartRadio and Unbossed Creative. I'm Bridget Todd, and this
is There Are No Girls on the Internet. You're listening
to There Are No Girls on the Internet, where we explore
the intersection of technology, social media, and identity.
Speaker 2 (00:23):
And this is another installment.
Speaker 1 (00:25):
Of our weekly news Roundup, where we round up all
the stories online that you might have missed so you
don't have to. Producer Mike, I think we have to
gear up for a Trump tantrum incoming because Trump has
been saying for the longest time that he feels that
he deserves the Nobel Peace Prize. I guess the people
who decide who deserves a Nobel Peace Prize and who
(00:48):
doesn't disagreed because he didn't win.
Speaker 3 (00:51):
He did not win, and also he's like the only
person on earth who is aggressively campaigning to win the
Nobel Peace Prize, like it's so shameless.
Speaker 1 (01:01):
We were talking about this before we started recording, but
there was a time where most people did not care
nor even know who the Nobel Peace Prize winners were.
I looked up the list this morning of the
people who had won it for the last thirty years.
Maybe I'm uninformed, but I only knew a handful
of people on that list.
(01:23):
One of them, notably Barack Obama, did win it. I
know who Barack Obama is, so anybody who tells you
otherwise is misinformed. However, you have these people who were
acting like they were very invested in who wins the
Nobel Peace Prize all along.
Speaker 2 (01:36):
No you weren't. You just want to be a Trump
sycophant. No you weren't.
Speaker 3 (01:39):
Yeah, when we looked at that list, it really underscored
to me how American centric our news bubble is. Like
I like to think of myself as a pretty informed guy,
but I didn't know like most of those people, and
Donald Trump certainly did not.
Speaker 2 (01:53):
No, of course not.
Speaker 1 (01:54):
So, the Nobel Peace Prize's actual recipient is Venezuelan opposition
leader Maria Corina Machado. It sounds like a very deserved win.
I did love this post on Blue Sky from Kieran Healy:
White House staff should just make and show Trump an
AI generated video where he wins. Fly him to Duluth
and tell him it's Oslo and dare the New York
Times to say anything more than experts disagree about whether
(02:18):
or not mister Trump was awarded the Nobel Prize.
Speaker 3 (02:21):
It's such a good solution because I know he would
fall for it, and I know the New York Times
would go along with it, like they wouldn't even need
to be dared. That would just be their default thing
that they would write.
Speaker 2 (02:31):
No, and absolutely one hundred percent for sure.
Speaker 1 (02:33):
It would turn into a conversation where who's to say
what the truth is? You know, this underscores this fractured
reality that we're living in where some people say this,
some people say that, and there's no expectation of finding
the actual truth anymore.
Speaker 2 (02:48):
I genuinely feel that.
Speaker 3 (02:50):
No, and there would definitely be a New York Times
opinion piece about how like the truth is somewhere in
the middle. But I do love this so much. So
as we're recording this, the news was just announced a
few hours ago, so I don't believe Trump has officially
reacted yet or posted on his social media slash crypto platform.
I can only assume he is waking up and throwing
(03:13):
things around in the Oval office in a rage, which
is a wonderful image. But I just love this high
level trolling from the Nobel Committee because after all of
Trump's bluster and like demands that he win, they chose
to highlight this democracy activist of a country that Trump
(03:35):
is like murdering their citizens and threatening and perhaps actively
gearing up for war with. But like, it's really messed
up what the Trump administration is doing with Venezuela, and
it's really scary. And I love so much that the
(03:56):
Nobel Committee has lifted up this democracy activist instead of
giving in to Trump.
Speaker 1 (04:06):
And speaking of this idea that if the White House
just told Trump that he won, you wouldn't really have
people disputing that. Probably in legacy media they would be saying, oh,
some people say he did, some people say he didn't.
That reminds me so much of this new OpenAI
platform Sora, their tool that lets anybody make hyper
realistic AI videos.
Speaker 2 (04:27):
From text prompts.
Speaker 1 (04:28):
It's kind of like Google's platform Veo 3, which we've
talked about extensively on the podcast. Have you seen any
of these Sora videos on social media or have
you played with it yourself yet?
Speaker 3 (04:39):
I have not played with it myself. I'm actually curious
to, because we make this show, but I believe
it's still invite only, so they're using that exclusivity to gatekeep.
And so I've seen a few videos but have not
directly used it myself yet.
Speaker 1 (04:58):
So I have also not used it myself, and generally
I try to not give my opinion on consumer tech
that I have not personally played with. I've seen a
couple of videos floating around. I will say this, some people,
for whatever reason, do seem to like it. They seem
to be enjoying it. I've seen people creating videos using
Sora and.
Speaker 2 (05:15):
Then posting it on social media.
Speaker 1 (05:16):
And something that makes Sora unique is that it has
a feature where people scan in their face or their friends'
faces and then put them in these AI.
Speaker 2 (05:24):
Generated scenarios, which we'll talk more about in a second.
Speaker 1 (05:26):
So I have a few friends, people that I trust,
who have posted Sora AI content and they were like, Oh,
it's so fun.
Speaker 2 (05:35):
I'm having so much fun on this platform. So I
don't know.
Speaker 1 (05:38):
I have not played with it myself, and maybe I
will eat my words, but I have seen people say
this platform is so fun and it's such a joyful
experience that it's going to rock social media.
Speaker 2 (05:50):
And I just don't see it. I really just don't
see it again. Maybe I'll eat my words.
Speaker 1 (05:56):
I'm happy to take this back if it turns out
that I'm incorrect. But it really reminds me of a
few years ago when a lot of folks were posting
those AI generated futuristic looking selfies on Facebook. I
think the reason why people who are using it are
saying it's so fun is that they're just excited
to see themselves and their friends in AI generated scenarios.
Speaker 2 (06:19):
Do you know what I mean?
Speaker 1 (06:20):
I don't think that that necessarily means that the kind
of content that is coming out of these platforms is interesting, unique,
the kind of thing that's actually going to hold someone's attention.
I just think people are excited because, hey, that's me
and my friend in a scenario. It almost reminds me
of you might remember.
Speaker 2 (06:36):
This. Do you remember
Speaker 3 (06:37):
Yik Yak? I do, from like the early Internet, like
we're talking early two thousands, right? Yeah. So for
Speaker 1 (06:45):
folks who don't remember Yik Yak, this was an app from
I want to say twenty thirteen.
Speaker 2 (06:50):
Oh apparently it just relaunched in twenty twenty one, so
maybe we can get back on it.
Speaker 1 (06:54):
But you would take a picture of your face and
you could put it in these animated cards. So I
remember making one for Christmas where it was my face
and my best friend Kristen's face in like a
dancing animated elf card. So it looks like our
faces were on the bodies of elves and we were dancing,
and we sent it to all of our friends and
we thought it was just hilarious. That is genuinely what
(07:16):
this reminds me of. I think people are just excited
to see their own faces and the faces of their
friends in these AI generated scenarios. When people post them
on social media, in the handful of posts
that I've seen, they're saying, oh, this is
so incredible.
Speaker 2 (07:31):
Can you believe this? Can you believe this?
Speaker 1 (07:33):
And it just reminds me of when people post super
hyper specific to them content on social media, and of
course they like it because it's about them.
Speaker 2 (07:41):
But will other people like it? I'm not so sure.
Speaker 3 (07:44):
Yeah, I think you're absolutely right that it probably doesn't
have a lot of staying power because, for one thing,
it's just like the novelty of like, oh look, I
can put myself in an AI generated video. It reminds
me of something that one of my friends, who is
a visual artist, told me about self portraits one time,
how he considers them to be like one of the
(08:06):
laziest forms of art. You can probably guess which friend
this is, because it's just a subject that
is always there.
Speaker 2 (08:15):
If you don't have any other.
Speaker 3 (08:17):
Ideas, you can do it. It's just like yourself. And
I feel that that's a pretty good analogy for this
kind of tech that is just like, hey, look it's
a picture of you.
Speaker 2 (08:29):
This is like a mirror, but there's some.
Speaker 3 (08:32):
Cool AI wizardry behind it.
Speaker 2 (08:35):
Well, don't tell that to Frida Kahlo, but sure
that's true. Yeah, it wasn't my opinion, it was his.
I know exactly the person that you're talking about. Also,
I know that's true.
Speaker 3 (08:47):
Next time I see him, I'm going to ask
him about Frida.
Speaker 2 (08:49):
Anyway, we digress.
Speaker 1 (08:51):
So I've actually seen tech tools that both remove and
add a Sora AI watermark to any video, whether or
not it's AI and whether or not it's made with
Sora. Because right now, when you make these Sora AI videos,
there's a watermark that kind of moves around that's kind
of meant to be a way for anybody to see
it and say, hey, that's AI, even though it has
(09:12):
not stopped people from commenting on videos that have that
Sora AI watermark as if these videos are real. Which again,
I think goes to show that it's not just
that these AI videos are hard to identify
as AI, which they are.
Speaker 2 (09:27):
And it's not just that they're everywhere, it's that.
Speaker 1 (09:29):
We live in this climate that has completely eroded our
trust in everything that we see online.
Speaker 2 (09:35):
And I know people are having fun with it. I
get it. I don't want to be a party pooper,
but I just don't love it.
Speaker 3 (09:40):
I mean, I'll be a party pooper. I feel that
it's like kind of my role on this show.
Speaker 2 (09:44):
Yeah.
Speaker 3 (09:45):
I think all those problems you mentioned are also compounded
by the ADD nature of social media, where everything is
like fast cuts, super short form videos. A lot of
videos are only a few seconds long, and no surprise,
people when they're scrolling through videos are not taking the
(10:05):
time to critically evaluate is this real or not before commenting,
which is unfortunate and we should do that, but let's
be honest, a lot of people don't. I'm certainly guilty
of the same. And the algorithms in a lot of cases,
and the platforms, the way they're designed, do everything they
can to encourage people to not critically evaluate. They just
(10:29):
want to connect with feelings and emotions to keep people
clicking and engaged.
Speaker 1 (10:34):
Yes, and you know who is really raising the alarm
about that right now? Family members of famous dead people,
because Sora lets anybody use the likeness of anyone: public
figures, copyrighted characters, whoever. So to back up a little bit,
open AI's first position was that anybody who objected needed
to opt out of having their likeness used in this
way on Sora, which to me is wild and truly
(10:57):
not how consent works in any other context, where
you would have to object and opt out, otherwise it's
just understood that you've given your consent. But eventually, as
Ars Technica reports, OpenAI was forced to change the
way that Sora handles fictional copyrighted works. OpenAI CEO
Sam Altman wrote this weekend that copyright holders now have
to opt in to allow their characters in Sora 2 videos,
(11:21):
rather than opting out as it was when the service
first launched, and those folks might share in some of
the revenue from any Sora videos of their characters. So
OpenAI calls these cameos, which I do think is
a pretty kind of genius bit of marketing. This lets
living users, including public figures, opt into Sora's cameo feature
(11:41):
by scanning their own face with their phone to drop
themselves into any Sora scene with remarkable fidelity. OpenAI
says that cameo users are quote in control of your
likeness end to end, and the feature is designed quote
to ensure that your audio and image likeness are used
with your consent. Folks might have seen that Logan Paul
opted in, and since then there's been.
Speaker 2 (12:02):
A flood of AI videos of.
Speaker 1 (12:04):
Him coming out as gay, which he actually spoke up
and said, oh, I don't love these videos. They're kind
of causing me issues, but I guess he opted into them,
So what are you gonna do? Obviously, dead public figures
cannot consent to opt into this feature, and they're all
over the platform. When OpenAI was asked about this, they
basically said, yeah, that's the situation. So already I have
(12:27):
seen Sora AI generated videos of things like Stephen Hawking
in his wheelchair, but he's doing skateboard tricks on a
skateboard ramp or Kurt Cobain from Nirvana eating a bunch
of KFC chicken fingers. I don't know why one of
the early use cases in this is bringing back dead celebrities,
(12:49):
but there you have it, and that's also how you
end up with content like AI generated Martin Luther King
giving his I have a dream speech at a podium,
but instead of words, he's making racist monkey noises. I
have seen a ton of MLK AI generated content being
made with Sora. I saw content where he was eating
(13:10):
a crab boil with Malcolm X. I saw content where
instead of giving his speech, he gives that, do you
know that meme of the kid who is trying to say, have
you ever had a dream that you could do anything?
Speaker 4 (13:22):
Have you ever had a dream that you you had
your you you could, You'll do?
Speaker 2 (13:28):
You want you you could do so, you you'll do?
You can?
Speaker 5 (13:31):
You you want?
Speaker 2 (13:31):
You want them?
Speaker 1 (13:32):
Where when Martin Luther King gets up to give this speech,
that's what he says. And the wildest thing to me
is that Bernice King, MLK's daughter, says that people send
her these videos not necessarily to try to troll her,
because they don't like her, they send her these videos
because they think she might like it. And Zelda Williams,
daughter of the actor Robin Williams, who died by suicide,
(13:55):
shared that she also doesn't like this, that people will
send her these AI generated videos of her late father,
maybe thinking that she will like it or get a kick.
Speaker 2 (14:04):
Out of it.
Speaker 1 (14:04):
In a statement, she said, please just stop sending me
AI videos of dad. Stop believing I want to see it,
or that I'll understand. I don't and I won't. If
you're trying to troll me, I've seen way worse. I'll
restrict and move on. But please, if you've got any decency,
just stop doing this to him and to me, to everyone,
even full stop.
Speaker 2 (14:21):
It's dumb.
Speaker 1 (14:21):
It's a waste of time and energy. And believe me,
it's not what he would want. To watch the legacies
of real people be condensed down to this vaguely looks
and sounds like them.
Speaker 2 (14:30):
So that's enough.
Speaker 1 (14:31):
Just so people can churn out horrible TikTok slop puppeteering
them is maddening. You're not making art. You're making disgusting
overprocessed hot dogs out of the lives of human beings,
out of history and art and music, and then shoving
them down someone else's throat hoping
Speaker 2 (14:46):
They'll give you a little thumbs up and like it.
Speaker 1 (14:49):
Gross. AI is just badly recycling and regurgitating the past
to be consumed.
Speaker 2 (14:54):
You are taking in the human centipede of content, and
from the very very end of the line, all
while the folks at the front laugh and laugh, consume and consume.
That is a.
Speaker 3 (15:04):
Powerful condemnation of AI. And she's not wrong. It's really
pretty gross that this is like one of the big
use cases of this technology. And you mentioned that in
response to threatened copyright lawsuits, OpenAI changed their policy
to require copyrighted characters to be opted in. And you know,
(15:24):
that seems all well and good for the owners of
the copyrights, but that policy is doing nothing to protect
the descendants of these recently deceased public figures like Zelda
Williams or Steve Irwin's kids. And I think that's a
pretty good example of the inhumanity of these tech companies
and AI companies in particular, that they'll go out of
their way to protect copyrighted assets before protecting real humans.
(15:50):
I can't help but think of that new law they
have in Denmark to try to push back against this
that grants every living person in the country copyright to
their likeness, as AI continues to blur the line between fiction.
Speaker 2 (16:02):
And real life. I remain hopeful.
Speaker 3 (16:05):
I guess that that kind of solution to protecting people
is gonna be viable.
Speaker 1 (16:11):
Yeah, and I have to add, this is a little
bit of a tangent, but when my mom passed away
last year, my mom was somewhat of a public figure.
Speaker 2 (16:21):
She was definitely a public figure in our town.
Speaker 1 (16:23):
And I don't know, I think that people really need
to understand what it feels like when you are grieving someone,
someone that you had an intimate, private relationship with that
nobody else in the world had. Even though people might
think that they had an intimate relationship with that person,
they didn't. You know, they might have liked them,
or enjoyed their work, or respected them, whatever. When your
(16:47):
loved one, your deceased loved one, when their image and
likeness and story is taken in that way, it just
really hurts.
Speaker 2 (16:54):
And it's so confusing.
Speaker 1 (16:56):
And the fact that there's a dynamic where the people
who do this and the people that profit from it
think that you might like it just disgusts me. So
I've certainly never experienced this on the level of Zelda
Williams or Bernice King, but I know a little bit
about how invasive that feels, and then the feeling of
it being out of your control, right, the feeling of
(17:19):
I cannot stop this, I cannot stop the Internet. So
you know how they say like, once something is on
the Internet, it's on the Internet forever. Imagine that feeling
when the intimacy and the privacy of somebody that you
are grieving, somebody that you have lost, is taken and
you know there's nothing that you could do about it.
It's just out there in the ether because people decided
it belonged to them, and it doesn't. I think there's
(17:40):
really something so wrong about this. And then hearing people say, oh,
but the platform is a lot of fun, I can
see myself and my friends, look, I made a video
where it looks like we're doing a funny thing. I
don't think that's a fair trade off. But I also
understand that we don't have a culture that asks people
to understand the stakes of that trade off, only to understand
(18:00):
it if it happens to them, and I hope it
doesn't because it fucking sucks. So yeah, it's a little
hard for me to see people just talking about how
they don't care about any of those issues that you
just brought up, Mike, because this platform it's fun for
a laugh and it gives them a dopamine hit and
they can make a little bit of content with their
friends and get twenty likes on it or whatever.
Speaker 3 (18:20):
Boy, Yeah, thanks for sharing that. You mentioned consent a
little bit earlier in this discussion, and I do think
that's a good framework to understand this where I get
the sense that a lot of these videos that people
are making are just for laughs or for trolling, which
is messed up in its own way. But it's really
interesting that some of these people sending these AI slop
(18:43):
videos to people that feature their deceased parents thinking that
they're going to like it. It just seems really interesting,
and that seems like something different than just fun and lols.
It seems like this technology is enabling the creators of
these videos to be in relationship with the deceased person.
(19:05):
And you know, it's certainly not new that celebrities have
had fans where the fans want to be in relationship
with the person. And I think that's a big you know,
it's related to why people listen to podcasts. People like
to connect with the hosts that they like, and you know,
it can feel like a bit of a parasocial relationship,
(19:26):
and hosts like to connect with their fans as well,
but there's a separation there. I think there's a way
to look at this technology where it is a way
to non consensually break down that separation and be like, no,
we are in relationship whether you or your deceased parent
consent to it or not. And it's gross and people
(19:49):
shouldn't do.
Speaker 2 (19:49):
It, and not for nothing.
Speaker 1 (19:51):
I promise you that if Zelda Williams or Bernice King
wanted to use AI to make this kind of content
about their dead fathers.
Speaker 2 (19:58):
They could.
Speaker 1 (19:59):
You don't need to send it to them. They have access
to the same Internet that everybody else has access to.
Speaker 2 (20:04):
If they wanted to make these videos, they could.
Speaker 3 (20:07):
Yes, and making them with AI strips the uniqueness of
these figures that made them the exceptional public figures that
they are. Every time you run something through AI, it
just makes it feel right based on all of the
other data that was used to train the model, and
(20:30):
strips away a little of that uniqueness. So in using
these people, these deceased figures, it diminishes them, which is
another part of what's, I don't know, not great about it.
Speaker 1 (20:45):
When Zelda says that it's the human centipede
of content, that's exactly what she's referring to, I think.
And yeah, now I've changed my mind. I don't
think Sora is good. I don't mind being a party
pooper about this one.
Speaker 2 (20:58):
Not good. I don't like it.
Speaker 5 (21:00):
Yeah, let's take a quick break. And we're back.
Speaker 3 (21:19):
What else don't you like, Bridget?
Speaker 1 (21:23):
Well, I don't love what Google is allegedly doing to their women in tech initiatives.
So this is a little bit of a developing story.
Is Google dropping their Women Tech Makers initiatives? So I
saw this post in the Women in Tech subreddit
on Reddit. It reads: I was part of Google's
initiative Women Tech Makers or WTM for short. Our job
was to organize events and help other women find their
(21:43):
way in the tech sphere. Occasionally we would be invited
to Google events with travel costs covered and merch. Nothing big,
but it was nice to be acknowledged while also supporting
women in STEM. Since the end of last year, beginning
of this one, support was dwindling. First they stopped hosting webinars,
then made us request funding pre-approval and would only
pay us back what they thought was reasonable, and that
(22:03):
would be it; we spent our own money out of pocket.
The final blow is when they invited us to the
event in Europe and provided zero travel cost funding for us,
while Google Developers Group, their other initiative, received one
hundred percent of their travel paid for. Like that wasn't
bad enough, they didn't allow us to come to the
community mixer.
Speaker 2 (22:22):
Women Tech Makers was
Speaker 1 (22:23):
The only community not allowed to join the community event.
Now they straight up kicked us out of the platform, removing
all access to the resources and rehomed us to
some nonprofit, painting it as a fantastic opportunity and how
cool it would be. Seven years of being an ambassador,
curated and organized more than forty events, including huge IT
(22:43):
conferences with four hundred plus attendees, a team of thirteen women;
everything is now in shambles and demoralized because the Big
G decided that the community that was promoting and supporting
women in STEM is too DEI for their brand. So
that sounded pretty messed up to me. A lot of
the comments were essentially saying, yes, this has been the
vibe, you know, their support has been dwindling.
(23:05):
So I did some digging and I found a piece
that explains that Google officially did hand over their Women
Tech Makers to a nonprofit called Technovation, a nonprofit that
says that they've spent nearly twenty years empowering young innovators
to solve real world problems through tech and entrepreneurship. I
want to be clear that Technovation sounds good. Like, this
is not to besmirch them at all. They seem
(23:27):
genuinely interested in supporting women tech makers, and it is
entirely possible that they're capable of doing more with this
program and taking this program further than Google ever did
or ever would, so I'm not trying to besmirch
them at all. But even the name Technovation, like taking
identity out of it specifically, is telling to me. And
(23:49):
I also really couldn't find this explained anywhere. If this
was really meant to be this fantastic thing, this fantastic
partnership with these great opportunities, like Google told the Women
Tech Makers ambassadors that it was going to be for them,
I would then expect Google to make some comment about
it somewhere or announce it on the Google blog. However,
(24:11):
it seems like Google is keeping this move a little
bit quiet. When you go to the Women Tech Makers site,
it now just says Women Tech Makers is now with Technovation,
a trusted organization dedicated to empowering girls and women in STEM,
and then redirects you to the Technovation site in a
piece called Google Women Tech Makers Moves to Technovation, But
Not Everyone Is Cheering, it explains, quote: For many longtime ambassadors,
(24:35):
this transition feels less like a celebration and more like
closure for something that had already been quietly fading. Several
members say the signs were there as early as late
twenty twenty four, with dwindling support, fewer sponsored events, and
an overall sense that Google's enthusiasm for the program was waning.
Speaker 3 (24:52):
That is disappointing and just the way that this fits
so neatly with the larger trends of companies that aren't
just retreating from diversity commitments that they have made and
have supported for years, but seemingly like performatively running from
them to demonstrate to the Trump administration and their allies
(25:13):
how anti-woke they are. It's just so disappointing and depressing.
But I guess it really shouldn't be surprising that, you know,
these giant tech companies were not as firmly behind those
commitments as they claimed to be. Back when that was
(25:33):
the in vogue thing to do.
Speaker 1 (25:35):
Yes, and when it was the in vogue thing to do,
they did it with such fanfare, such pr pushes. They
really wanted a lot of attention for doing this, and
now that they're retreating, they want to do it quietly.
And I just don't think that's fair. If you
said it with your full chest when it was in
vogue that you were doing this, say it with your
full chest that you are abandoning these commitments, that you no
Speaker 2 (25:58):
longer want to do that, and that's what you're going
to be doing. I don't like that they're doing it quietly.
It feels a bit sneaky to me.
Speaker 1 (26:05):
And Google can say whatever they want, but I do,
as you said, think it's a clear symbol of how
vocally championing women and other marginalized folks in technology is
so easily and callously discarded by some of these big
companies and institutions, especially given that tech layoffs are rampant
right now and that they have disproportionately impacted marginalized people
(26:28):
like women. One study by Eightfold AI found that
women are more than sixty five percent more likely than
men to lose their jobs in tech layoffs. The
Women Tech Council reported that women were one point six times
more likely to be let go than their male peers.
Another study from Revelio Labs back in twenty twenty two
found that black workers make up six point zero five
percent of the tech workforce, but accounted for seven point
(26:49):
forty two percent of all tech layoffs, so they were
laid off at a higher rate than their proportion of
tech employment. Same is true for Latino or Latinx workers
who are definitely overrepresented in tech layoffs. And so I think,
against that backdrop, Google also backing away from all of
these commitments that they made for marginalized people in STEM
and tech, it's just craven to be so sneaky about
(27:11):
it against that backdrop.
Speaker 3 (27:12):
Yes, and if the MO here is to do
it quietly, you know, and we still notice some of
these big things, like you know, supporting an initiative to
organize conferences for women in tech. When the funding for
that goes away, people notice. There's no way to
that like completely behind closed doors because it was a
(27:35):
public basing event. But things like employment and who gets
laid off or not, that's so much harder to see.
And so if they're doing the publicly visible stuff quietly
on the down low, it just makes you have to
wonder what is going on with things that are less visible,
Like you just mentioned those employment or those layoff statistics,
(27:57):
and also, those layoff statistics, you know,
you mentioned that women are more likely to be
laid off during this round of tech layoffs, that black workers
are, that Latinx workers are, and those are independent effects.
I have to suspect, based on decades of evidence, that
(28:20):
the interactive effects, the intersectional effects are even greater, right
where, like, you can have the independent effect of, if
you're a black tech worker, you're more likely to be laid off,
as that Revelio Labs study found, if you're a woman,
you're more likely to be laid off than your male peers,
(28:41):
as that Women Tech Council report found. I don't have
the numbers in front of me. But I would bet
good money that even above those independent effects, being a
black woman in tech right now puts one at even
greater risk of layoff than just the independent and
additive effects of being black and a woman. We've seen
(29:03):
that so many times in analyses of labor statistics.
Speaker 1 (29:09):
Oh, absolutely right. I mean, I can
confirm, they probably wanted to get rid of us first,
you know, like, let's use this tech climate to get
her out of here.
Speaker 2 (29:19):
And I think, to put all of this back into.
Speaker 1 (29:21):
Perspective, even if you don't work at one of these companies,
it is better for all of us.
Speaker 2 (29:26):
It is safer for all of us.
Speaker 1 (29:27):
Our tech landscape is better when the people who are
making the decisions about the technology that we all use
look like the people who will use that technology. And yeah,
it's not just about, I know, I've said this a
million times, inclusion at tech companies is not just the
right thing to do ethically, which it is. It also
makes technology and the entire tech landscape better and safer for
(29:49):
all of us.
Speaker 2 (29:50):
Everyone.
Speaker 3 (29:51):
We just talked about Sora from a consent framework, right,
how it's, you know, from a certain perspective, it is
violating consent in a way that harms people. You
have to suspect that if most of the people working
on that project were women, they might be weighting consent
a little bit higher in their policies.
Speaker 2 (30:11):
Oh.
Speaker 1 (30:11):
It reminds me so much of the story we talked
about when Instagram rolled out that feature that displayed all
of your stories on an interactive map. I think that
how they rolled that out would be totally different if
they had more women making decisions about that kind of technology.
Speaker 2 (30:28):
The way they, I really will take this to the bank,
the way they rolled
Speaker 1 (30:31):
Out a tool that allows you to see on a
map where.
Speaker 2 (30:35):
Photos were taken.
Speaker 1 (30:36):
I think if they had more women making decisions, they
would have rolled that out a little bit differently, Because
most women will tell you, Yeah, I don't really love
the idea, even if it's not, you know, tracking my
real time location. I don't love the idea of giving
the Internet or whoever a visual map of my frequent locations.
Speaker 2 (30:53):
Yeah.
Speaker 3 (30:54):
I mean, this conversation is getting pretty close to the
core essential premise of this show, which is that women
and girls are so often the subject of the Internet,
but often left out of decisions about how it is
shaped and how it functions.
Speaker 1 (31:15):
Yeah, that reminds me of this Threads post that I
saw from Kento Morita that says: us: we can make anyone,
any being in the world with this new technology. They
can be genderless, they can be a mystic creature. What
should we do first? Developers: make a woman that does
what we tell them.
Speaker 2 (31:31):
It references Tilly Norwood,
Speaker 1 (31:33):
that AI actress we talked about last week, Alexa, Siri,
Google Assistant, Eliza. Like, the way that consuming and controlling
women and girls is at the core of our entire
tech ecosystem.
Speaker 2 (31:46):
It just is. You see it everywhere.
Speaker 3 (31:49):
Yeah, and yeah. We see it in the content that
is produced. We see it in the boards and leadership
of tech companies, and we see it in the images
and the language that makes up the Internet.
Speaker 1 (32:06):
So it's funny that you bring this up because I
was reading this study that basically suggests that on the Internet,
in technology, all the grown women are actually girls. This
is according to new research from Solène Delecourt, who is
an assistant professor at UC Berkeley's Haas School of Business
and the co author of this study published in Nature,
where researchers looked at images and text across some of
(32:29):
the most well trafficked spots on the Internet, ChatGPT, Google, Wikipedia,
IMDb and found that women are regularly depicted as younger
than men and devalued in real world spaces because of it.
Speaker 3 (32:43):
This was such a cool study and just staggeringly impressive
in its methods and its scope and its ambition. I
just wanted to shout that out before you get into it.
They had over six thousand human coders and over six hundred
and fifty thousand images. It's like a breathtaking amount of
(33:05):
data and it's orders of magnitude more data than many
studies that look at similar topics.
Speaker 2 (33:11):
We'll link to it in the show notes, and.
Speaker 3 (33:13):
Also I want to give them credit for like it
is a pretty accessible read as far as studies go.
I found it quite accessible.
Speaker 1 (33:20):
Yeah, as our resident podcast scientist, I knew that you
would be excited to talk about the methodology.
Speaker 2 (33:27):
How did I know that you were going to.
Speaker 1 (33:29):
Have a little big up to the methodology used in
the study. But I read a summary of the study
in Mother Jones called It's true the Internet skews the
reality of women and men in the workforce, and it
really is quite interesting. So basically the researchers analyzed over
half a million images from Google Search in which women
consistently appeared younger than men. This serves as a measure
(33:50):
of cultural bias because it's basically trying to give you
content that you are most likely to click on. That
is from co author Douglas Guilbeault, who's an associate
professor at Stanford. He said that it has a way of
being prone to bias because it ends up just amplifying
whatever most people click on. So across the Internet, women
are most commonly shown in their twenties, while men are
(34:10):
usually shown in their.
Speaker 2 (34:11):
Forties or fifties.
Speaker 1 (34:13):
And it's not just that women in these images look younger.
Often they are younger on IMDb, in Wikipedia. The researchers
were able to collect information about the actual ages of
the people in these photos that were all over the Internet,
and that trend really persisted. Importantly, this trend does
not accurately reflect reality. In both online images and text,
(34:36):
the researchers found similar skewed depictions of men and women
in thousands of occupations and other categories. But in census
data, across most of those fields examined, there were no
age differences among men and women. In the few professions
where an age gap existed in the census data, the
women tended to be older on average than men, but
(34:56):
the online images presented an inverted picture. The pattern we see
in the data just does not match reality,
Speaker 2 (35:02):
Delecourt said. The
Speaker 1 (35:03):
Average woman in the US and actually in the world,
has a higher life expectancy. The average woman is older.
So what we see in online images and text and
videos is wrong. So the Internet is really out here
making us think that the only women around are younger
than we actually are, even in situations where the data confirms, okay,
women in these specific positions do tend to be older.
(35:26):
Even in those situations, the women are still reflected as
younger than we actually are, not just in the United States but globally,
and since, as the study said, this is all being
shaped by our biases, I guess this is just anti
older woman bias being reflected back at us via the Internet.
Speaker 3 (35:43):
It speaks to this assumption that underlies so many of
these large language models and AI tools. There's this assumption
that the training data, right, they're scraping data from the Internet,
all the different corners of the Internet, these huge corpuses of text
and images, and I think there's often an assumption that
(36:05):
that data accurately represents the real world, the natural world,
but it definitely does not. It reflects what people want
to see on the Internet, what platforms want to serve
up to people on the Internet.
Speaker 2 (36:22):
And so it.
Speaker 3 (36:23):
Is not at all surprising that the distribution of women's
ages in photos on the Internet is not a perfect
match for the distribution of women's ages among the living
women in the world. These are two different things. The Internet,
the images that are on the Internet are serving a
(36:46):
particular purpose, and like, in that sense, it's not a
new idea, but I think it's very easy to forget
that fact when we're talking about training data or a
bunch of other phenomena about the Internet. And I think
this study has done a really good job of
(37:08):
numerically quantifying that gap, that difference between the real world
and the data that exists on the Internet.
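To make that idea of numerically quantifying the gap concrete, here is a minimal sketch in Python. Everything in it, the ages, the census medians, the variable names, is a hypothetical placeholder rather than the study's actual data or code; it only shows the shape of the comparison, the ages at which each gender is depicted online versus a census-style real-world benchmark.

```python
# Minimal illustrative sketch: hypothetical numbers, not the study's data.
# It compares the median age at which each gender is depicted in online
# images against a census-style benchmark for the same population.
from statistics import median

# Hypothetical ages coded from online images (e.g., search results)
online_ages = {
    "women": [24, 26, 27, 28, 29, 31],
    "men": [38, 41, 44, 45, 47, 52],
}

# Hypothetical median ages for the same group from census data
census_median_age = {"women": 44, "men": 43}

for gender, ages in online_ages.items():
    depicted = median(ages)
    gap = census_median_age[gender] - depicted
    print(f"{gender}: depicted median {depicted}, "
          f"census median {census_median_age[gender]}, gap {gap:+} years")
```

In the study's actual data, that comparison is the point: the depicted ages and the census ages diverge sharply for women and barely, or in the opposite direction, for men, which is the inverted pattern described above.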
Speaker 1 (37:17):
Yeah, and something else about that study is that they
really make it clear that even though these images do
not reflect reality, they go on to shape our perception
of irl reality.
Speaker 2 (37:28):
This age gap.
Speaker 1 (37:29):
myth impacts how people view women in the workplace. As
part of the study, the researchers asked participants to find
photos of people working across different professions. When the participants
selected photos of women, they assumed that the people with
that job would generally be younger and have less experience.
Speaker 2 (37:44):
And it is funny to me, just as.
Speaker 1 (37:46):
A woman of a certain age, you know, I do
think that people have this idea that when you hit
thirty as a woman, you basically dig a grave and
crawl into it and die.
Speaker 2 (38:00):
You are pretty much out to pasture.
Speaker 1 (38:03):
I do think that there is an impression that there
aren't women in their thirties, forties, fifties, sixties out here
living our dynamic lives, that once you hit a certain
age as a woman, you just go out to a lovely,
idyllic farm and they give you some overalls, and you're
never to be heard from again. You're certainly not in public,
you're certainly not making podcasts and living your life and
(38:24):
on television. They just give you a pair of overalls,
a sun hat, some clogs, a thriller novel, and a
mug of cranberry tea and they send you to that pasture.
Speaker 3 (38:34):
I mean, except for the part of being sent out
to pasture, that does kind of sound like something that
you would like. No, being a woman of
Speaker 1 (38:40):
A certain age fucking rocks, right. You know, first of all,
you don't give a shit what anybody has to say
about you. You're like, whatever you say about me, go
ahead and say it. That's fine with me. You completely
stop giving a shit about the way that you present physically.
All of that stuff is for you, and anybody who
doesn't like it can kick rocks. You basically become whoever
it was you wanted to be or were at fifteen
(39:01):
or sixteen. But this time around, you're totally cool with it.
You've got zero insecurity about it, and if you're lucky,
you got a little bit of coin to support whatever
that thing is. Tea, herbal tea really hits different. I
feel sometimes I'm drinking a lot less
Speaker 2 (39:14):
these days, and sometimes a good mug of herbal tea really
does hit different.
Speaker 1 (39:18):
Aging is awesome, and anybody who tells you otherwise just
hasn't hit that age yet.
Speaker 3 (39:23):
I've heard people say that this is what they love
about the Real Housewives franchises, that they show women in
this age bracket that you're talking about, just demolishing those
stereotypes of what they're supposed to be doing with their lives.
Speaker 1 (39:40):
I'm sure that you got that from me, because that
is exactly what I love about these shows. Like,
Ramona Singer is almost seventy years old. Where else are
you going to see a woman in her late sixties
throwing back box wine.
Speaker 2 (39:53):
and throwing ass? Where else are you going to
see that? I want to see that. I want.
Speaker 1 (39:57):
I think we need reminders that people live full, dynamic
lives into their seventies. We need to dispel this notion
that you just fall off a fucking cliff or crawl
into a grave once you hit thirty, especially as women,
and there's not really a lot of places where you
get to see what that looks like. Bravo, for better
or for worse, is giving it to us, and I
(40:20):
like it.
Speaker 2 (40:20):
I don't know. I feel like every conversation comes
back to Housewives
Speaker 3 (40:23):
with me. You know, I'm like an algorithm. I'm
just trying to give you what you want.
Speaker 2 (40:29):
Let's talk about housewives more.
Speaker 4 (40:32):
After a quick break, let's get right back into it.
Speaker 2 (40:47):
Okay.
Speaker 1 (40:47):
I have to give a trigger warning to this story
because it is very disturbing, but I don't think it's
gotten a ton of coverage, so I did want to
quickly talk about it. And that is a story out
of New Jersey where seventeen year old Vincent Battiloro was
charged with intentionally hitting and killing two seventeen year old girls,
Isabella Salas and Maria Niotis at seventy miles per hour
(41:08):
with his car as they rode e-bikes around their house.
Speaker 2 (41:12):
So, before this, Vincent shared these rambling live streams.
Speaker 1 (41:17):
He was a YouTuber who would post about sports, but also,
according to New Jersey Today, would talk about his
need for vengeance against one of these girls. He would
post these rants on YouTube live about false allegations against him,
including child sexual exploitation material and this one girl from
his school who was causing him legal trouble, he said.
Speaker 2 (41:38):
So he really.
Speaker 1 (41:39):
Blamed seventeen year old Maria, one of the girls that
he would go on to kill and her mother for
I guess legal trouble and legal problems he was having at.
Speaker 2 (41:48):
School, and he began harassing her.
Speaker 1 (41:51):
According to her neighbors and family, he delivered pizzas to
her house, the kind that you're meant to pay cash
for upon delivery, as some kind of a bizarre revenge
for what he says was a critical post on social
media that she made about Charlie Kirk after Kirk's death.
So after he delivered these pizzas to her house, Vincent
went on YouTube and said, whenever Maria sees the pizza guy
(42:12):
come, better think of Charlie Kirk for making fun of
his fucking death. Stupid ass clown, just remember that. So
he started stalking her, even staking out her home in
the months before he struck and killed her and her
friend Isabella as they rode their bikes on the street.
So her mom basically had been reporting this creep to
the police for harassing her daughter for a while, and
(42:33):
it just genuinely sounds like nothing was done. According to
NJ Advance Media, Vincent even referenced all of this, the fact
that her mom had been reporting this and nothing was done,
on his live stream. He referenced a police investigation several
times on his live stream this summer, saying, because I
got into her relationship business and now the school, and
now I'm going to tell you all this. Cops got
(42:54):
involved and the school got involved in this shit, somehow
they're believing this crap. They suspended me basically indefinitely
until they figured it out, which again makes no sense,
But by June, he said, the police told him quote
that the case is going to be dismissed and I
am not going to be facing any charges, he said
during a live stream. He also compared himself to an
ex Dodgers pitcher, Trevor Bauer, who was suspended over sexual
(43:16):
assault allegations back in twenty twenty one but never faced
any criminal charges. Vincent said on his live stream, quote,
I'm basically cleared of any wrongdoing.
Speaker 2 (43:24):
It says a lot. It says a lot.
Speaker 1 (43:26):
If they're closing the case on me for this and
they're bringing me back to school, it says how much
of an innocent I was in the first place. Now
people are trying to treat me like Trevor Bauer to
make me feel like.
Speaker 2 (43:35):
The bad person.
Speaker 1 (43:36):
So I wanted to talk about this one because I
don't feel like it's gotten enough coverage that this person
who was holding a vendetta against one of these
girls for comments she allegedly made about Charlie Kirk ended
up murdering her after harassing her both in person and online.
Speaker 2 (43:51):
And I wanted to bring it up because it.
Speaker 1 (43:53):
Is absolutely tail as old as time that a woman
or a girl reports dangerous behavior that typically has an
online component. Nothing is done, it is belittled, it is dismissed,
it is chalked up to just oh, it's just internet whatever.
And then when nothing is done, it leads to larger
violence that could have been prevented. It sounds like
(44:16):
to me, this was preventable, the fact that this girl's
mom had been reporting this creep for a long time
and there's a documented, even he talks about this documented
history of these reports, saying, oh, they found out
I was innocent. They dismissed it, there was nothing there. I
was innocent. I was innocent.
Speaker 2 (44:31):
Just goes to show that.
Speaker 1 (44:32):
When we do not take online harms against women and
girls seriously, it has violent consequences.
Speaker 3 (44:38):
Absolutely, what a tragic story. I cannot begin to imagine
the rage that that mom must feel that she clocked
this guy as a threat, she was reporting him to
the authorities, and nothing was done, and then he made
good on that threat.
Speaker 2 (44:58):
It's such a failure.
Speaker 3 (45:00):
And I don't know enough about this. I
don't want to just blame the police. I don't want
to blame the school. It's hard to know exactly who
should be responsible and accountable here, but clearly this was
a huge failure and it's.
Speaker 1 (45:16):
So tragic, tragic and preventable. Okay, well, switching gears a
little bit, will you allow me to talk about Drake
really quickly?
Speaker 2 (45:27):
Yes? Please?
Speaker 3 (45:27):
We need a palate cleanser after that last story.
Speaker 1 (45:30):
Okay, so do you understand how deep a diss track
has to hit for the target of said diss track to
try to sue over it? Because that is what Drake did.
So after Kendrick Lamar's epic diss track Not Like Us, which
basically was the song of the summer, where Lamar heavily
(45:50):
implies that Drake is a pedophile, and I mean, not
for nothing, Drake did have some kind of a friendship
with Millie Bobby Brown when she.
Speaker 2 (45:59):
Was fourteen and he was.
Speaker 1 (46:00):
Thirty one, where Millie Bobby Brown said, oh, we're buds.
We text all the time and I don't know, maybe
this is just the old woman in me talking, but
what the fuck would a thirty one year old have
to be texting.
Speaker 2 (46:10):
A fourteen year old about? Nothing good, exactly.
Speaker 1 (46:14):
So, as we like to say in the South, a
hit dog will holler, I guess. And Drake was so
hit that he took it to the courts.
Speaker 3 (46:22):
Damn, that is sad. Like, I grew up in the
nineties when rappers were shooting each other, right, which is
obviously like very bad, but like, we have come a
long way. To file a lawsuit? Like, come on, Drake.
Speaker 1 (46:40):
Well, this week the court deemed that a song that
implies someone is a sex criminal is not legally actionable.
Speaker 2 (46:47):
So this is from Billboard.
Speaker 1 (46:48):
A federal judge dismissed Drake's defamation lawsuit against Universal Music
Group over Kendrick Lamar's Not Like Us, ruling that a
quote war of words during a heated rap battle did
not violate the law and that some of the case
was logically incoherent.
Speaker 2 (47:04):
So Drake filed this suit.
Speaker 1 (47:06):
Earlier this year, claiming that UMG, the music company, defamed
him by releasing Kendrick Lamar's diss track, which tarred him as
a quote certified pedophile. He likes to call himself a
certified lover boy, more like a certified pedophile. Drake argued
that millions of people took that lyric literally, severely harming
his reputation, but this week the judge granted UMG's motion
(47:29):
to dismiss that case, ruling that Kendrick's insulting lyrics were
the kind of hyperbole that cannot be defamatory because listeners
would not think they were statements of fact. The judge ruled
that quote the artists' seven track rap battle was a
war of words that was the subject of substantial media
scrutiny and online discourse. Although the accusation that the plaintiff
is a pedophile is certainly a serious one, the broader
(47:52):
context of a heated rap battle with incendiary language and
offensive accusations hurled by both participants would not incline a
reasonable listener to believe that Not Like Us imparts
verifiable facts about the plaintiff.
Speaker 3 (48:04):
This, I mean, it reminds me of the classic example
of Bob Marley's track, right, like when he says I
shot the sheriff. No one thinks that he actually shot
a sheriff. Right.
Speaker 2 (48:14):
This is singing. This is, it's art.
Speaker 1 (48:17):
So I do have a point that I do feel
I need to make whenever we're talking about this Drake
and Kendrick Lamar beef, which is that ultimately we are
talking about allegations of harm against women and girls, and
I think it's important that that not get lost in
all the hoopla and fanfare around diss tracks and all
of that. I think it's incredibly easy for the fact
that we're having a conversation about actual harmful allegations to
(48:41):
go overlooked because we're talking about something that is so
amplified and big.
Speaker 2 (48:45):
So I do think it's important to just name that.
But also it reminds me of.
Speaker 1 (48:51):
I don't even know if I should tell that story
on the podcast, but I once got in fairly serious
trouble at a job because of a karaoke song choice
that I did during a work conference where I sang
LL Cool J's Mama Said Knock You Out, and I
dedicated it to.
Speaker 2 (49:09):
Let's just say some public figures, and.
Speaker 1 (49:14):
And then I nailed the rendition. That is my song.
If you've ever heard that song, it's like
a tough one to sing. But I didn't really think
about the lyrics that are actually quite violent, and I didn't,
you know, when I got up to do this karaoke song,
I wasn't thinking this is going to be taken as
a literal promise or a literal threat. After
(49:35):
the performance, I'm slapping high fives, I'm feeling good.
Speaker 2 (49:38):
The next business day.
Speaker 1 (49:40):
The next Monday, my manager calls me into her office
and she's got the lyrics to Mama Said Knock You
Out printed out for me to look at. And I
know this song from singing it at the gym and
from being a fan of LL Cool J, but I had
never actually sat down and looked at these lyrics, which
are quite explicitly violent. Like, in the song LL
(50:02):
Cool J explicitly says these are lyrics that will make
you call the cops. Like, there's one particular line
that he says, don't you never ever pull my lever
because I explode and my nine is easy to load.
And I remember my boss reading that to me and
thinking like, oh, is this really the kind of song
you want to be singing on stage.
Speaker 2 (50:22):
with such gusto? And you know, she was right.
Speaker 1 (50:25):
Looking back now, I can see how it wasn't super appropriate.
It wasn't super cool, and I understood that, but yeah,
I wish, I wish I could, I wish I could
have pointed to this judge's ruling, saying, ah, a judge
has ruled that we don't take song lyrics literally, even
at a work karaoke event.
Speaker 3 (50:41):
I'm so glad you told that story. I think it's
so funny. I've heard that story before, and I just
love to picture a young Bridget like screaming that into
the mic. I bet you did kill it. Not, like,
literally killed anything or anybody, but like.
Speaker 2 (50:52):
No one is literally wanting to kill anybody.
Speaker 3 (50:55):
The other funny thing about this Drake story is that
that same judge, Jeannette Vargas, she's been involved in a
lot of these Trump cases and, like, most notably earlier
this spring, was like weighing in on whether DOGE and
Elon Musk could access Treasury information. So she's been involved
in like really high profile, high stakes cases affecting like
(51:17):
the way that America functions. And I just find something
funny about the fact that she had to take time
out from that to weigh in on Drake's hurt feelings.
Speaker 1 (51:26):
I know, I wonder if she was, like, you know
those half glasses that judges wear kind of at the
bottom of their nose. Was she poring over these lyrics
and watching Kendrick Lamar kind of wink wink at the
camera during a Super Bowl performance, saying, Oh, Drake likes
a minor.
Speaker 2 (51:42):
You know, was she poring over all of this? I'm
so curious.
Speaker 3 (51:46):
Yeah, I mean she seems like a thorough judge, so
I suspect she probably was.
Speaker 1 (51:51):
That's so funny the world we live in where one
day you're making a ruling on high level Trump administration
shenanigans and the next day you're mediating a beef between
Drake and Kendrick Lamar.
Speaker 2 (52:02):
What a country? Yeah, what a country?
Speaker 3 (52:06):
So, Bridget, we got to wrap it up here, but
I wanted to promote the mail bag, if
that works for
Speaker 2 (52:13):
you. Please promote the mail bag.
Speaker 3 (52:15):
Yeah, so we're gonna do a mail bag episode. We've
gotten some great submissions. Thank you so much to everybody
who wrote in.
Speaker 2 (52:22):
We need just like a
Speaker 3 (52:23):
Couple more to really have like a full episode. So
if you're listening, please write in with a mail bag question.
It doesn't even need to be a question. It could
just be something that you're thinking about, uh, that you
want to get Bridget's take on. Just write in and
let us know. Or if you do have a question
for Bridget or for me, or about the show, or
(52:44):
about anything, something you want us to look into for you,
happy to do it. But please write in to hello at
tangoti dot com, and we're gonna sweeten the pot:
if you want a free sticker, you get
a free sticker for writing in. We've got these cool
new TANGOTI stickers. We'd love to give you one. Just
(53:04):
include your address. We're not gonna use it for anything
other than sending you a sticker. But yeah, write in
and listen.
Speaker 2 (53:10):
No, I don't know.
Speaker 3 (53:11):
We're gonna leave it open for one more week. We're
gonna extend it a little bit, so Bridget, do you
have anything to add to that.
Speaker 2 (53:17):
The stickers are vinyl. You can put them on water bottles,
which I love.
Speaker 3 (53:21):
Yeah, they're high quality stickers. These are not the
cheap things. They're the good ones.
Speaker 2 (53:27):
So write in, let us know. We want to do
this mail bag episode.
Speaker 1 (53:30):
If we don't get enough submissions, I'll just answer the
questions that we got. I guess to the people, to
the two people who asked questions, I will just answer them.
But yes, Mike, thank you for being here as always,
and thanks to all of you for listening.
Speaker 2 (53:44):
I will see you on the internet.
Speaker 1 (53:51):
Got a story about an interesting thing in tech, or
just want to say hi? You can reach us at hello
at tangoti dot com. You can also find transcripts for
today's episode at tangoti dot com. There Are No Girls
on the Internet was created by me, Bridget Todd. It's
a production of iHeartRadio and Unbossed Creative. Jonathan Strickland is
our executive producer. Tari Harrison is our producer and sound engineer.
Michael Amato is our contributing producer. I'm your host, Bridget Todd.
(54:14):
If you want to help us grow, rate and review.
Speaker 2 (54:15):
Us on Apple Podcasts.
Speaker 1 (54:17):
For more podcasts from iHeartRadio, check out the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.