Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
So last week we hit pause on our usual Friday
news round-up so we could dig into a different story,
and that is what is being erased from Charlie Kirk's
legacy after his murder. That meant no news round-up,
but don't worry, we've got it for you now. It
is a fun one, so please enjoy.
(00:21):
There Are No Girls on the Internet is a production of iHeartRadio and
Unbossed Creative. I'm Bridget Todd, and this is There Are
No Girls on the Internet. Welcome to There Are No
Girls on the Internet, where we explore the intersection of technology,
social media, and identity. And this is another installment of
our weekly news roundup where we dig into some of
(00:43):
the stories that you might have missed so you don't
have to. I am so thrilled to be joined by
this week's guest host, Ashley Ray, the host of
TV I Say, a podcast that I've loved for a
really long time. It was on hiatus, now it's back
on the Courier Network. Ashley, thank you so much for
being here.
Speaker 2 (01:01):
Thank you for having me. I am so excited and I'm a
big fan of yours. Oh my goodness. Well, this is great.
Speaker 1 (01:06):
So you know this is a tech podcast, but it's also
a culture podcast, which includes TV, which I watch a
lot of. For folks who don't know, Ashley has the
most correct television opinion out of anybody out there. So
if you disagree with Ashley, you're basically just wrong. You're
wrong about TV. I just finished listening to your
episode with Gibson Johns, which was delightful. I will say, like,
(01:29):
something that I was so curious to ask you about
that episode is that you know, you were talking about
reality TV, which I'm obsessed with. I watch a lot
of it. Uh, and it sounded to me that you
both were kind of agreeing that you don't kind of
like get mixed up in the online fandoms. That is
something I've really noticed too. You can't just watch and
enjoy a show like Love Island. Like, let's
(01:51):
say that you have an inconsistent opinion about somebody on
Love Island. There's this, like, rabid fan base dominating
the conversation. Do you just stay out of it?
Speaker 2 (01:59):
I mean I got called out for that very reason.
Speaker 3 (02:03):
Uh.
Speaker 2 (02:04):
This last season of Love Island, I was a big
Olandria fan. I loved her throughout the season. I was
rooting for my girl. I wanted her to leave Taylor
the whole time, and, you know, it was so
frustrating when she just, like, kind of wouldn't do
it and her friends were like, no, you should try
to get it, and I was like, I'm so frustrated.
You need to play the game differently and go for
what you want. Girl, Like, I'm so tired of this.
And people were like, how dare you criticize her?
(02:27):
Didn't you just say you liked her and now you're
criticizing her? And I was like, so you can like
someone and also not like everything that they do.
And people were like, I don't understand.
And then I was happy when she got with Nick,
and people were like, how can you be happy when
you were just criticizing her?
Speaker 1 (02:43):
And I'm like, because she did what she wanted. She
went for what she wanted.
Speaker 2 (02:48):
And it's just wild. Like, stan culture, all
of it, I don't quite get. There are some
docs out about it, actually, if you want to check
out Stans on Paramount Plus. People really really intensely love
to love someone and reality stars, I guess are the
perfect target.
Speaker 1 (03:08):
I've been a reality TV fan for a very long time.
You know, some of the shows that I have watched,
I have very long relationships with. I'm thinking
about shows like OG Real Housewives of New York. Yeah,
I feel like the stan, stan-dom, online fandom stuff, I
feel like it's gotten so much
(03:28):
more toxic and so much louder, And it didn't used
to be like that back in the day. You would
read what Kelly Bensimon had to say on the Bravo blog, right,
but it didn't really, like, you didn't have
people watching, engaging in this level of fandom online around
it and looking at your every move.
Speaker 2 (03:46):
I just think Love Island is the sort of perfect
example of how bad this whole ecosystem has gotten, where
they went out and looked for things in these
people's lives because they didn't like them, just so they
could pull that out and cancel them. People were calling
ICE on someone in the cast's family, like people were
(04:06):
calling CPS on Huda and her daughter and her baby daddy.
Like that's where I just I don't understand. I also
have been watching reality TV since The Osbournes, since,
you know, since The Simple Life. Like, we didn't do this.
Like, I was never watching The Osbournes and I was like,
I need to know Ozzy's every thought and who he's
(04:27):
friends with, and I need to follow him on every channel.
And now the expectation is like they need to
immediately leave the show and they need to be on
Snapchat and this and this, and I should be able
to know at all times what events they're going to.
Are they still dating this person? And I don't know
exactly why people think they have that level of access,
or why they deserve it, or why they think it's
(04:49):
mandatory if you're on television. But I don't think that
we can demand that of people just because they go
on a reality show.
Speaker 1 (04:57):
No, and it also just makes it less fun. Yeah,
I want to watch a silly, fun show. I don't
want to have everything this person has ever done and said
dragged up on social media and then be expected to
litigate it myself. I just want to watch the TV show.
Speaker 2 (05:10):
Yeah, like it's a political statement if I, like, watched
Love Island and laughed at something Austin said. I don't
like Austin, but he can say something funny. It's entertainment,
and so I like with ninety Day Fiance, I try
to separate like the real happenings and I just watched
the show. I don't like pay attention to the gossip
of the fans and what everyone's arguing.
Speaker 1 (05:32):
I just try.
Speaker 2 (05:33):
I just put my faith in the narrative that I'm
presented by producers.
Speaker 1 (05:37):
Yes, I mean I have noticed, even beyond the fandoms,
especially on Bravo, some of the shows have gotten quite dark.
You know, I'm thinking about some of
Brynn's storyline from the last season of RHONY, which was
a mess. Some of the stuff, I don't know if
you watch Potomac, some of the stuff with Mia.
The Valley is quite dark. Like, I had to sort
(05:58):
of check out of it for a while because I
was like, well, I don't want to watch, like,
a family go through something that seems quite real.
Speaker 2 (06:06):
Even like Below Deck got really dark. Like, yeah, I
don't know what was up with Bravo, or I don't
know what was kind of going around the Bravo universe
at the time, but even Below Deck, I think
it was Below Deck Mediterranean, had this odd season where
like two people were kicked off of the crew because
they like assaulted other people on the crew, and everyone
was just like should we still be doing Below Deck?
(06:28):
And they kind of were like, we need to redo it,
like we just need to change everything, like we need
to make it fun again, like let's get some fun,
happy people in the crew.
Speaker 1 (06:36):
And I don't.
Speaker 2 (06:37):
I think obviously some of it is the pandemic, and you know,
but I think there was something about that moment and
people being more online where I feel like in reality
TV people.
Speaker 1 (06:49):
Share more now.
Speaker 2 (06:50):
I think there was some expectation before of like this
is fake or like we're making a storyline, and now
people are like, well, you know, during COVID, I was fine,
like having a live log of my life, So why
would I make things up? Like yeah, I'll share these
kind of deep personal things. And I guess it's great TV.
Like I do think reality TV now feels more authentic
(07:11):
in a way, especially the stuff like ninety Day Fiance.
But I guess that is encouraging people to go, yeah,
you share everything, so I should have complete access to
you. You should be on like you know, TikTok live all
the time whenever I want to see you.
Speaker 1 (07:23):
Yeah, I remember back in the day. I didn't watch
OC but like the stuff with Taylor Armstrong, there was
like darkness there. But looking back now, it's clear
the audience was not being told everything, and there
was still this attitude of like, oh, we need
to make this seem a little more polished than it
actually is. And I think today they
(07:45):
would handle that in a completely different way.
Speaker 2 (07:48):
Oh yeah, yeah. I will say, I do
give respect to reality TV producers. Maybe not
even respect, but I do think that they have put
more focus on like mental health and just trying to
think of, oh right, we're putting these people on TV
and how they appear could have consequences, and we're a
(08:11):
big part of how they appear and how those narratives
are made.
Speaker 1 (08:15):
I hate to keep going back.
Speaker 2 (08:16):
To Love Island, but it was you know, a lot
of what, like, Olandria faced, where the show kind of
at some points made her seem like a bully, her
and Chelley, and you had BuzzFeed, like, posting memes about,
like, punching Chelley in the face and stuff, and it's like,
you know, producers are feeding into
that and they had to like kind of change the narrative.
(08:36):
So I do think reality producers now are like, yeah,
let's handle things with care.
Speaker 1 (08:43):
I will say, I'm sure people listening are like, I'm
tuning in to hear tech takes, not reality TV takes.
But I do feel like being a longtime scholar of
reality TV, I think it helps me understand and explain
a lot about where we're at politically and socially. Trump
honestly got his start as a reality TV star. Like,
(09:05):
if you understand the medium of reality TV, if you
understand how villains work on reality TV, I think it
can be helpful to understanding our current political
Speaker 2 (09:13):
Moment, our current political moment, and moment in entertainment. I
think if you want to understand why kids are watching
Kick now instead of television, why people prefer to watch
a show on YouTube versus you know, traditional streaming, A
lot of that has to do with reality TV and
kids who've just grown up on that and they are
used to it, and not only to the point where
(09:34):
it's the entertainment they're accustomed to, but it's the entertainment
that they want to make and know how to make.
Like there are people that are so young, and I
engage with them on TikTok and I'm like, oh.
Speaker 1 (09:44):
My gosh, you would be perfect on Real
Speaker 2 (09:47):
Housewives. Like, you know how to do this,
you know how to craft a narrative out of like
your life or whatever is happening. And I think, you know,
as those kids grow up and start making media, I
think we're going to see them really just doing more
with, like, reality and blurring the line between that
and documentary. There is a documentary that just came out
called Thirst Trap on Paramount Plus. It's about this TikTok influencer
(10:12):
named, like, Whitey. He was like a white guy who
was very attractive, and he winked to some song by
Rod Stewart. I don't know, there's like a different internet
for white people that goes viral. I didn't know this guy,
but all of these like middle aged women became obsessed
with him and started sending him thousands and thousands of dollars,
and he basically dropped out of school to make this
(10:34):
his career and slowly realized like, oh, this attention economy
cannot last forever. It has these negative impacts on
my mental health and my psyche, and also it has
a horrible impact on these people who demand constant access
to me, and you know, start doxing his family when
(10:54):
he makes them mad because they don't like that he
had a girlfriend. And so I think, you know, if
you want to understand why it seems like people are
just so intense about everything now, I think reality TV
is the answer.
Speaker 1 (11:11):
I need to watch this. Your last recommendation
was a good one. I hadn't even heard of it.
What is it? Furry Police?
Speaker 2 (11:19):
No, yes, Unmasking a Monster? Yes, Furry Detectives.
Speaker 1 (11:23):
Everybody should be talking about that documentary.
Speaker 2 (11:26):
Why isn't everyone talking about it?
Speaker 1 (11:28):
I'm like, this is the wildest thing I've seen.
Speaker 2 (11:31):
Every person I introduced to it is like, first of all,
why did you do that?
Speaker 1 (11:34):
I have nightmares now?
Speaker 2 (11:36):
And it's so well made, it's so well done, and
it's a story that no one knows about. And you
think like, oh, we've gotten to the worst part. And
then they're like.
Speaker 1 (11:45):
So there's two more episodes and you're just.
Speaker 2 (11:47):
Like, how, how does it get worse than this? And
it does and then it does.
Speaker 1 (11:52):
Okay, So this is actually a great segue into what
I wanted to talk about first, which is everything happening
with CBS. I will give a little bit of backstory
to sort of set it up. So a few months ago,
CBS settled a lawsuit with Trump over an allegedly deceptively
edited interview of then presidential candidate Kamala Harris. So we
(12:12):
know Trump has, we'll say, a specific
kind of relationship to the truth. But this is what
the suit alleged: that a Sixty Minutes
interview with Harris was deceptively edited to portray her in
a more favorable light. So specifically, the claim was that
there were these different versions of Harris's response to a
question that were used in different broadcasts, like a shorter, clearer,
(12:36):
more favorable version on Face the Nation, and then a more
kind of rambling, less polished, full version that was in
the Sixty Minutes broadcast. I have heard both versions, and
it is incredibly clear to me that the version in
question that he is upset about is like just edited
for clarity, like they've edited it for time because it's
a shorter segment. It's like very very clear to me
(12:57):
that's what's going on. So, you're a podcaster, you work
in media, you know this is pretty cut and dry. Like,
the podcast that you're listening to right now
is not raw audio. It is edited for clarity. I
don't want to burst anybody's bubble,
but it's media, it's edited. Yeah, you don't want to
hear us going uh uh uh. You don't need that,
(13:19):
so they cut it out.
Speaker 2 (13:21):
I don't really understand why he's upset because I feel
the same. It's not like the message becomes different.
I don't think she looks much better in the
other one. It's just quicker. So yeah, I didn't get
it. I don't see the big deal.
Speaker 1 (13:40):
You and me both. But Trump's argument was essentially that CBS editing these interviews
was election interference. He sued CBS. CBS settled, and CBS's
parent company agreed to donate sixteen million dollars, not to
Trump himself, but to his sort of presidential library fund.
Out of this came an agreement that CBS and Sixty
(14:01):
Minutes in the future will release transcripts of interviews with
US presidential candidates after they air, with allowances for redactions
for, like, legal or national security concerns. Typically, I do
feel like a company will only settle like this if
they feel like they might lose. But I think that
with CBS, honestly, I think that they
were like, you know, this is a way to curry
favor with the new administration. I think from the perspective
(14:24):
of a CBS executive, it's like, oh, a win-win,
we get to avoid what could be a nasty
fight with a vengeful administration, and then we can hand
that administration, like, a PR win. Kiss the ring, easy peasy, right?
Speaker 2 (14:36):
Yeah, they already sacrificed Stephen Colbert at his altar. There's
already talk of Paramount wanting to buy Warner Brothers Discovery.
Just today, yeah, just today
that came out, that this is the next move. They're
just buying it all up to make it cable again.
(14:57):
So I think, them knowing that it's going to get pushback,
that it would just be such a large conglomerate, I think
they're trying to do whatever they can to appease this
administration so they can, I don't know, buy Peacock and
every single streaming platform. I don't know why
they want to do that, but yeah, I don't know
Speaker 1 (15:16):
What the endgame is either, but I will say like
they really did a lot, I think to curry favor
with the Trump administration. They appointed a bias monitor, and
I think that just goes to show like there is
no end to like when you give an inch, they
take many many miles, there is no end to what
they take. And so CBS already has all of this
(15:38):
unprecedented oversight from the federal government to make sure that
they're not being, I don't know, biased against Trump or
mean to Trump. So it is against this backdrop, enter
the United States Secretary of Homeland Security, Kristi Noem. She
did an interview with CBS, and she did not
like how that interview was edited after the fact. So
(15:58):
after the interview, she tweeted, this morning, I joined CBS
to report the facts of Kilmar Abrego Garcia. Instead, CBS shamefully
edited the interview to whitewash the truth about this MS-13
gang member and the threat he poses to American
public safety. Now, mind you, when she tweeted this, CBS
had already posted both the full transcript and the full
(16:21):
recording of that interview. So it is sort of like, well,
I don't know, it just seems like, yeah,
what else do you want at this point?
Speaker 2 (16:28):
They're never going
Speaker 1 (16:29):
To be happy. It's just they'll never be happy.
Speaker 2 (16:32):
If they can find something to pick at to say
I'm being victimized, they will. That's what they want, is
that ability to say you're targeting us.
Speaker 1 (16:42):
It's unfair for us.
Speaker 2 (16:44):
So even if you know these networks do everything they want,
they're still going to turn around and be like, oh, yeah,
well, that South Park episode was rude, and so actually, no,
we're not going to approve this.
Speaker 1 (16:55):
You know, they're never going to be.
Speaker 2 (16:56):
On your side.
Speaker 1 (16:57):
You can't truly buy them off. Yes, exactly. And again,
as we were talking about earlier, it is just so
common for networks to do this with lengthy interviews. It
is not unusual, it's totally commonplace. And again, the
substance of the complaint doesn't even really stand because it's like, Okay, well,
somebody could just go watch the full interview. Honestly, when
(17:18):
you look at what they chose to edit. So in
the interview that upset her so much, she's talking about
Kilmar Abrego Garcia. Kilmar Abrego Garcia, of course, was the man
who was sent to immigration prison in El Salvador because
of what was widely reported to be an administrative error.
She accused him of, essentially, sex crimes against a child.
He has not been charged with this, and as far
(17:38):
as I know, there was no evidence of this, besides
Noem saying the Department of Justice is, like, looking into it.
So CBS, I think, completely correctly made the
call. They're like, oh, we can't just
Speaker 2 (17:49):
air defamatory claims live. Yeah, like, we don't want to get
sued for libel like that is real.
Speaker 1 (17:58):
Yeah, And so I don't blame CBS for this at all.
And to be honest with you, I feel like they're
almost doing Kristi a favor, because when you say something
that is wild and also potentially legally actionable, editing that
out is a favor. I almost thought they did her
a favor by not allowing her most salacious, baseless, inappropriate
(18:21):
accusation to just stand. In that way, they almost are trying
to make her look, I think, if anything, more sane.
Speaker 2 (18:28):
Yeah, And that's the thing is that she doesn't want
to look sane. She wants that clip that's gonna go
viral and like get people angry, So that's what it is.
At the end of the day, it doesn't matter if
it's a lie or not true or unfounded. She dropped
that because she knew, Oh, this is what people are
going to freak out over and say, like, oh, what is
she saying, she's making things up, and that's what she wanted,
(18:48):
and she didn't get that moment, so she had to
turn it around and say, you know, CBS is editing
me and whitewashing what I had to say.
Speaker 1 (18:56):
Exactly, and I guess I don't feel like journalists are
required to repeat an authoritarian administration's baseless smears, and at
the end of the day, like journalists do have an
ethical obligation to not just let anybody say anything on
their platform. However, because CBS is run by people who
are really trying to curry favor with this administration, they
(19:19):
announced that they are no longer doing any editing of
interviews like this. So just days after these complaints, CBS
News said that they would no longer allow editing of
its guests' words on the Sunday morning public affairs show.
So going forward, they're only going to broadcast live
or live-to-tape, meaning that guest statements could not
be edited, subject to legal or national security restrictions. CBS says
(19:40):
this change was made in response to audience feedback,
which I guess means, like, an audience of
Speaker 2 (19:44):
One, you know, you know, Trump and Kristi Noem just,
like, saying it to them. I do like this, honestly, because
it feels a bit sarcastic and sassy on CBS's part,
because I do think this is going to come back
to bite a politician in the ass. They're
gonna be like, oh, right, there was a reason why
we wanted things edited, and we wanted to have a
(20:06):
second to collect our thoughts.
Speaker 1 (20:09):
Uh.
Speaker 2 (20:09):
So that's what I'm excited for. Someone is gonna say
something really dumb.
Speaker 1 (20:13):
This is a low stakes podcast, and it is lightly
edited for clarity. If you're talking about important global politics
stuff with big implications, you're stressed out. If you're in
this administration, maybe you've had a couple drinks, you know
what I'm saying, and the lights are on you. We're
about to get some gaffes. That's a great point. Yeah, yeah,
it's gonna be good. So okay, I wanted to include
(20:34):
this story because there's another sort of substory in this
that's my own little personal thing. So I have
a little bit of a bone to pick with CBS, obviously,
and I live in DC. We've been dealing with the
entire you know, takeover of our police force and surge
of the National Guard in my city. President Trump was
trying to make a big show of how safe DC
(20:56):
is and how he's finally cleaned up the city, yada yada.
So he was making a big show of going to
dinner at a seafood restaurant downtown. When he got there,
the activist group Code Pink had people in the restaurant
to protest. They screamed, they called him Hitler. It was
a great clip. However, when CBS News reported on it,
(21:16):
their headline was, quote, when President Trump ate at a Washington
restaurant to promote his federal law enforcement surge of the
nation's capital, he was greeted by protesters inside but cheers outside.
So I'm gonna play you a clip, Ashley, and this
is a clip, I'll put it in the show notes.
It is Trump walking into the restaurant. It is very
(21:38):
clearly outside. I'm gonna play you the clip, and you
tell me what you hear. Yeah, so that is the
(22:00):
dynamic that CBS News described as cheers from outside. I
don't know about you, Ashley. Would you describe that
as cheers?
Speaker 2 (22:09):
I know, I gotta give that one to the boos.
Speaker 1 (22:11):
The boos won that one.
Speaker 2 (22:13):
Uh, and people swearing. There were some woos, but they
were drowned out by the boos.
Speaker 1 (22:19):
Yeah, I feel like a more honest way to say
it was cheers and jeers or something. But to say it
that way, it clearly sort of suggests that the
people outside on the street love him, and there were
echoes of cheers. But those activists inside,
they're always complaining about something.
Speaker 2 (22:38):
Yeah, you know, they're just people. They're always making a
ruckus. But everybody outside was like, yeah, we
love Trump. So yeah, yeah. I'm just like, CBS, what
are you doing?
Speaker 1 (22:48):
What are you doing? Get it together?
Speaker 3 (22:51):
Yeah, let's take a quick break. And we're back.
Speaker 1 (23:09):
Okay. So I also wanted to talk about kind of
a darker story, which is, well, dark, but I
guess it is a story about justice being served, so
I'll let y'all be the judge. So this website, this
porn website, Girls Do Porn: the creator of this site
has sort of been, he was on the lam for
a while, he was on the FBI's most wanted list,
(23:31):
and this week he was sentenced. You might have
seen a documentary about it. Yeah, you're gonna say, yeah,
I've definitely seen this documentary, so you know all
about it. Michael James Pratt, the creator of the California
based porn website Girls Do Porn, was sentenced to twenty
seven years in federal prison for sex trafficking after pleading
guilty to using force, fraud, and coercion to recruit hundreds
(23:54):
of women, many of whom were in their late teens,
for adult videos. He was sentenced on one charge of
sex trafficking by force, fraud, or coercion, and one
count of conspiracy to commit the same crime. I'll
also say, the documentary is quite good. It does really make
clear how long of a thing this has been.
(24:15):
that documentary came out years ago, the site was online
even earlier than that, and only this week was he
actually sentenced.
Speaker 2 (24:22):
Yeah, I did not know that there was more happening
with it, Like it feels like it was so long ago.
I was like, oh, right, how did that documentary end? Like,
did anything actually happen to him?
Speaker 1 (24:33):
I guess not until now. Yeah, so it's been kind
of a long time coming. Pratt was on the FBI's
ten most wanted list when he was arrested in Madrid
in twenty twenty two, three years after he fled while
awaiting sex trafficking charges. The San Diego Union Tribune reports
that Pratt, a forty two year old New Zealand citizen,
admitted in a plea agreement earlier this year that
between twenty twelve and twenty nineteen he conspired to traffic
(24:56):
fifteen victims. The authorities have said that is a tiny
fraction of the actual victims of the conspiracy. It
is a dizzying array of crimes. Basically, the site
just sounds so awful and scammy. Basically, these are not
women who consensually signed up
to be involved in making adult content. This was
(25:18):
back when I guess you might call it casting couch
porn was like a growing thing online, and so he
had this entire very gross, not to mention illegal scheme
worked up where he and his team would post what
seemed like legitimate modeling ads on Craigslist that women looking
for modeling work would apply to. Then they would pay
(25:40):
other women to kind of act as references to vouch
for this company and say, oh, it's a legitimate modeling company.
I had a good experience with them. You can trust them.
So when I was in college, I did a little
bit of modeling on the side for extra cash. And
I will say, it is a space rife with
all kinds of gross stuff. But, well, it's
(26:01):
not a great space, I'll say. And
when you're wading through it, it's like scam, scam, scam,
scam, scam.
Speaker 4 (26:07):
Yeah.
Speaker 2 (26:07):
My brother is a model and he lives in Texas.
One time he was like, hey, I'm gonna be in
LA, I'm shooting, I'm a model.
Speaker 1 (26:14):
I'm gonna do this thing with an agency. I was like,
what are you talking about? He's like, yeah, it's the underwear thing.
I'll be in a warehouse.
Speaker 2 (26:19):
And I was like, oh no, no, no, no, no, baby,
this men's ad?
Speaker 1 (26:25):
It was legit. It was okay, it was legit.
Speaker 2 (26:28):
But I was like, no, I'm coming with you, I
will be checking this out. Good.
Speaker 1 (26:32):
Yeah, that was always my thing. I would always show
up with a male friend, or you know,
that was always the way that I played it. But
so even for a space that is rife with those
kinds of gross things, what is described here I have
never encountered and had not heard of. So these
women would, you know, think like, okay, this is
a legitimate modeling gig. They would show up and
(26:54):
then they would be asked to do sex acts on camera,
but often would be plied with drugs or alcohol, made
to rush through a contract that they were unable to read,
or even verbally promised that the videos would not be
published in the US, only abroad, or would only
be sold to, like, private buyers, so they wouldn't be
widely distributed. So they would knowingly lie to these women
(27:15):
with the full understanding that these videos were going to
be distributed widely and essentially blow up their lives. I
found this to be really interesting. According to people who
were in on this scheme, fifty percent of the women
were not even paid the amount they were promised. So
after all of that, you know, you're already kind of
being forced into doing something that you don't want to
(27:37):
do via lies. They're like, oh, we're going to give
you two thousand dollars and they didn't even get that.
So I don't need to tell anybody,
all of this is not only just, like, scammy,
it is also illegal. And the sex acts that they
would do once the women, you know, agreed to do
this were often violent and sometimes, according to the Department
of Justice, criminal. Right, it was assault. It
(27:57):
was, like, very violent. The San Diego Union-Tribune has
some really in depth reporting about the sentencing and how
Pratt was made to listen to his victims and like
face them for the first time. A lot of these
women have been trying to communicate with him for years,
saying take this down, take this down, and he would
just ignore them and eventually skip the country. So him
(28:19):
having to actually listen to what they had to say
for the first time I think was really resonant. Their
testimony is horrifying. They talk about how after the footage
was released online, their personal contact information would also be released,
so they would face harassment. They would, you know, face consequences.
People would send videos to their friends and family, they'd
be fired from jobs, kicked out of school, lots of
(28:41):
addiction issues, lots of suicidal ideation. So when these victims
faced him in court and spoke to him, you really
kind of hear how important of a moment it was
for them. A couple of things that they said. One
victim said, we meet again, but this time it is
you who cannot leave. Another said, we are not here
for forgiveness, we are here for justice. And one
(29:02):
woman told Pratt, turning to speak directly to him, I
am not your victim. I am your reckoning. So these
were women who were like, I have waited so long.
I definitely think people should read the article because
they talk about how these women almost formed a sisterhood,
because they had to band together
to take this guy down.
Speaker 2 (29:20):
Yeah, so even to get any of this
to happen, to get anyone in law enforcement to pay
attention and to take it seriously.
Speaker 1 (29:27):
It's so sad how long this took. And I found
this bit to be really interesting from the piece. One
victim told the judge that Pratt provided her a cake
on the day that she shot her video because it
was her eighteenth birthday. Another woman, identified in the plea agreement
as victim number one, was also eighteen years old when
Pratt quote rushed her through a contract and did not
provide her with a copy. He paid her two thousand
(29:48):
dollars and then ignored her pleas to take the video
down when it was posted online. That woman said in court
that she has since graduated from Princeton University and now
works in the tech industry and has become a specialist
in helping people send takedown notices to websites.
Speaker 2 (30:03):
You know what, I follow her.
Speaker 1 (30:05):
I follow her. Really, she's very cool. She's very cool.
Speaker 2 (30:10):
Because, like, misogynists are always like, you know,
some rich guy probably paid for you to go to Princeton,
like you're a sugar baby. And she's like, uh, no,
actually I had a full-ride scholarship and I do this.
And she's just such a badass who is not afraid
to shoot back at these people who, you know, think
that they can control women with online manipulation.
Speaker 1 (30:33):
Yes, And I think your point is such a good
one because I think for so long the implication was
that these women don't deserve respect, that they did it,
they did it to themselves. Exactly. When we actually look at
the charges and what the government says
he did, they didn't do anything to themselves. They showed up
thinking there was going to be a modeling gig and
(30:53):
they were clearly lied to and manipulated. And so
I don't think it's fair to be like, oh, well,
you brought this on yourself, you didn't know this was going
to happen, whatever, whatever, whatever. And instead it's, well, if
I brought it on myself, he wouldn't have had to
lie to me to get me to do it. He
wouldn't have had to manipulate me to get me to agree to.
Speaker 2 (31:07):
It, to get me to agree.
Speaker 1 (31:08):
It's coercion, exactly. And I remember kind of a
while ago, I think it was two young women who
were contestants in the Miss Teen USA pageant.
It was revealed that they were both on this
site and they had to drop out of the pageant,
and I remember thinking, you know, all these years later,
clearly they were not the ones who did something wrong.
Speaker 3 (31:29):
It.
Speaker 1 (31:29):
Yeah, where was the smoke back then?
I know we had a very different culture for women then,
but where was the smoke for the person who was
breaking the law and trafficking women to enrich himself by
victimizing and manipulating them.
Speaker 2 (31:44):
Yeah, that just at the time was not part of
the narrative. Nobody thought about, like, you know, revenge porn
laws, or, oh, the guy is doing something wrong here too.
It was just, oh, it's her fault, you know,
it's her own fault. If she didn't want this, she shouldn't
have taken nudes, she shouldn't have done this. And now
at least kind of people are like, oh, you're committing
(32:05):
a criminal act like this.
Speaker 1 (32:07):
It's just so intrusive.
Speaker 2 (32:10):
I just think it's
one of the worst things that can happen to someone,
even if it's someone I don't like. You know,
like Kendall from Love Island, his nudes were leaked by
someone and you just felt awful for him. Like, it
just is such a horrible invasion of privacy and to
see someone not only do that because they want to
(32:32):
humiliate these women, but also because they're making so much
money off of it, and they know they can get
away with it because there are other horrible.
Speaker 1 (32:38):
Men who want this content. Yeah, it just makes me
so mad. Yeah, you put that really well. I mean,
it's something I think about a lot, and I
know that you remember this, but in the
twenty tens, the iCloud photo hacks, where all of this
celebrity intimate content got hacked and put online, and you know,
(33:00):
I remember so clearly the conversation that we had about
that when it happened was exactly what you just articulated, that, oh,
they shouldn't have taken the pictures. I remember it
was the tech columnist at the New York Times
whose advice to starlets was: don't take nude pictures. And
here we are, you know, ten years later, and we
really see how none of that met the moment because
(33:22):
now we have AI enabled deep fakes, where all you
have to do is exist. You don't do anything. You
could just be existing, like, fully clothed,
and you don't actually have to have done that part.
Speaker 2 (33:34):
You can just have a personality, and then Meta will
take you and turn you into a sexual chatbot,
exactly as they did with Taylor Swift. And it's just
I don't know, I think, you know, years from now,
like as we've had this kind of reckoning with revenge porn,
I think eventually we're going to be like, oh, right,
that was a horrible invasion of privacy that we were
(33:56):
just taking people's photos and making AI images and voices.
Speaker 1 (33:59):
Out of them. I think it was some brand just
recently got in trouble.
Speaker 2 (34:04):
I think it was maybe Fashion Nova, but they, like,
clearly were using AI to make their model images. And
then the AI pulled in Luigi Mangione's face. Yes, it
was Shein. Yeah, Shein, yeah. And everyone's like, yeah, yeah,
that's a problem.
Speaker 3 (34:20):
Yeah.
Speaker 1 (34:21):
And I wish I could rewind back to those moments
like the iCloud photo hacks and force us to have
a different conversation about digital consent, about consent more generally,
about not just putting the onus on the victim,
or on people, to not do things in order to protect themselves.
(34:41):
I wish we could have had a different conversation because
I feel like where we are today in twenty twenty
five might be different. And it's a shame that we
had all of these big moments where we could have
had that conversation and worked to create a different culture,
and we didn't.
Speaker 2 (34:53):
Yeah, and now we're still dealing with it.
Speaker 1 (34:56):
And you mentioned a good point of how, you
know, these videos from Girls Do Porn, they were
distributed to other porn sites like bigger porn sites and
tube sites like Pornhub. According to the Daily Dot, Girls
Do Porn videos were viewed over eight hundred million times
on these websites, including roughly six hundred and eighty million
(35:17):
views on Pornhub, where Girls du Porn was among the
top twenty most viewed channels. And it just sounds like
even after Pornhub officially cut ties with the Girls Do
Porn channel in twenty nineteen, it just sounds like they
were very, very slow to take these videos down. It
took, like, reporting. I think Motherboard wrote a
(35:38):
big piece about it, like it took a lot of
cajoling to just simply take down these videos that I
think it sounds like Pornhub knew were made under very
sketchy circumstances. Yeah, and they're making money off of it,
you know.
Speaker 2 (35:52):
I think they also have faced some lawsuits because they,
you know, had videos up that were people who were
under age or trafficked, and they get those reports and
just kind of were ignoring them until it was like, oh,
we have to take it down like the cops are
getting involved, Like okay.
Speaker 1 (36:07):
So that's exactly why this whole Girls Do Porn thing
currently is also implicating Pornhub. In twenty twenty three,
Pornhub's parent company agreed to pay more than one point
eight million dollars to resolve a criminal probe alleging that
it profited from sex trafficking through its hosting of Girls
Do Porn videos. More than one hundred and twenty women
have sued Pornhub's parent company in San Diego
(36:28):
federal court, alleging Pornhub illegally published sex trafficking videos.
Pornhub's parent company settled the first of those suits under
terms that were not disclosed. So exactly what you just
said that you know, this wasn't just one small site
that was profiting off of this. They were profiting off
of it, but then those videos were then enriching like
(36:50):
much bigger porn sites like Pornhub, and so yeah, it's
just a scheme where mostly men who run these sites
get rich off of the manipulation of women, and then
it's the women who also face this added burden of
public scorn, public scrutiny. Why did you take the pictures,
getting kicked out of school, getting harassed online, all of that.
Speaker 2 (37:12):
Yeah, and then you know, like it reaches the Andrew
Tate level where then they have the women who are
making these videos, and then they use that to blackmail
them into making more content, and then they blackmail the
women into scamming other men so that they can get
more money.
Speaker 1 (37:28):
And yeah, yeah, just a whole pyramid scheme built on
manipulation and sex crimes. More after a quick break, let's
(37:49):
get right back into it. Okay, So I have one
other story I wanted to talk to you about, which
I feel like as a podcaster, as somebody who cares
about media, I know I have a lot of thoughts about,
but you might as well. Did you hear about this
AI-based podcast startup? Yes, yeah, I heard. Okay. So,
(38:10):
as a podcaster, I am often asked if I'm worried
about AI, specifically AI taking the jobs of human podcasters
like you and me, Ashley, and so I'm asked this
so much that I've actually turned the answer to this
question into one of my signature public talks about the
intersection of creativity and technology. If you follow the conversation
around AI, oftentimes that conversation is like, Oh, what's going
(38:33):
to happen to human creative professionals, human podcasters, human screenwriters,
human artists, animators, musicians, whatever. Anyway, so I spend
a lot of time thinking about this, talking about this.
So I was not terribly shocked to read that Hollywood
Reporter piece about Inception Point AI, this company that is
trying to build a stable of AI talent to host
podcasts and eventually become broader influencers across social media, literature,
(38:57):
and more. The Hollywood Reporter reports it's amid the high
cost of producing narrative podcasts and pricey short-term
contracts for popular hosts. The idea here is being able
to own, scale, and control the talent, unlike those off
the cuff humans, and produce shows at a minimal cost.
We believe that in the near future, half the people
on the planet will be AI, and we're the company
(39:19):
that's bringing those people to life. This is from CEO
Jeanine Wright, who previously was the chief operating officer of
podcasting at Wondery, which has recently had to reorganize
amid the changing podcast landscape following a series of questionable
business decisions.
Speaker 2 (39:33):
Decisions and layoffs and yeah, all of it, all of it.
It's making a lot of sense now. I don't
know who hears the statistic, like, we think fifty percent
of the people will be AI, and gets excited about that.
Like, what are you excited about? Like, dead Internet? You want
to be chatting with bots? That alone is something
(39:54):
I just don't get, you know. Apparently they're
already getting like ten million downloads a month or something,
and I truly just believe it's other robots listening to it,
that there are just bot farms that are just
listening to the AI, and we're just creating a whole
other ecosystem where AI is just talking and listening to AI.
Speaker 1 (40:15):
Because who wants that?
Speaker 2 (40:17):
Like you listen to your favorite comedians probably you know
people who are experts.
Speaker 1 (40:21):
Who needs that?
Speaker 2 (40:24):
And I think anytime a place has tried the online
influencer thing, it just hasn't worked. Lil Miquela is an
example of that just really failing, because at the end
of the day, people want an influencer who is a
real human being.
Speaker 1 (40:39):
Yeah. I remember when she had a scandal where it
was like, oh, I've been sexually harassed, and it's like,
you're not a real person we need to talk
Speaker 2 (40:49):
about. Like, yep. And then they did it where
they, like, had her get cancer, and then
like she disappeared and people just kind of forgot.
Speaker 1 (40:57):
So yeah, it is wild, the stuff that people think
people want, you know. And so you were talking about
some of the numbers behind this AI podcasting company. So
the company says that it is able to produce each
episode for one dollar or less, depending on length and complexity,
and then attach programmatic advertising to it. According to them,
(41:18):
this means that if twenty people listen to that episode,
the company has made a profit on the episode without
factoring in overhead. So they also said that they have
five thousand shows already across its Quiet Please podcast network,
and they are able to produce three thousand episodes a week.
You were right. Collectively, the network says that they have
seen ten million downloads since September twenty twenty three.
(41:39):
It takes about an hour to create an episode from
coming up with the idea to getting it out to
the world, and they have about fifty AI personalities that
they've created, including food expert Claire Delish, gardening and nature
expert Nigel Thistledown, and Alie Bennett, who covers offbeat sports. Obviously,
none of those people are real. Claire Delish is not
a real cooking expert. I don't know if I
(42:01):
need to say that.
Speaker 2 (42:01):
Nigel Thistledown, that's a name from Bridgerton. That
is a name from Bridgerton.
Speaker 1 (42:08):
So you might be thinking, who in the hell wants
to listen to this? If you're thinking that, I am
sorry to tell you you're an idiot who is also lazy
and you don't know what you want. That is from
the founder, who said, quote, I think that people who
are still referring to AI generated content as AI slop
are probably lazy Luddites because there's a lot of good
stuff out there. So you don't know. If you're thinking
(42:29):
I don't want to listen to that, you're a moron
who is also lazy.
Speaker 2 (42:32):
According to the CEO, if you don't want to listen
to a robot that's making things up from scanning Google,
you're lazy.
Speaker 1 (42:41):
Yes. So I saw this headline everywhere and there were
a couple of pieces that I just want to make
sure it get included to like one. It sounds like
right now they're focused on making podcasts out of just
basic information that you could find anywhere online, but you
might want it in podcast form. We might make a
pollen podcast that eighty fifty people listen to, but I'm
(43:02):
already at unit profitability on that, so maybe I can
make five hundred pollen report podcasts, she said. So it
sounds like what they're saying is that right now, like
you might want to get the weather report in a
hyper personalized podcast. So I can understand that as a
completely different use case from the kind of podcast that
(43:23):
you and I make that take research and human ingenuity
and human compassion and human connection. I guess if it.
Speaker 2 (43:30):
Was like a robot that like went through my emails
and messages and then it was just a podcast where
it's like telling me about things I need to know personally,
but that's just like Siri, I guess.
Speaker 1 (43:40):
Like, yeah. And I guess, I mean, I almost feel
like if they were going to do
that, I would want that. But that's not a podcast,
you know. When I listen to my voicemails that somebody
has left just for me, I wouldn't call that a podcast.
Like, words have meanings, and podcast means something. And yeah,
I do think a big part of podcasting is the
(44:03):
kind of community aspect of it. You know, when I
listened to the episode of
your podcast with Gibson Johns, I scrolled down to the
Spotify comments because I wanted to see what people were
saying about it. Right, part of it is talking about
what other listeners are thinking about it and
what their response is to it. I don't see how
a hyper-personalized podcast that is only listened to by
(44:25):
a handful of people in the country, is going to
provide that.
Speaker 2 (44:27):
Yeah, you're not creating a community, and that's what people
want when they really get into podcasts. They
love the community, like you said, the friends. And
I think it's also seeing a podcaster grow, right? Like,
if you look at
people who loved a podcast like Cum Town, and now
you have, like, Adam Friedland and he's, you know, doing
(44:48):
these huge YouTube interviews with, like, big people. Stavros
is blowing up everywhere. I love him.
Speaker 1 (44:56):
I have such a crush on him.
Speaker 2 (44:59):
He's so, so funny, and I love seeing
him blow up and it's like, oh my gosh, I
remember when he was like on this little tiny podcast,
like making weird jokes with his friends. So you don't
get that obviously with AI robot podcasts.
Speaker 1 (45:14):
Yeah, in the article, the people who run this company
were saying, oh, we're not gonna have it so
that people, like, we intentionally don't want people to create
connections with our AI hosts, so we're not
coming up with backstories for them. And I was
sort of thinking, that is sort of the fun of
listening to a podcast. I'm sure there are people who
don't care about what I have to say but have
(45:35):
been listening to me for a while and it's like, oh,
I just want to hear her
thoughts on stuff. And I didn't even watch that movie
she talked about. Like, half of the stuff that you
all talked about in that episode, I don't watch it.
I don't watch TLC. Not to say that I'm
better than you, 'cause I watch Bravo. No, you are,
you are, But you know, like, I just enjoy your
voice and I want to hear what you have to say,
and I don't think that that can be replicated with AI.
(45:58):
That human aspect of it is integral
to the why we make anything.
Speaker 2 (46:04):
Yeah, and those, the background stories of who someone is.
My absolute favorite podcast moment is Caleb Hearon. I'm like,
I feel like everyone knows this: him asking a guy
in his class what cologne he's using, and he, like,
does the whole story of just him hounding this
kid Cooper.
Speaker 1 (46:22):
What cologne is it, then?
Speaker 2 (46:23):
And it's so funny and it's just like one of
those things that's the story from his life that just
came up in conversation.
Speaker 4 (46:30):
I said, Cooper, what cologne do you wear? And he said, oh,
I don't, it's just, like, I don't know. My mom
gets it for me. And I go, oh yeah, but
like, what is it? He goes, I really don't know. My
mom gets it for me. And during this whole assembly,
I just keep every time I get a chance to
being like Cooper, do you even know like what the
bottle is?
Speaker 3 (46:43):
You know?
Speaker 4 (46:45):
But I'm trying to play it cool, so I'm like, yeah,
I always wondered what the shape of the bottle was, and he's like, oh, yeah,
I don't remember.
Speaker 3 (46:52):
Man.
Speaker 4 (46:52):
Literally, I'm being so persistent, because finally I go,
I need to know what cologne you wear, and he goes,
if I tell you, will you just stop talking to me?
Speaker 1 (47:00):
And I go, can I go?
Speaker 4 (47:01):
And I go, yeah. It was like cinematic the way
it played. I'd be like, talking to the boys
and cutting up, and then there was a moment's silence.
I'd be like, anyway, Coop. I'm having to repeat, Cooper, Cooper,
like I was saying. No, you guys are so funny. Cooper,
what color is the box?
Speaker 1 (47:20):
It comes up?
Speaker 2 (47:22):
Sorry, you're not getting that from an AI robot.
Speaker 1 (47:23):
They can't do it. And that's the best part of podcasting,
those little moments where you could never have scripted it
if you tried. You could never have
planned it out if you tried. Yeah,
that's what you tune into podcasts for. You never
know what's gonna happen when you put on the mic.
And if we give that over to AI,
(47:44):
and not just AI, but a certain kind of, like,
tech company suit... I don't know. I just, I thought,
reading this Hollywood Reporter piece, everything she said was gross. Yes,
And there was a time where you would be embarrassed
to have your name attached to something like this, and
so I don't like the idea that it's not just
the AI, it's people who talk about creativity in this
(48:06):
particular kind of way, like oh, well, we're already getting
ten million downloads and.
Speaker 2 (48:11):
Da da da dah, and yeah it's hitting my KPIs
and the ROI and it's crazy, like, oh, I.
Speaker 1 (48:17):
Hate the idea that the people in charge of things
sometimes hate the thing that they're in charge of.
You don't even love it, you don't even
like it.
Speaker 2 (48:28):
Yeah, it's just a job for them. Like people
who make TV now. So many industries are just
run by people who are able to grow a profit, and that's
why they have their job.
Speaker 1 (48:38):
I'm sure the answer is yes, but do you watch
The Studio? Oh yeah, I love The Studio. Me too.
I think I watched it because of your recommendation.
It was so good. I feel like, that show, if
you're not watching The Studio on Apple TV... I
just finished it and I loved it. But I think
that show gets it right, the difference between
somebody who's just, I love movies, I've always wanted to
(48:59):
work in movies, you know? And then when he's
taking a pee at the Oscars next to the head
of Netflix.
Speaker 2 (49:05):
Yeah, just, Ted Sarandos has said many times, like,
I am not a movie person, I'm a tech person.
Speaker 1 (49:10):
Yeah, and that's where the industry is now. I guess
it just makes me very sad. And I will say, like,
as somebody who knows quite a bit about tech and
the people who make it and how they talk about
their own technology, reading that Hollywood Reporter piece,
a couple of things stood out to me. One,
the fact that clearly humans are still required to make
(49:31):
this thing exist. You know, the episodes are built by AI,
but it's humans who are doing everything else. And so
the idea, I think, that we are being sold, this
idea that you can have a soup-to-nuts podcast
with no human interference, the article does not support that.
It's like, oh no, it's humans at every step of
the way, and those humans are currently unpaid. The startup
(49:53):
is currently bootstrapped and employees are not yet salaried, but
the company will soon seek outside funding. So I
feel like the big headline that I think is being missed
in this whole operation is that it's only profitable because people
are not being paid, are not making money.
Speaker 2 (50:08):
Yeah, because there are people involved and they're not being
paid, and so it's profitable.
Speaker 1 (50:12):
Yeah, it's the whole AI boom.
Speaker 2 (50:14):
That's what it all feels like, like a bubble that's going
to burst, because it's something where, like, when ChatGPT
was really just like a bunch of people in India
who would, like, Google things really quickly for you, and
then as soon as the, you know, curtain lifts, I'm
hoping that the whole bubble just pops.
Speaker 1 (50:31):
We have been sort of talking about it a bit.
I think it's coming. I think people are less interested
in AI in their commercial products, people are using AI less.
I just think we are starting to see the writing
on the wall that, yeah, maybe this technology is
not all it's cracked up to be. And I think
that the article about this AI podcast scheme, to me,
(50:56):
the whole thing read like a please-fund-us
advertisement. Like, oh, you know, we're seeking funding and all that.
Speaker 2 (51:02):
You know, making it all seem like it's so lucrative
and so profitable, and you, listener, you want to get
in on the ground floor of this. And I think
that's just someone who knows this is disgusting and who
has probably gotten no traction among like actual podcasters or
in the industry.
Speaker 1 (51:17):
Yeah, and you know, talking to somebody who writes
phenomenally well about culture and television and art and media:
a robot can never replace you, Ashley. A robot could never have the kind
of nuanced, layered opinions that you have about media.
Like, you approach everything as such a rich text.
(51:40):
It doesn't matter if it's, you know, an indie film,
an A24 film or something,
or Ninety Day Fiance, you bring such a, like, nuance to it,
and a robot couldn't do that. I wouldn't even want to
listen to it try. Yeah, exactly. You know who wants that?
Speaker 2 (51:57):
We want to hear the people we love, want to
hear their brains think of weird things.
Speaker 1 (52:02):
That's the fun of it. That is the fun of it. Actually,
I have to put you on the spot. You ended
your episode with like rapid fire TV homework. I gotta
watch the show Smoke, which I have not watched, but
the way that you described it made me want to
watch it. What are you watching right now that you
think people should know about.
Speaker 2 (52:18):
Oh oh, there's so many good ones right now, but
I'll keep it techy for you. There's a new show
that just debuted. It's called The Tech Bro Murders. Each
episode looks at a murder in the Bay Area, basically,
(52:38):
and it's usually someone who's like, oh, they worked at
Facebook and then they lost their mind.
Speaker 1 (52:43):
So far, there are, I think, a few episodes
out right now. It's a pretty good docuseries.
Speaker 2 (52:50):
It's a docuseries that's out weekly, and it's, yeah,
it's like, wow, drinking Red Bull all day and
not sleeping isn't good for your mental health and will
make you do wild things.
Speaker 1 (53:03):
Who knew? Okay, that's going on my list. Yeah, where
can folks follow you, listen to your podcast? Well, it's
on Courier. And also, I don't want to embarrass you,
but I went to the Courier, I guess, like, a
launch event in DC, and I chatted with your boss
and he had such glowing things to say about you,
Like he was like, it's such a brilliant podcast and
(53:24):
we're so lucky to get it. And I was like,
you sure are, you better say that. I was supposed to
be at that and I couldn't. I had to stay back
and help my mom with some stuff. But I was like, oh,
I want.
Speaker 2 (53:33):
To be in DC and like, but they're amazing there.
You can listen to my podcast wherever you do podcasts.
It's also on YouTube on Courier's channel, or at YouTube
dot com at TV I Say, and you can follow
me at The Ashley Ray everywhere.
Speaker 1 (53:48):
And you can follow me at Bridget Marie in DC, on
TikTok and Instagram, and on YouTube at There Are No
Girls on the Internet, Ashley. Thank you so much for
being here. Thanks to all of you for listening. I
will see you on the Internet. Got a story about
an interesting thing in tech, or just want to say hi?
You can reach us at hello at tangoti dot com.
(54:09):
You can also find transcripts for today's episode at tangoti
dot com. There Are No Girls on the Internet was
created by me, Bridget Todd. It's a production of iHeartRadio
and Unbossed Creative. Jonathan Strickland is our executive producer. Tari
Harrison is our producer and sound engineer. Michael Amato is
our contributing producer. I'm your host, Bridget Todd. If you
want to help us grow, rate and review us on
Apple podcasts. For more podcasts from iHeartRadio, check out the
(54:33):
iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.