Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet is a production
of iHeartRadio and Unbossed Creative. I'm Bridget Todd and this
is There Are No Girls.
Speaker 2 (00:14):
On the Internet.
Speaker 1 (00:17):
Welcome to the brand new season of There Are No Girls
on the Internet, where we explore the intersection of technology,
social media, and identity. And this is our first iteration
of our weekly news roundup of our new season. We're
bringing you stories you might have missed this week, and
another big first, we have our first ever guest co
host for our roundup, the amazing, the wonderful, the media
(00:41):
maven Kristen Conger of Unladylike Media. Kristen, thank you so
much for being here.
Speaker 3 (00:46):
Oh, thank you so much for having me. I'm honored
to be the first.
Speaker 1 (00:50):
I think of you as kind of like a
virtuoso. You are the co-author of the book Unladylike:
A Field Guide to Smashing the Patriarchy and Claiming Your Space.
Awesome title. Host of two great podcasts, Conspiracy, She Wrote
and Unladylike. You're in the middle of a series called
Gender War Games. What is that, for folks who don't know?
Speaker 3 (01:09):
Yes, so for anyone who has missed it, we right
now are kind of living in this uncanny valley of
various forms of gender panic, and honestly, Gender War Games
is partly, like, my own excuse to talk to
smart feminists about all of this wild gender and
(01:35):
anti-gender discourse happening right now. So if you're feeling
kind of spun out, come get some context. Gender War Games.
Speaker 4 (01:47):
Come find out why.
Speaker 1 (01:49):
Well, speaking of finding out why, are you ready to
dive into some news that folks might have missed this week?
Speaker 4 (01:54):
I'm so ready.
Speaker 1 (01:55):
Okay, So we got to start with an update on
my girl. I have talked about her on the podcast
a few times because I'm fascinated. But we have an Elizabeth
Holmes update. Are you familiar with Elizabeth Holmes?
Speaker 4 (02:07):
I mean, how could I not be? I should have
worn a black turtleneck.
Speaker 1 (02:10):
In her honor, and like spoken in a weird, fake deep
voice to signal that you're like a serious tech person.
That's my Elizabeth Holmes voice.
Speaker 4 (02:19):
It's good, it is good.
Speaker 1 (02:20):
Elizabeth Holmes, the disgraced CEO of the scam blood testing
company Theranos, who is now in prison for defrauding investors
because her blood testing was all a scam. Anyway, so
we just recently found out what her husband is up
to these days, and that is, you know, securing the
family business, you know, the scam family business. He is securing
(02:41):
funding for a brand new blood testing startup company that
he is launching.
Speaker 4 (02:47):
My mouth fell on the floor when I read that.
Speaker 1 (02:51):
Yes, I mean, if there's one thing these people have,
it's the audacity, you know, it really knows no bounds. So,
according to The New York Times, her husband Billy Evans
is raising money for a company called Haemanthus, which means
blood flower in Greek. And apparently it's already accumulated millions
of dollars in funding. Who would give these people... like, at
this point, like, if you're giving this person your money,
(03:13):
you get what you get.
Speaker 3 (03:14):
Like come on. Yeah, I think that there
should be an extra tax, like you should
have to give that money away.
Speaker 1 (03:23):
Totally agreed. So this company says it is the future
of diagnostics and a radically new approach to health testing,
and will use AI to test blood and other bodily
fluids like saliva and urine for diseases. So it
sounds like just like Theranos, but instead of just blood,
it's also other bodily fluids.
Speaker 3 (03:42):
What is it with this couple? Just, you know, are they just
really into fluids? You know what?
Speaker 4 (03:48):
And I don't want a follow-up answer. I'm so,
so sorry I asked.
Speaker 1 (03:55):
So The Cut really called this out, saying that it's
basically the same company. They even had a photo of
the new company's prototype, which looks suspiciously like Theranos's now
defunct testing machine. They write, anyone else experiencing deja vu?
As to the company, here's what they had to say. They
were like, we know this looks bad, but trust us.
(04:16):
On X they wrote, quote, we're Haemanthus. Yes, our CEO,
Billy Evans, is Elizabeth Holmes's partner. Skepticism is rational. We
must clear a higher bar, so we will communicate directly
the unfiltered truth, no intermediaries. We prefer to build first,
talk later. The science, when ready, will stand on its
own merits. But we feel compelled to introduce ourselves because
(04:38):
of the recent media coverage. And yeah, I just really,
I almost, almost kind of have to respect it. Like,
I wish I had this confidence to go not just
into a health testing company, but specifically into a
company that tests blood, just like Theranos did. Like you
almost have to respect it. So I do have a question,
(04:58):
which is that if they are a married couple, is
it possible that Holmes could one day be profiting from
this new blood testing company, because you know, it could
be marital property since he started this company after they
were already married. I don't know the answer to this,
but I am very curious, Like I don't necessarily like
a dynamic where she is revealed to be defrauding investors
(05:19):
with a blood scam and then gets to make money
from blood testing later on.
Speaker 3 (05:22):
Well, and also with the added bonus of not having
to wait until she's home from prison because her husband.
Speaker 4 (05:31):
is just at home cooking up, what's it called, Haemanthus? Hemanthea?
Speaker 1 (05:36):
It's also a terrible name.
Speaker 4 (05:41):
So maybe we shouldn't be worried.
Speaker 1 (05:42):
Yeah, And I mean, whenever we talk about Holmes, I
do like to just remind folks that she is responsible
for like actual harm. Her blood scam told a woman
who was prone to miscarriage that she would never be
able to have children. It told somebody that they were
HIV positive, and this person could not afford to get
like an actual blood test for some time, and so
like had to just wait while they accumulated enough money
(06:04):
to like get another blood test that told them like no, no,
you actually don't have HIV. And it also misdiagnosed somebody
as having cancer. And so Holmes basically says that none
of this should disqualify her from running similar kinds of
health testing companies. In fact, to this day, she
maintains her innocence and says that failure isn't fraud, even
though like, yeah, she might have defrauded some people, but like,
(06:26):
come on, it's just business.
Speaker 4 (06:27):
Wow.
Speaker 1 (06:27):
And while Holmes might not be directly involved in her
partner's startup, she does say that her days working in
health tech are far from over, and in fact, she's
already working on patents from prison. She says, there is
not a day that I have not continued to work
on my research and inventions. I remain completely committed to
my dream of making affordable healthcare solutions available to everybody,
(06:50):
which, like, girl, keep them. At this point, I don't...
we don't need them.
Speaker 4 (06:55):
I just have to say, bravo. That was, I mean,
that was beautiful. You put me... I felt like
I was in the boardroom with Elizabeth.
Speaker 1 (07:02):
Oh my gosh, I've been doing that voice for the
longest time. When The Dropout was being released on Hulu,
I was like watching it, like, like I was like
very into it. I have been mimicking her voice for
a long time. I feel like, I feel like it's
like in my, in my repertoire.
Speaker 4 (07:18):
You drop into it seamlessly.
Speaker 1 (07:21):
Thank you, thank you.
Speaker 2 (07:27):
Let's take a quick break. And we're back.
Speaker 1 (07:42):
Okay, so we have to talk about this recent news
with the federal porn ban. Are we at the making
all pornography a federal crime stage of Project twenty twenty five?
Potentially. One of the project's stated goals was to permanently
criminalize all pornography, and now Mike Lee, a Republican senator
from Utah, has introduced a bill that would do exactly that.
(08:04):
Lee recently introduced the Interstate Obscenity Definition Act or IODA,
which would effectively criminalize all pornography nationwide by legally redefining
what it means to be obscene in a way that
virtually includes all visual representations of sex. According to the bill,
quote, a picture, image, graphic image file, film, videotape, or
(08:26):
other visual depiction of any media that, quote, appeals to
the prurient interest in nudity, sex, or excretion would be
considered criminal. So, like, people are pointing out that, like,
that could mean Game of Thrones episodes, right? Yeah.
Speaker 3 (08:40):
I mean also, it's been a long time since I
took my mass comm law class, but I feel like,
constitutionally, Supreme Court-wise, we've kind of...
Speaker 4 (08:50):
Been around the block.
Speaker 1 (08:52):
Yeah. So Lee addresses that. He says obscenity is
not protected by the First Amendment, but hazy and unenforceable
legal definitions have allowed extreme pornography to saturate American society
and to reach countless children. Our bill updates the legal
definition of obscenity for the Internet age so this content
can be taken down and its peddlers prosecuted. So one
(09:13):
of the things that we talk about on the show
a lot is, like, when you hear someone using
language that is like we need to do xyz to
the Internet to protect children, your Spidey senses should sort
of tingle, because that is exactly the language that people
who want to restrict the Internet use before taking some
like draconian action. Because what happens when the powers that
be decide that content related to the existence of trans
(09:35):
folks or queer folks or any other kind of LGBTQ
person is pornographic and thus needs to be like legislated
off the Internet, and it's it's peddler's put behind bars,
you know. Like most rational people, I also want to
keep explicit content away from kids. But research is super
clear that when you restrict the Internet through things like
(09:56):
verification laws that verify your age, or like laws where
you have to show your government ID to access
certain content, that is not effective, that does not effectively
keep kids away from explicit content. So all you are
really doing then is like setting the stage for extremists
to have the power to legislate anything they don't like
off of the Internet.
Speaker 4 (10:15):
Exactly.
Speaker 3 (10:16):
And it's fascinating that it's the same group of people
who are now pushing for an extreme kind of anti-porn
ban while also celebrating social platforms removing censorship.
Speaker 4 (10:29):
It exactly makes sense.
Speaker 1 (10:32):
It makes no sense, and it's one of those things
that like, if you follow the thread, their argument completely
falls apart. So speaking of platform moderation and what is
and is not allowed on platforms, I am so sorry everybody.
We have to talk about Kanye West. Folks who have
been listening to the show for a while know that
I had kind of a, like, Kanye-West-me-
(10:54):
paying-attention-to-him funeral, where I was like, I'm
just, I'm leaving him in the past. I'm moving forward.
So I am, like, loath to talk about this man.
But what's happening with his new song really says more
about the state of Facebook's moderation than it says about
Kanye himself. So if you've not checked in with Kanye
in a while, he's making new music. He is hanging
out with like the worst of the worst Z list
(11:15):
extremist internet personalities. They will do these like live streams
from their hangout sessions and it just looks grim, like
it looks like something out of a David Lynch film,
where they'll be like in a beautiful waterfront mansion
and then it's just like all these goons wearing all black.
Like, it just has a very weird
(11:37):
vibe, and the vibes look terrible in there, I guess,
let's put it that way. So Kanye released a new song.
The song is called... I'm going to call it Schmeil Schmitler.
That is not what the song is called, but what
you're thinking, that is the name of the song, right?
You feel me?
Speaker 4 (11:50):
Gotcha? Gotcha taken up?
Speaker 1 (11:52):
So yeah, I just did. Our producer Mike and I
had a whole conversation. I was like, I don't even
really feel comfortable like saying the name of the song
on mic. I don't want there to be like audio
of me saying this. So yeah, we'll call it Schmeil Schmitler.
But you know, you know what it is, and yeah,
the song it sounds exactly like what you're thinking it is.
It's all about how the trials and tribulations of Kanye
(12:16):
West's life have turned him into a Nazi. Now, most
streaming platforms and social media platforms will not allow a
song called Schmeil Schmitler where the chorus is Schmeil Schmitler
over and over again on their platforms. Right? Except Instagram,
as it turns out, because according to reporting from
404 Media, major shout-out to them, they've
been like, really on this story, the song is all
(12:38):
over Instagram. They write: While other social media sites and
streaming services rush to scrub Kanye West's pro-Nazi song
from their platforms, the curious or enthused can find memes, remixes,
and unedited audio of West's new song all over Instagram.
And I mean, as if that's not bad enough,
this is like explicitly against Instagram's platform policies. In fact,
(13:02):
Nazism is one of the only specific groups that Meta
calls out by name in its own rules. In the
current version of its Community Standards Policy regarding Dangerous Organizations
and Individuals, the company says it will remove any content
that promotes Nazis, saying, quote, we remove content that glorifies, supports,
or represents ideologies that promote hate, such as Nazism and
(13:23):
white supremacy. But 404 found that there are reels
that use this song that have over one million views.
There's one reel that calls it, quote, the Song of
the Summer, which, FYI, I'm pretty sure it's not, right?
Do you... wait, well, side question: do you have any
like potential candidates for the actual song of the summer?
Speaker 4 (13:44):
Oh my gosh, no, not off the top of my head,
what about you?
Speaker 1 (13:47):
I had to look it up, and I will admit that,
like, in the list, I felt very old because I
was like, oh, half of these songs... I know, I know.
Speaker 4 (13:56):
Was like I need to go do my millennial homework.
Speaker 1 (13:58):
I know, trust me, if you're like me, there's no
shame in admitting it. Like, we'll put some lists in the
show notes. Let me, I will be honest, like, I was like,
I don't know half of these songs. I'm gonna say
it's Luther by Kendrick Lamar and SZA, or like Charli
XCX or something like that. But like, I know one
thing is for sure, and it's not Schmeil Schmitler. That
is not the song of the summer.
Speaker 3 (14:18):
Absolutely not well, And so you mentioned that this was
getting traction on Instagram.
Speaker 4 (14:26):
What about Facebook or is it just limited to Instagram
at this point?
Speaker 1 (14:31):
That is a great question. My sense is, and
this is just my sense, my sense is
because Instagram is much more like short-form video heavy,
I would bet it's showing up there more than Facebook.
But I need to look into that. That's actually a very
good question.
Speaker 3 (14:49):
I just assume if it's if it's something bad, it's
gotta be on Facebook too.
Speaker 1 (14:53):
I don't know. Don't even get me started. Like, if
Facebook has zero haters, I am dead. I'll just put
it that way. They are like the number one worst platform
around here. And so like, if folks are wondering, like,
how is this song showing up on reels, it is
showing up in like the grossest ways that you can imagine.
One reel depicts a white dude in khaki pants dancing
(15:14):
to the song in front of a glowing, spinning swastika,
and the caption reads, white dads getting turned up to Kanye's
new song at the summer barbecue, flame emoji, which I
read as like a pretty clear reference to the Holocaust.
And that one reel has been viewed almost a million times.
The account that shared it describes itself as a quote
(15:34):
race realist and meme guy in the bio. Much of
the content is memed-up clips of avowed white supremacist
Nick Fuentes. So like, it's very explicitly and overtly dealing
with like Nazism, white supremacy. Like, they're
not, like, they're not being, like, obscure about that.
Like, that's very clear. And if that's not bad enough,
(15:58):
I think, to add insult to injury, Facebook, when asked
about this, they honestly like speak to us like we're
stupid, because 404 actually reached out to Facebook and
like asked, like, why is this kind of content allowed
on your platform even when it's explicitly disallowed by your policy,
like called out by name, and their response is very frustrating.
They said, we recognize that users may share content that
(16:19):
includes references to designated dangerous organizations and individuals in the
context of social or political discourse. This includes content reporting
on, neutrally discussing, or condemning dangerous organizations and individuals or
their activities. So the kind of reels that 404
Media found, those are not neutral. They are not
neutrally discussing, and they certainly are not condemning the ideology
(16:44):
being talked about in this song. If anything, like, some
of them are maybe making fun of it, you can
maybe argue that, but most of them are pretty clearly
and explicitly celebrating it. And so like, why even make
a rule that explicitly names and bans the glorification of
Nazism if you're not going to actually abide by it?
Speaker 3 (17:02):
Right, and then come around and, like, try
to twist what is obviously, like, a song intended to
troll and, like, terrorize, and call it, like, oh,
Speaker 4 (17:16):
So it's reporting and discourse, like.
Speaker 1 (17:19):
What, exactly. And in case anybody's wondering, on Joe Rogan's podcast,
he talked about the song and he said it's kind
of catchy. Stop it. Yeah. So, like, yeah, there's
that. Okay. So speaking of things that are gross and horrifying,
(17:40):
let's check in with what's going on on x slash Twitter.
If you're anything like me, you're probably not spending
a lot of time there. I do not blame you,
but a little update as to what's going on on
that platform. According to Kolina Koltai, a researcher at Bellingcat,
users have been using Grok, which is X's AI chatbot,
to undress women in the comments of their posts, and
(18:01):
Grok is sort of doing it. According to PCMag,
while the chatbot rejects prompts for completely nude images, it
does fulfill remove her clothes requests with AI-generated images
of women in bikinis or lingerie. Grok's responses are public
and appear as replies to the original prompts. Uh, pretty gross.
So like if I were to post something on X
(18:23):
or Twitter, somebody could say remove her clothes and Grok
would serve up an image, an AI-generated image of
whoever, in a bikini or lingerie. It's just, like, it's
just really fucked, like there's no other... And it's like,
other AI chatbots don't do this, right? Like, other AI
chatbots have guardrails against this that Grok does not.
Speaker 3 (18:46):
Yeah, it seems like Grok is, uh, is he... like, he...
why am I gendering Grok? Oh God.
Speaker 1 (18:53):
But you know, Grok's a guy.
Speaker 4 (18:56):
He's a.
Speaker 3 (19:00):
But it's especially unhinged though, right? Because I was also
seeing, I've seen headlines as well about like just wild
kind of, like, South African, like wild racist kind of
stuff being baked into Grok.
Speaker 1 (19:18):
Yeah, so we were talking about this. Producer Mike and
I were talking about this before we got online. I
was like, oh, I was just speaking about Grok, and he
was like, oh, the South Africa thing. So basically Elon Musk,
who is South African, was annoyed that Grok was not
serving up like enough racially inflammatory content about how they're
(19:38):
bringing South African, like, white South Africans to the United States.
So I guess they overcorrected, because now when you ask
Grok anything, it's like, oh, it's just like the oppression
of the white South Africans. Like, you could be asking
for a recipe and it's like, oh, are you also
interested in the plight of white South Africans? The pendulum has really
(19:58):
swung in the other direction. Mike, have
I summarized that correctly?
Speaker 4 (20:04):
Yeah?
Speaker 5 (20:05):
I think that's right. I think they just like turned
up the white Afrikaners dial, and it's like, any question...
I saw somebody submit a haiku that like had
nothing to do with anything, and it was like, oh, yes,
this is like the blood of the white Afrikaners. Like,
my god, Grok, like, settle...
Speaker 1 (20:23):
her down. Settle down, Grok. Oh, so yeah, honestly, like,
I have never heard anybody say a positive thing about Grok. Yeah,
I don't think it's going well over there. So, about
the non-consensual, like, undressing of women. You know, when
(20:46):
pointed to a post about this and asked about the
guardrails that Grok has against non-consensual AI-generated explicit content,
Grok responded with an apology and said, this incident highlights
a gap in our safeguards, which failed to block
a harmful prompt, violating our ethical standards on consent and privacy.
We recognize the need for stronger protections and are actively
(21:07):
working to enhance our safety mechanisms, including better prompt filtering
and reinforcement learning. We are also reviewing our policies to
ensure clearer consent protocols. So like, at least Grok knows
what he's doing is wrong. At least... I really
should not be anthropomorphizing AI, but I do it
constantly in my head, even though I shouldn't.
Speaker 3 (21:25):
Well, in twenty twenty five, we are all at this
point programmed to make a public apology, make a, make
a, yes, a normal apology in the Notes app. Yeah.
Speaker 1 (21:35):
Grok knows, when called out, like, what the proper thing
to do is: just apologize very quickly via, you know,
an apology that sounds very wordsmithed. So like, I
something about this story gets me because I feel like
it is exactly the kind of climate that makes social
media and technology more generally like a hostile space for women.
And I think you know, in twenty twenty five, most
(21:57):
of us are using social media. Like, it is just
part of showing up in civic and public life, is
being online. And if anybody can use AI to undress you,
to sexualize you when you do show up online, we
simply do not have an online landscape that
allows women to, like, fully and safely participate in
civic life. And so I mean I am not showing
(22:18):
up on x or Twitter as a platform anymore, but
if I were, and if other women are, I think that,
like I can see why that would drive women off
of these platforms. And so, these people who say so
much that they care about free speech, they care about, like,
you know, having a marketplace of ideas: this is anti-free speech.
Women are going to not show up to
these platforms and not, you know, make their voices heard
(22:40):
on them if the threat is there of them being, like,
sexualized and undressed and violated in this way. So it's
not only like a gross violation, it's also in my book,
like very anti democratic.
Speaker 3 (22:50):
Absolutely. Well, and the unsettling layer of it as well,
that yes, Grok is a problem, but it's also
such a problem that there are enough people wanting to
train Grok to essentially, like, assault people.
Speaker 1 (23:09):
Yeah. And I guess, like, that's my point. Like, that's,
like, why I don't like to anthropomorphize technology
like this, is because, like, it's learning from us. So,
like, it's built by people, it's trained by people, it
learns from people all of the biases and gross things
that we know that people have. Like, it is, it
is just reflecting that back and like turbo charging it.
(23:31):
And so yeah, it really does reflect the fact that,
like this technology is being trained on the worst human
impulses and that is a problem. And not only is
that, like, a tech problem, it's, like, a, like, an
us problem, like a people problem. If we can't figure
out how to, you know, how to have a more
equitable landscape, how to not use technology to just reinforce
(23:55):
and reinforce these like patriarchal attitudes about women and violate women,
we're never gonna get anywhere. So it's like it's like
both a tech problem and, like, a human problem. I
guess we gotta buy Grok a copy of your book.
Speaker 3 (24:09):
Come on, Grok, get Unladylike. Grok's...
Speaker 4 (24:13):
It's a gender neutral term.
Speaker 1 (24:14):
Yeah.
Speaker 2 (24:19):
More after a quick break, let's get right back into it,
all right.
Speaker 1 (24:34):
So, speaking of things that are infuriating me and also
sort of inspiring me, I guess I need to talk
about Alondra Nelson. Alondra Nelson, for folks who don't
know, she is incredible. I have been lucky enough to
like hear her speak, and she is really phenomenal. She
has a very long and storied career in public service.
She was the first black person and the first ever
woman of color to lead the White House Office on
(24:56):
Science and Technology Policy. She's a very accomplished scientist and professor.
In twenty twenty three, she was nominated by the Biden
administration to the UN High Level Advisory Body on AI.
Her list of contributions to public interest and science like
could go on and on and on. I could talk
about her all day. She's phenomenal. Shout out to her. Well.
This week she resigned from the National Science Board and
(25:17):
the Library of Congress Scholars Council, citing creeping authoritarianism. And
I want to talk a bit about what she said
drove her to these resignations. One, because I think coming
out publicly the way that she has is really brave,
and it comes with tons of risk, especially right now.
I haven't really seen this story talked about in the
(25:38):
way that I wish it was being talked about, because
I think absolutely everybody should read what she has to say.
I'll put the link to the full piece in the
show notes. But I also think it highlights what we
all lose when voices like Nelson's are pushed out of
public service, Like we all benefit from having voices who
are critical, who ask tough questions. When those voices are
(25:59):
in the conversation, all of our lives can improve.
And when those voices are pushed out and silenced, we
all lose out. And so, yeah, it's... I think
that her coming out publicly with what she saw
and why she's leaving is really important. So, did
you know anything about her work, or
(26:21):
had you heard this story?
Speaker 3 (26:22):
I was not familiar with her, but it's it's tragic
to lose someone like that who also, I mean just
the amount of knowledge, like institutional knowledge that she possesses,
Like when that one person just walks out the door
like that, it's I mean, yeah, it's it's gross how
(26:42):
many people are being forced to, yeah, just walk away.
Speaker 1 (26:48):
Yeah. And I mean, I live in Washington, D.C.,
where a lot of federal workers live, not all, but
a lot of federal workers live in this area, and
I have been covering it for another podcast I work
on called City Cast DC. It's just, we all lose out,
like it really is. We all lose out for no reason.
(27:09):
It's not saving any money. We could do
a whole episode on this. But like, for all of
the, like, puffery that Elon Musk talked about, like fraud
and waste, we have... the deficit is up. And
when you fire somebody illegally and you have to pay
them severance that you didn't think you're gonna have to pay,
or you have to like fund the lawsuit or something,
we have not saved any money, right, So like it's
like we're shooting ourselves in the foot for no reason.
(27:31):
So first, what is the National Science Board,
one of the entities she was resigning from?
So the National Science Board of the United States establishes
policies of the National Science Foundation, which is one of
the biggest and most important funders of scientific research in
our country. The National Science Board also serves as an
independent policy advisory body to the President and Congress on
science and engineering, research and education issues. Basically, it supports
(27:54):
all non-medical science research in the United States, so
kind of like the NIH, the National Institutes of Health,
which is the health and medical counterpart of the National
Science Foundation. And basically, like, it is why I am
talking to you about technology and the Internet on a podcast,
because if it was not for this board, we would
(28:16):
not have the Internet. Like, it is important. So her
piece in Time explaining her decision to resign isn't just
maddening because this, like, accomplished person is being pushed out
of government, although, yes, that. It's also maddening because it
shows how much the Trump administration is attacking and gutting
these organizations that technology and scientific advancement really rely on.
(28:38):
I was watching, just like a side frustration, I
was watching this podcast with Pete Buttigieg, and he was,
he's been on these, like, podcast bro type podcasts recently,
and there was one where he was basically explaining to
the world's stupidest podcast bro that the Internet would not
exist, and like the iPhone would not exist, without the
(29:00):
government, because it was essentially a project of the government
in collaboration with researchers and universities. That's why we have the
Internet at all. And him having to explain
that really revealed that, like, some people just don't know,
or don't think about, or have not been taught about
where research and scientific and technological advances come from. In
(29:23):
this country, and that if we didn't have research,
like funded research, if we didn't have universities, and we didn't
have government, these things that we take for granted, like
the Internet, would not exist. We would be, like,
we would be a completely different country, and I would
argue a worse country if not for these things. For
so long, the US has led the world in scientific
(29:44):
discovery and technological innovation, and, like, improved our lives and
driven economic prosperity. And it is just depressing as hell
to watch Trump throw all of this away because it's
woke or whatever, and then watch people who do not
really get it cheer it on, and not realize that
they are cheering on something that is going to make
all of our lives so much worse. So I think
(30:05):
everybody should read her piece in Time explaining her decision
to resign, But I want to share a couple of
pieces of it because it really, I thought was so
perfectly put.
Speaker 3 (30:13):
So.
Speaker 1 (30:13):
She says that initially she planned to stay in government,
but at a certain point that kind of just became impossible.
She writes, perseverance has its limits. The erosion of these
institutions' integrity, and the growing realization that it is impossible
to fulfill their missions in good faith, has made the
cost of continuing untenable. That is why I must
step away from my work with two federal institutions that
(30:34):
I care deeply about. In both of these roles, over
the past few years, I've been asked to serve on
diverse bodies that offer guidance about how the executive and
legislative branches can be stewards of knowledge and create structure
to enable discovery, innovation, and ingenuity. In the case of
the National Science Board, this ideal has dissolved so gradually
yet so completely that I barely noticed its absence until
(30:55):
confronted with the hollow simulacrum. And yeah, that really, I
feel like that really just packs a punch of how
difficult it would be. Because when all of this started happening,
I was someone who was like, federal workers should just
stick it out. We need their voices, like make them
fire you, blah blah blah. And I still sort of
(31:16):
feel that way. But I had never really thought about
what it would be like to continue to have your
name as the overseer of a board whose mission
had been so hollowed out that it was just, like, fake.
Like how difficult that would be?
Speaker 3 (31:32):
Yeah, yeah, once your job becomes like enacting the kinds
of draconian policies as well, Like yeah, it's.
Speaker 4 (31:45):
It's enraging, it is.
Speaker 1 (31:47):
And she calls out, like, just the general attacks on
knowledge that this administration has pushed forth, writing: this
hollowing out is not just about governance in the abstract.
It has material consequences for which research questions get asked,
which data sets get produced, which knowledge gets produced, and
which perspectives shape our understanding of pressing societal challenges. It
(32:07):
has consequences for the integrity of knowledge itself. And I
think that is so true that you know again, I
feel like it's easy to take for granted that they
are smart people asking the questions that are going to
lead to scientific advancements, right, Like there are people who
are solving problems and challenges that somebody like me might
not that have not even been revealed with somebody like me,
(32:29):
and thank god, somebody is working on it. Like what
do we get when those people are attacked and pushed
out of the important work they've been doing.
Speaker 3 (32:36):
Right, and all of their research funding drained. Like, it
also seems like part of this is an extension of the
whole, like, Trumpian mentality and, like, conservative mentality of running
the government like it's a business, like
it's a corporation.
Speaker 4 (32:55):
So if something is not immediately turning you a profit,
then it's out the door. Like, it's just, like, no. Things just don't...
Speaker 1 (33:04):
Work that way. And that's just, I mean, like, that's
just not how we get innovation. Like so many innovations
take a long time to reveal themselves or like take
some collaboration or like yeah, they're slow burns, and like
thank god somebody stuck it out to see them to
the end, because it's like, then there was this great
thing or this great development that helped everybody. Like, it's
just, it's just not how, it's just not how
(33:27):
any of this works. So, in addition to resigning from
a position at the National Science Board, she's also resigning
from her position on the Library of Congress Scholars Council,
which is like a body of distinguished individuals convened by
the Librarian of Congress to advise on matters related to
scholarship at the Library of Congress. So y'all might have
seen recently that the Librarian of Congress doctor Carla Hayden
(33:48):
was recently fired. Uh really for no good reason.
Speaker 4 (33:53):
That's okay.
Speaker 3 (33:54):
I had missed that news, and I, I mean, I'm
not shocked, because she is a black woman, and it
seems like every black woman in the federal government, you know,
has a target on her back.
Speaker 1 (34:06):
So, exactly that. Nelson points out that in the email
dismissing her, she was, so this is someone with a PhD, right,
she's Doctor Carla Hayden, the email dismissing her was addressed
to Carla, just her first name, not Doctor Hayden, which
is like insult to injury. And importantly, it sounds like
Trump just like fired this woman on his own. He
(34:28):
was not acting on behalf of the lawmakers who oversee
the Library of Congress, and the White House Press Secretary
said that she was being fired quote for things she
had done at the Library of Congress in the pursuit
of DEI and putting in appropriate books in the library
for children. So I want to play this a quick
bit of audio of that press conference.
Speaker 2 (34:47):
Question.
Speaker 1 (34:48):
Now, the president fired the Librarian of Congress. Why did
Speaker 6 (34:50):
You choose to do that?
Speaker 7 (34:51):
We felt she did not fit the needs of the
American people. There were quite concerning things that she had
done at the Library of Congress in the pursuit of
DEI and putting inappropriate books in the library for children,
and we don't believe that she was serving the interests
of the American taxpayer well, so she has been removed
from her position, and the president is well within his
rights to do that.
Speaker 4 (35:11):
Has she visited the Library of Congress, That's what.
Speaker 1 (35:13):
I'm saying. Like, think about this for one second. Does
she think that the Library of Congress lends out books
to kids? Like, like, genuinely, does she think that? It
doesn't lend out books to anyone. Like, its job is
to house all the published books in the United States.
So like, first of all, the way that I could
never... obviously it's never happened, but like, I just cannot
(35:35):
imagine sitting in a press conference and hearing somebody say
that and not being like, bullshit, or even just
asking a follow-up. But I guess, like, who
would be asking the follow-up at this point? Like,
One America News? They're like, sure, yeah, that's right, she
was lending out books to kids that were inappropriate. It's
like, sure. And I think you're exactly right. That, like,
(35:55):
and Nelson clocks this in her piece. She says that,
like the reason why they pushed her out is because
they are just like pushing out black women in public
service under whatever pretense they invent. And that's that, Like
they don't even have to have a real answer that
makes any sense in reality whatsoever. Nelson writes, the ouster
of Hayden is part of a broader pattern of political
targeting of women and black public servants across the federal government.
(36:19):
So exactly what you said, right.
Speaker 3 (36:21):
And I wonder, too, if part of, uh, Trump taking
it, like, firing her himself, and the way it sounds
like it went down, it seems like there's also, like,
an especial, like a real, distaste for anyone who was
the first in their position, which I believe Carla
(36:41):
Hayden was, right? Sorry.
Speaker 1 (36:44):
Yeah, I completely agree, And it's like it just really
crystallizes the way that they are being so explicit about
rolling back any progress, right, Like, And I guess that's
the thing is like it does feel I really work
hard to not feel defeated in this moment when things
(37:06):
feel so bad, but like that's what it feels like.
It feels like what they're trying to do is be
like, any gains that women or, you know, historically marginalized
folks have made, we want to make it, we want
to, like, spell it out that those are being erased.
And I guess, now that I'm saying it out loud,
They can take it off the website, they can fire people.
(37:28):
I wish they wouldn't, but they can. But we know,
right like like they can't take away the gains that
we have made. They can try, they can pull their
little bullshit in their little scams, but like they can't
take being the first black anything away from anybody, even
if they do delete it from the website.
Speaker 3 (37:43):
And I mean, unless I'm misunderstanding completely how the
Library of Congress works, which is possible, even if, let's say, uh,
Speaker 6 (37:52):
the, the former Librarian of Congress did put out some
books for the children. Even if that did happen, I
guess that's... those books are still cataloged at the
Library of Congress. Like, those books aren't going anywhere. Yeah.
Speaker 1 (38:05):
Exactly, Like the point of the Library of Congress is
that it holds all the books. So like, if this
book was published, you don't have to like it, but
you can't take it out of the library of Congress, Like,
that's not how it works. So I feel like Nelson
really is speaking so powerfully to this moment that we're in,
she writes. To watch these changes unfold without naming them
for what they are is to participate in a collective
(38:27):
amnesia about how knowledge infrastructures shape power relations, like the
shopkeeper in an authoritarian society described by Václav Havel in his
essay The Power of the Powerless, who participates in his own
oppression through small, daily acts of complicity, placing a party
slogan in his window, not out of conviction, but out
of habit. To remain on advisory boards that have been
stripped of meaningful advisory function is to become that shopkeeper,
(38:51):
to lend legitimacy to a process that has been systematically delegitimized.
And yeah, I mean, I really appreciate
that she is not just leaving quietly, that she is
leaving with such a resounding, clear warning to all of
us about what happens when you do become complicit, not
(39:13):
out of conviction but out of habit, when you do say
like oh yeah, like they deleted all this stuff from
the website. I guess it never happened, like when you
allow for this to become legitimized, And I just it
would have been so easy and probably like advisable in
some ways for her to leave quietly. But I'm so
glad she chose to leave loudly. And I think everybody
(39:33):
should read her parting words. It is like a warning
to us all.
Speaker 1 (39:38):
Okay, so real quick, last story. I mean, I
can do it very quickly, actually. Because, do you remember back
in twenty twenty three when HBO became just Max and
everybody hated it?
Speaker 3 (39:52):
Oh yeah, how could I forget? Where were you?
Speaker 4 (39:57):
You know it.
Speaker 1 (39:59):
It is. You're joking, but yo, I remember being like, why
would they take away the HBO? Like, I was so,
like, angry, because, like, HBO was just, like... what do
you think of HBO? You think of, like, The Sopranos, Sex
and the City, like these, like, iconic shows. And it's like, yeah,
let's just be Max. Like, what is that?
Speaker 4 (40:18):
Yeah?
Speaker 3 (40:19):
I was unnecessarily stubborn about it in the way of
like I'm.
Speaker 4 (40:23):
Not gonna call it X, huh. It's very much like,
it's not Max.
Speaker 1 (40:28):
Well, don't worry, because they're going back to HBO Max
now. That's it. That's the story. They're
back to it. They've heard our complaints.
Speaker 3 (40:37):
I would love to see, I want to see a
spreadsheet of a breakdown of how much that circular decision cost.
Speaker 1 (40:44):
I'm so... I had the same, I had the same question, like,
how much money did an executive get paid to
make the initial choice? How much money went to change
the branding? I have so many questions.
Speaker 3 (40:56):
Oh my god, and you know that there were so
many like brainstorm sessions and wow, that's incredible.
Speaker 1 (41:02):
Let me ask you this. When you hear the like
HBO like static, like, what's the what is the like
theme song? This is a question I ask everybody. What
is the like theme song that you're like, Oh, when
you hear that HBO intro noise, you know it's gonna
be this show.
Speaker 4 (41:17):
Show either Sex and the City or Soprano.
Speaker 1 (41:20):
That's... okay, I think those are the only two acceptable answers. Producer Mike,
you want to chime in and tell us what it
is? I make fun of you all the time.
Speaker 5 (41:28):
There was a period when I was watching a lot of
True Blood and I was just powering through one episode
after another, and yeah, that was really my... I
didn't grow up with HBO, you know. I came from
a family where we just had three channels. So this
was the period when I had access to like an
(41:50):
unlimited set of all of the shows. And it was
just like static Sound Show, Static Sound Show. It like
trained my brain. So yeah, I'm not gonna apologize.
Speaker 4 (42:02):
What a weird answer, very unexpected. And now
you got me wanting to ask.
Speaker 3 (42:09):
I have one friend who is a True Blood head, and
now I want to ask her the same question.
Speaker 1 (42:15):
Did you just make that up? Is
True Blood head what they call themselves? Is that...
Speaker 4 (42:19):
Like, No, I don't know, Mike, Mike would know.
Speaker 5 (42:26):
As a member of the True Blood fan community, that's
not a term that we use to identify ourselves.
Speaker 4 (42:33):
It's not it's not canon.
Speaker 1 (42:36):
Well, Kristen, this has been incredible. Where can folks hear
the podcast? The multiple podcasts? Where can folks keep up
with you? How can they get the book? Tell us
all the things?
Speaker 4 (42:47):
Well, thank you so much for having me again. This
was so fun.
Speaker 3 (42:52):
Folks can listen to the Unladylike podcast and the Gender War
Games miniseries, all four episodes of that are out now.
Speaker 4 (43:01):
Just search Unladylike, or if you...
Speaker 8 (43:03):
Want something a little weirder, just like for a conspiracy
She wrote, I just started a new mini series on
that today.
Speaker 1 (43:14):
Actually, what is the miniseries exploring?
Speaker 4 (43:17):
Well, the Earth is such a dumpster fire.
Speaker 3 (43:20):
Was just taking a quick break to aliens and Ufoh
my god.
Speaker 1 (43:24):
So it's that. So, like, I am also in, like,
the conspiracy world, but so often it's like QAnon. It's
like, nice to take a little break with, like, oh, aliens.
Speaker 3 (43:34):
Yes, I mean all roads will eventually lead back to
QAnon, and like, you know, I mean, they all route to
the same place. But I didn't even think about beforehand,
like how much like UFOs and aliens are just it's
all like anti government stuff that takes me right back
down to Earth, which is where I.
Speaker 4 (43:51):
Wanted to leave. But yeah, it's been it's been.
Speaker 3 (43:55):
A fun kind of mental break, but it's fascinating and
I've really, like, spent a ton of time thinking about UFOs,
and now I think I've spent too much time.
Speaker 4 (44:05):
So come listen to Conspiracy, She Wrote.
Speaker 1 (44:08):
The truth is out there and you're gonna find it. Kristen,
thank you so much for being here. Thank you for
being our inaugural guest co host on There Are No
Girls on the Internet. Y'all can follow me on Instagram
at bridget Marian DC, on TikTok at bridget Marian DC
on YouTube There are No Girls on the Internet. I
(44:29):
know that sounds very awkward. I just started doing more
social media, so don't make fun of me. I'm being
perceived there and it's fine. Thanks so much for listening.
I will see you soon. If you're looking for ways
to support the show, check out our merch store at
tangoti dot com slash store. Got a story about an
(44:51):
interesting thing in tech? Or just want to say hi?
You can reach us at Hello at tangoti dot com.
You can also find transcripts for today's episode at tangoti
dot com. There Are No Girls on the Internet was created by
me, Bridget Todd. It's a production of iHeartRadio and Unbossed
Creative, edited by Joey Pat. Jonathan Strickland is our executive producer.
Terry Harrison is our producer and sound engineer. Michael Almado
(45:12):
is our contributing producer. I'm your host, Bridget Todd. If
you want to help us grow, rate and review us
on Apple Podcasts, For more podcasts from iHeartRadio, check out
the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.