
July 25, 2025 74 mins

Bridget runs through this week's tech news you might have missed with the brilliant Dr. Titi Shodiya, co-host of the excellent podcast Dope Labs.

LISTEN TO DOPE LABS! https://www.dopelabspodcast.com/

FOLLOW DR. TITI SHODIYA: https://www.instagram.com/dr_tsho/

Debate-style video roils internet after participant openly identifies as fascist: https://www.nbcnews.com/tech/internet/jubilee-debate-video-fascist-participant-roils-internet-rcna220303

Emmanuel Macron, Brigitte Macron sue right-wing podcaster Candace Owens over false claims first lady was born a man: https://www.cbsnews.com/news/french-president-emmanuel-macron-brigitte-macron-sue-candace-owens-claims-first-lady-born-man/

Trump’s order to block ‘woke’ AI in government encourages tech giants to censor their chatbots: https://apnews.com/article/trump-woke-ai-executive-order-bias-f8bc08745c1bf178f8973ac704299bf4

ICYMI TANGOTI ep: She Called a Black Child a Slur — Then Raised $700K. Kiandria Demone Is Saying ‘Not Today’: https://podcasts.apple.com/sn/podcast/she-called-a-black-child-a-slur-then-raised/id1520715907?i=1000708401994

If you’re listening on Spotify, you can leave a comment there or email us at hello@tangoti.com!

Follow Bridget and TANGOTI on social media! Many vids each week.

instagram.com/bridgetmarieindc/

tiktok.com/@bridgetmarieindc

youtube.com/@ThereAreNoGirlsOnTheInternet

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet is a production
of iHeartRadio and Unbossed Creative. I'm Bridget Todd and this
is There Are No Girls on the Internet. Welcome to
another episode of There Are No Girls on the Internet.
This is another rendition of our weekly news roundup where
we summarize all the stories on the Internet that you

(00:25):
might have missed. And I cannot tell you how much of
a fan I am of the person we are graced with as
this week's co-host: Dr. Titi Shodiya, scientist,
engineer, and co-host of the fantastic Dope Labs podcast.
Welcome to the show, Titi.

Speaker 2 (00:46):
I have dreamed about this moment. I am so excited
to be here. Thank you so much for having me.

Speaker 1 (00:51):
I mean, I'm such a big fan of what you're
doing with Dope Labs. For folks who don't listen to
the podcast, you should be listening, but give us a
summary of what do you all do at Dope Labs.

Speaker 2 (01:01):
So at Dope Labs, I co-host with one of my best friends.
Her name is Zakiya Whatley. She also has a PhD; hers is
in genetics and genomics, mine is in mechanical engineering
and materials science. And we show how science intersects
with pop culture. So it's a really fun show that is rooted
in our friendship, because we've known each other since grad school.

Speaker 1 (01:17):
We bonded in the struggle.

Speaker 2 (01:19):
And we just take things that folks are talking about
in their group chats, things that are tumbling down your timeline,
and show that there's a little bit of science behind
it all.

Speaker 1 (01:29):
I still remember the very first episode. It might have
been the first episode I heard, and it might have been
the first one you did, about cuffing season.

Speaker 2 (01:36):
Oh yes, I absolutely loved that episode, and that was
such a great introduction for us to the world, because I
feel like it just encapsulated everything that we're about,
you know: the science, the clownery, and, you know, talking
to strangers and hearing all these really funny, amazing stories
and talking about the ecology behind all of it.

Speaker 1 (01:56):
It was really fun. Science, clownery, and talking to
strangers is like the hat trick of things that I enjoy. Yes.
And I'm excited that you're here for many reasons, but
one of them is that your most recent episode was
all about how folks should be thinking about AI in
their own lives.

Speaker 2 (02:15):
Can you give us a little taste of that conversation?
Yeah. I think the reason why we wanted to talk about AI...
this was kind of like our second episode where we talked
about AI. We talked about the impacts and the costs of AI,
and then in this most recent one we talked
about how we can engage with AI in ways that
make sense, that fit into our lives, and we don't

(02:36):
have to feel, you know, bad about it because AI
is coming whether we like it or not. And you know,
there are some things that are really really bad about AI.
I mean, there's things that are really really bad about
a lot of technology. But we just want people to
arm themselves with knowledge so that, as we move forward,
because this is going to move forward with or without us,

(02:57):
we can, you know, step into our workspace, we
can step into the rest of the world, and know,
one, what is out there, and make an intelligent
decision about whether or not we want to engage, for
whatever reason. Whether it's, oh, I'm not going to upload
photos of me and my kids and my family into
an AI large language model, or, okay, this is how

(03:18):
I'm going to engage with it. I'm only going to
use this for work. I'm going to make sure that
I toggle these certain things off so that you can
use it in ways that you feel comfortable with and
you don't feel like your privacy is being violated.

Speaker 1 (03:28):
And so much of what we do on this show
is just, I guess, for lack of
a better phrase, arming people with the truth, arming people
with the facts. And so I talk to people who
are really skittish about using AI in any capacity. And
you know, for folks who are like,
I would never use AI, I would never use ChatGPT,

(03:48):
I totally get it. But I want people to be
able to make informed decisions, to have a sense of it
that isn't rooted in AI hype, like it can do all of
these things that it absolutely cannot do, or
these use cases are great when they're not
actually good use cases.

Speaker 3 (04:04):
Right.

Speaker 1 (04:04):
Similarly, don't fall into this trap of, like, you need
to be afraid of it, you can't use it. Right.

Speaker 2 (04:09):
I mean, because the way that I always talk about
AI is AI is a new technology. If you think
back in the history of this world, every time a
new technology came into the forefront, there was always this
stage where people were terrified. When you think about mathematicians,
when the abacus was invented, there was this whole big

(04:30):
thing where it's like, you are not truly intelligent if
you have to use an abacus. Fast forward, when the
calculator first came out, everyone was saying, you should not
be using a calculator, no one's going to know how
to use their brains anymore. Then you have, you know,
the advanced calculators, the TI-89. We have computers and
everything like that. Then the Internet comes and everybody's like, whoa, no,

(04:52):
this is not something we should be working with. And
now the Internet is ubiquitous, it's everywhere, at our fingertips.
And then we had Google, and when Google first started,
everyone was very, very confused. It's just like
this cycle with technology, where something comes out,
there's an innovation there, people are like, don't use it,

(05:13):
You should be scared, and then everybody realizes, oh, it's
not that bad.

Speaker 1 (05:17):
We can live with this, and it actually is making
our lives better. So it's just important to understand how
these things work, understand how you can interact with them
or not interact with them if you don't want to.
I know some people who have never used a TI-89
calculator, and that is totally fine. That doesn't make
them less smart or more smart or whatever. It's just,
if you don't need it, don't use it. So I

(05:37):
want to talk about the executive orders the Trump administration
put out about AI, because I'm sure you have thoughts.
I have thoughts. But before we jump into that, I
just wanted to really quickly talk about this thing that
I saw floating all over the internet, and that is
this Jubilee Media, I guess they're calling it a debate,
but I'm not even comfortable really using that word, where

(05:57):
this journalist Mehdi Hasan, who is like a progressive journalist, right,
who I really respect. Do you know his work? Yes.

Speaker 2 (06:03):
I do really respect him. Loved everything that
he's put out. Same.

Speaker 1 (06:09):
So this clip of him debating twenty far-right conservatives
went viral. And Jubilee Media, for folks who don't know Jubilee Media:
I looked into them, and their founder
said that they started because they were so fed up
with how partisan politics had gotten, and they wanted to
create what he called the Disney Channel for empathy. That

(06:31):
was a long time ago, because what they're more
known for now is platforming extremists. So, you know, the
quote-unquote far-right conservatives that they had on the
show debating Mehdi Hasan, they say things like, oh,
I don't mind being called a Nazi, and yes, I
am a fascist, and things like that. And it's really
clever how this platform has been able to really normalize

(06:51):
and launder these very extreme positions. Explicitly aligning yourself
with Nazis is extreme, to say
the least, but then they phrase it as, oh, these
are conservatives. I mean, I am not a conservative, but I
would not say it's fair to call
somebody who aligns themselves with Nazis merely a conservative perspective, right? No. No.

(07:16):
The viral moment that I want to talk about was
this guy named Connor. "You're a little bit more than
a far-right Republican." "Okay, what can I say? I
think you'd say I'm a fascist. Yeah, I am." So,
basically, Connor, and of course his name is Connor,
proudly proclaims, yeah, that's right, I'm a fascist. So this

(07:37):
interview goes viral, and Connor goes on the crowdfunding platform
GiveSendGo and says, I lost my job, I'm
trying to crowdfund twenty thousand dollars. We actually talked
about GiveSendGo on our first episode of this season,
and how it's kind of become the platform for people who
have done something bad in public to then get a big
payday from it. Our earlier episode was about this woman

(07:59):
who called a Black child a slur on a playground
and raised almost a million dollars on GiveSendGo afterward.
If you missed that episode, we will put it in
the show notes. But that woman kind of became a
right-wing celebrity, and I think Connor from Jubilee is
also on his way there too. He already raised quite
a bit more than the fifteen thousand dollars he was

(08:19):
initially asking for. And so, you know, there's a lot
to be said about Jubilee Media and what they're doing,
but I just wanted to talk
about them briefly, because I don't think this is going
to be the last we see of Connor. Right? No,
it's definitely not going to be the last we see

(08:39):
of him, because you know how they say a watched
pot never boils? It's like a watched fascist never shuts
the fuck up, you know what I mean?

Speaker 2 (08:49):
And that's a T-shirt, please. It's like, you will not
shut up. And I'm just like, that's why I refuse
to click on certain things. If I see the clip,
I scroll past. I said, I don't want to contribute
to any of those metrics because I feel like it's
mostly you know, people who are not conservative that are
running up the numbers because we're so outraged, and so

(09:11):
they're using outrage as currency. And I'm just like, no,
I don't want to contribute to this guy's notoriety.

Speaker 1 (09:17):
I don't want to.

Speaker 2 (09:17):
contribute to, you know, it being pushed more and more
onto people's feeds. We have to stop giving them
a microphone. I'm so sick of it. Who else did that?
What was his name, Emmanuel Acho or whatever,
who had that show where he was talking to racists?
I was like, this is the dumbest idea ever. Why
not use that microphone to talk to somebody who actually

(09:39):
has brains in their head and is willing to have
smart conversations, not these really old, outdated conversations. If we
stop giving these folks microphones and turning up the volume
just so that we can get clicks.

Speaker 1 (09:51):
They will eventually shut up. And they won't
be able to give all of these other people out
in the world who are looking for, you know, verbiage
to use in their little racist conversations; they won't have
anything to say.

Speaker 2 (10:03):
They'll have nothing in their mouths to then
regurgitate onto the rest of us. Exactly.

Speaker 1 (10:09):
And my thing about people like Connor is that if
it wasn't for Jubilee Media giving him a platform, he'd
probably just be typing his opinions on a Reddit thread. Honestly,
he would not be our problem. Giving him a microphone,
giving him a platform, giving him a way to
make a little bit of coin for himself for these
abhorrent views, I think, really is the problem. And you
put it so well. I do think this is... We're

(10:32):
not in twenty sixteen anymore, right? There was a time
when people wanted to talk about, oh, defeat the
alt-right in the marketplace of ideas, just debate them,
debate them. No. This style of debate
content is a scam; it's not informing anybody. Yes. And
I guess ultimately I strongly feel like there is no

(10:52):
debating people. Like, how do you debate somebody who, with
their full chest, says, yes, I am a fascist? When
you don't have a shared reality, there is no way
to have a debate. And I think that we should
stop acting like this is anything other than a spectacle
and a right-wing star-maker machine.

Speaker 2 (11:09):
You can't call it a debate, because you're not
talking to somebody reasonable. You know, you have to have reason,
you have to have intelligent
thought, in order to have a debate with someone. You
can't just say, I'm a fascist, and cross your arms
and say, yeah, that's what it is.

Speaker 1 (11:27):
That's what it is.

Speaker 2 (11:28):
That's not someone that you can debate. It's like
talking to a brick wall, honestly. They don't.

Speaker 1 (11:37):
Have any because they will.

Speaker 2 (11:38):
What they will try and do is make you become
the intellectual; you will have to
intellectualize their fascism. There are no intellectual roots to racism
and fascism. There's nothing intellectual about it. But this whole
conversation that folks are having around all of this,
because we keep giving these people microphones, is intellectualizing something

Speaker 1 (12:02):
that is at its core not intellectual. Exactly. And the
way that you put that makes it so clear. At
the height of the Black Lives Matter movement,
there was a lot of that debate-style content, and
it's like, how do I debate somebody who doesn't feel
that I deserve rights? How do I debate somebody who
doesn't think I'm human? Like, what can I say

(12:23):
that would make them say, oh, you got me? We're not
having that. We're not speaking the same language, at a
certain point, right? And there's no conversation to be had,
period. Exactly. And I mean, I definitely have gotten offers,
not from Jubilee, but from similar platforms, that
are like, oh, come on, you can be
the sort of progressive voice or a feminist voice.

(12:43):
And I just don't think, in twenty twenty five, any
serious person ought to be participating in this. It
doesn't matter. I could go on
those platforms and make the most well-formed argument, but
I still know, regardless, it's gonna be, you know, "conservative
Republican takes down feminist," because that's really what they're after, right?

(13:05):
They're not after any kind of like actual moment of
clarity or empathy or understanding each other. It's just a
ridge debate with a foregone conclusion.

Speaker 2 (13:15):
Right. And even if you go in there and you
cut them up to shreds, they're gonna clip it up
so that it looks like you got tongue-tied, like
you were stumped. And no, you're not gonna make me
fall victim to the editing. This is not Love & 
Hip Hop, okay? This is not Love Island.

Speaker 1 (13:30):
You're not gonna have me looking like a fool just
because you wanted to clip me up and now I'm
looking crazy. Yes, exactly. Let's take a quick break.

(13:55):
And we're back. Okay, we have to talk about somebody
who I have a weird
obsession with, maybe obsession is the wrong word, and that
is Candace Owens. For people who listen to the show,
I have described her as my shadow self, because I
feel that she's almost like a funhouse-mirror version

(14:17):
of myself. I know that sounds weird, but we did
an episode about her rise. I see so much
of myself in her early days. We were both, you know,
people who liked to talk about politics in the early
days of the internet. Back in the day, when
we were both doing this at the same time, she
was actually a progressive voice, and she had a very
specific switch where she went on to some of her more

(14:41):
noxious, extremist views. She also is somebody who
I think is prone to hyperfixation, which I also am,
although her hyperfixations are very different from mine. Hers are
things like, Blake Lively is lying about everything, or Harvey
Weinstein is actually a victim, the victim of Me Too.
And one of her most recent hyperfixations has been

(15:04):
the First Lady of France, Brigitte Macron, the wife of
the French president, Emmanuel Macron. Owens spent months, months,
building out this increasingly elaborate, completely unfounded conspiracy theory that the
First Lady of France is trans. And, before
you ask, she has no proof of this. This is
(15:26):
just something she feels. She initially started
calling it a gut feeling, and it's so fucked up
because she basically has never had any proof,
and she's gone from being like, this is a gut feeling,
to, I have the proof of this, I have the receipts.
And so I have quietly been wondering where this all

(15:47):
ends for her for a while, and we may have
our answer, because this week the French president and his
wife filed a defamation lawsuit in the US against Owens,
centered on her claim that the First Lady is trans.
They filed this complaint in the Delaware Superior Court,
saying that Owens has waged a lie-filled campaign of
global humiliation to promote her podcast and expand her frenzied

(16:09):
fan base. As somebody who keeps tabs on what she's
up to, I wholeheartedly agree with how they're characterizing this.
Because honestly, I won't call it misreporting,
but I've seen it reported as if Candace Owens
just said this at a party or something, that she
just casually said this once or twice, and that's actually

(16:30):
not what's going on in reality. She has made multiple
podcasts and videos claiming to have found smoking guns and
claiming to have all this evidence. Like, she's got
a string board and she's connecting the dots on this
claim. She's made very concrete claims, and she's
gotten millions of viewers, which translates to cash for her,

(16:51):
for this lie. So it's not like she just said
this, and they're coming after her for saying it
casually once or twice, as I feel like some of
the reporting suggests. She has really monetized this unfounded lie
and built it into a whole conspiracy theory with her millions
of viewers. Yeah.

Speaker 2 (17:09):
I mean, that's the bread and butter for folks like
Candace Owens, because what their base fails to grasp
onto is fact, and so then they have to think
very, very dark meta, you know what I mean?
Where it's like, no, no, no, y'all are so blinded

(17:31):
by facts that you're missing all of this mystery going
on over here. And it's like, that's how she
makes her money. She just makes something up
and then runs with it, and it's like, oh, but
look at this, oh, why did she cross her legs
like this? It's ridiculous, it's honestly ridiculous.
And anytime somebody is doing something like this, I'm always

(17:53):
just like, this is Olympic-level jumping. You're
going from one thing to another, you're doing an absolute triple jump.
You're a gold medalist in this jump that you're doing, trying
to get to whatever conclusion. And the other part of
it, which is my issue with everyone that

(18:14):
has an issue with transgender folks: why do we care?
I'm just like, even if she is, why do
you care?

Speaker 1 (18:25):
Whose business is it?

Speaker 3 (18:27):
Who cares?

Speaker 1 (18:28):
I'm not worried about what Matt Cralin is doing with
his partner, his wife. I don't care what they do
in their bedroom. I care about his policy. How is
that going to affect his people, the folks in the
United States, the world? I don't even know what
his wife looks like.

Speaker 2 (18:43):
The fact that she's putting so much energy into this,
I'm like, girlfriend, imagine if you put
this same level of intensity into anything else. Yes, you
would be able to do anything, you could take over
this world, but you choose to hyperfocus

Speaker 1 (19:01):
On these really ridiculous things.

Speaker 2 (19:02):
I'm like, use your powers for good, not evil, and
don't get yourself in prison, or sued to the point
where you can't even afford a microphone, because you want
to do this. It's crazy.

Speaker 1 (19:15):
You put it so perfectly. The way that she has
picked apart every little iteration of their appearance, their marriage,
their friends, their family, their personal history, the way that
Brigitte's hair falls, like, oh, that tells tales. I'm like,
she is dead serious. And then sometimes it'll be like,
the absence of evidence is
then sometimes it'll be like the absence of evidence is

(19:36):
what the evidence is. It's like, well, if she wasn't
actually trans, don't you think... It is such a rabbit
hole. And keep in mind that Brigitte and Emmanuel Macron
have been in a romantic relationship for a very long
time. Their relationship is its own weird thing. They met when

(20:01):
Brigitte was a teacher at his school, when he was
fifteen and she was in her thirties. They met
in drama club at a school where he was
a student and she was a teacher. And now that
I know that... I mean, exactly. So
they're actually focused on the wrong thing, a thousand percent.
There actually is stuff where it's like, well, we could

(20:22):
talk about how weird their marriage is. That's right there.
Legitimately, that's what we should be talking about. They
focus on the wrong thing; talk about the
real stuff. I did not know that. Side-eye
at the whole thing. If you wanted to make podcast
episodes about them, there's a lot to say; that is
fertile ground for discussion. So you don't actually
need to make up these completely unfounded claims. Yeah. So

(20:45):
Owens responded to the lawsuit. She says, this lawsuit is
littered with factual inaccuracies and part of an obvious
and desperate public relations strategy to smear my character. And
my question is this: why would the president of France
want to smear this random American podcaster, other than the
fact that she keeps telling these outlandish lies about her?

(21:06):
Like, why? He's busy doing French president stuff.
You think he would even want to target you,
a random right-wing podcaster?

Speaker 2 (21:15):
Absolutely not, right? It's really ridiculous. And I also
saw that she said something like, oh, you're just suing
me so you can have it on record that you
sued me.

Speaker 1 (21:23):
But I'll see you in court. Girl, this ain't what
you want. No court is gonna say, oh,
you have to prove that you are biologically female.

Speaker 2 (21:38):
They're not gonna do that. They're gonna say that you
have no reason to believe this, it is baseless,
and so you're in trouble. And sister, those dollars ain't
set up the right way, especially now that you haven't
always been aligning yourself with everything that Trump's doing. So
I feel like you've lost a little bit of your base.

(21:59):
They're not gonna save you, sister. They're not gonna save you
in this instance.

Speaker 1 (22:03):
You're just gonna be broke. Okay. Yeah, this was a mistake.
And it is something that we've talked about
on the show, but I think it's something
that we've seen more and more of, where someone, often
an extremist, will go on a smear campaign, and
the only recourse is suing them for defamation. And we've
seen it work. So it's a business model now. Yeah,

(22:24):
it is. It kind of is. And it sounds, from reading the lawsuit,
like they're saying, we gave her ample opportunity to just stop this,
and she wouldn't. And so I do think
they would not have pursued this if they did not
feel like they had to. And I think the fact
that she has shown no signs of planning on reversing

(22:45):
course on this, and just shutting up and saving herself
the trouble, makes me think that's true:
they had to do this.

Speaker 2 (22:52):
Listen, if I had that level of intensity about anything...
I just want to bottle what she has and

Speaker 1 (22:59):
use it in the right way. I want your energy, girl.
My god.

Speaker 2 (23:03):
I was gonna say, get you somebody who is so
fixated on you like that. But you don't want
that type of fixation. That's too much; that's stalking at that point.

Speaker 1 (23:15):
Okay. So should we talk about Trump's "woke AI" executive orders? Huh.
I mean, it would be wild to do a podcast
episode and not, so let's just get into it. So
Trump signed three AI-focused executive orders on Wednesday, aimed
at keeping "woke" AI models, that's his word, out of

(23:37):
Washington, and at this idea of turning America into an
AI export powerhouse. So there are three executive orders.
I will quickly summarize two of them, and I want
to focus on one of them kind of exclusively, but
I'll give you a sense of what's going on
with everything. Perfect. One is called Promoting the Export
of the American AI Technology Stack, which aims to support

(24:00):
the export of American-developed AI systems. Its main mechanism
seems to be financing support for American AI companies and
licensing their systems to allied nations. It honestly
is pretty light on specifics, which, surprise, surprise, an
EO light on specifics from this administration,
who could have thought? But the stated goal, anyway,

(24:23):
is to establish the US as the global leader in AI.
I will say, to me,
it really does seem like a good opportunity for
American tax dollars to subsidize big tech companies even further,
particularly those companies that the Trump administration favors. Mm-hmm.
I'm no expert in this, but reading what they put out,
all of this just seems like a big gift to

(24:45):
Trump's tech cronies. So, something to know. Mm-hmm. The
next EO is called Accelerating Federal Permitting of Data Center Infrastructure,
which exclusively focuses on building new large data center projects,
those requiring at least one hundred megawatts of power.
Now, I am not a scientist or an engineer, so
you can let me know if I have this right.

(25:08):
I did a little bit of research, and it does
seem like one hundred megawatts is what you might think
of as the larger end of what is
normal for a data center. Is that your understanding of
it? I think so, yeah. Okay. And this executive order calls
on agencies to remove environmental regulatory hurdles, so that
negative impacts on the environment or water resources are not

(25:31):
a barrier to what everybody wants, which is the construction
of new large data centers. Which, okay, I mean, there
are literally people who do not have water in their
households because of data centers. There are communities, like the
neighborhood in Memphis, Tennessee, where Elon Musk built a massive
data center, that no longer have clean air. This executive order basically is

(25:52):
like, fuck them, fine, we need this technology. Yeah,
And it's a.

Speaker 2 (25:58):
In my opinion, it's a ruse, because it's
not really about the technology. It's about the people who
are going to be making money from building these data
centers and expanding their empires. And
that's what I also want people to understand about
AI and these technologies that are coming down the pike.
It's like the reason why the government is so hyper

(26:20):
focused on it. Anytime the government is doing all of this,
or you have a president that is just so
laser-focused on something, you have to start really thinking:
where is the money going? That should always be the
first question. Because if you were thinking about the greater good
of America and how we make ourselves more technologically

(26:43):
advanced and all these things like that, you would invest
in education first and foremost. You would make sure that
everybody coming up in the US education system understands
how to code, understands how a semiconductor chip works,
and knows how these models work.

Speaker 1 (27:04):
We're not doing that. We're not investing in education. They're
like, build the things. And that's because when you build
the things and you start to make product, you can
sell it all over the world.

Speaker 2 (27:12):
That means people are making money. And so I'm just
like, all of these executive orders, it's
just a formal way of lining their own pockets. Because Trump,
he's thinking, after this presidency is over, I need to
make sure that I'm setting things up so I make
the most amount of money possible and make my rich
friends happy, so that, just in case I need them,
you know, they can slip me a little,

(27:34):
a little something-something. And so that's always what I
think about it. I'm like, it doesn't make sense for
you to say we want to be leaders in technology
but not invest in education. Who's gonna do it?

Speaker 1 (27:48):
Who's gonna do it? Who's gonna advance us if we
are not making sure that college is affordable, that K-through-twelve
schools have the resources that they need,
that teachers are being paid a living wage,
so that they can go into these schools and be able to
arm these children with knowledge? Training teachers... and meanwhile,

(28:10):
they're defunding academic institutions like Harvard and all these things
like that. Those two things don't go together.
You can't say, we need to be the leaders in
this technology, but then also, Department of Education, you're going
to the dumps. And I think we're doing exactly the
opposite of what we should be doing. It's like we've...

(28:30):
It's like this administration has clocked that an educated, critical,
upwardly mobile citizenry is not going to be
conducive to the way that they're trying to do business.
All of the way
that they're moving tells me that that's what they've deduced.

Speaker 2 (28:46):
Yes, absolutely, that's exactly it. I feel like you hit
the nail right on the head.

Speaker 1 (28:53):
More after a quick break, let's get right back into it.
So let's talk about the woke AI bit of this.

(29:16):
Woke AI? That's not me. I mean, I don't know, I don't
understand what that is. Same. And so here's the thing.
When I was seeing "woke AI" everywhere in all the headlines,
I was like, well, certainly they're extrapolating,
or they're using that as a stand-in. No, that
is actually the name of the executive order. The executive
order is called Preventing Woke AI in the Federal Government.

(29:37):
When I tell you that I would pay good money
to have Trump sit down and explain to me,
what do you mean when you say "woke AI"?
Give me some specifics. And he probably couldn't.

Speaker 2 (29:53):
No, he probably couldn't, because I think it's just
anything, any AI that would say anything negative about him
and his administration.

Speaker 1 (30:02):
Basically that's basically it.

Speaker 2 (30:04):
Yeah. And so now we're talking about another hit to
freedom of speech and things like that. And when people
are thinking about AI, they're like, oh, well, AI is
not a human. But AI is not built from nothing.
The way that AI works is that
it crawls the entire Internet and grabs everything that's out there,

(30:28):
every single corner, every single piece of the Internet. That
is what it's working from. So if he's saying, I
don't want AI to have anything in it
that says these specific things, that's basically putting a blackout
on parts of the Internet. When it comes to AI,
people think of AI as this robot that's

(30:52):
just super fast at learning, and I'm like, to learn something,
things have to exist. And so, yes, there are
issues with, you know, creativity and using people's image and likeness.

Speaker 1 (31:05):
Yes, I get all of that. But when we're talking
about censoring AI, he's censoring us.

Speaker 2 (31:14):
He's censoring the people, and that's what people need to understand.
It's like, oh yeah, sure, woke AI, no woke AI. Those
are words, those are images, that are there, that the
AI is learning from, that will not be used. So
it's the same thing as burning down a library, or
a whole section of a library. It's the same thing

(31:35):
as taking away access to certain parts of the Internet,
like they

Speaker 1 (31:40):
Do in China. You know, they don't allow them to
have things like TikTok, and they like the way that
their internet works is not the way our internet works.
And that's essentially what he's trying to do.

Speaker 2 (31:51):
And we have to really keep our eyes open, because
you can come up with a lot of fancy
White House ways, legal quote-unquote, to censor
whole populations of people.

Speaker 1 (32:07):
I mean, I had never even thought about it that way,
but that is a drum that we beat
a lot on the show: that AI is people, right?
It's made by people, trained by people. It's not, you know,
a computer brain or something.
And so what you're saying is that this sort of
Preventing Woke AI executive order, that's like, no, AI must

(32:27):
not have any kind of ideological agenda, that's just
another way of censoring
the Internet and ultimately censoring our expression.

Speaker 2 (32:38):
Exactly. Because if I'm using the Internet
and I'm creating blog posts where I'm talking about my
experience as a Black woman in STEM, my experience as
a Black woman in America, and I'm saying, I feel
like this certain legislation is not right, or I
feel like I'm being disenfranchised, and all these things like that,

(33:00):
the AI models will not be able to pull from
my blog in order to answer someone's question that might
be, in twenty twenty five, what was it like
to live in America as a Black woman? I'm silenced.
It's quiet, everybody on mute. And that's dangerous, because you
essentially can rewrite history. Imagine if we could go into

(33:23):
every single library, take books and just rip out whole
sections and say, Nope, that didn't happen.

Speaker 1 (33:30):
That is what is happening. That's what the technology is.

Speaker 2 (33:32):
The Internet is something that is boundless, but
there is absolutely the ability to censor, and we have
to really be careful, because these are the stories. These
are the stories that tell what history was: our experiences.
People might, you know, pooh-pooh doing a blog or,

(33:53):
you know, vlogging your life, but these are archives that people
one hundred years from now will watch. Like when we
see those old black-and-white videos that
they restore, and we're like, wow, look at those

Speaker 1 (34:04):
People walking around. Oh my goodness, that's so cool.

Speaker 2 (34:07):
Are you trying to tell me that our whole
experience as Black women won't be there? Your
podcast won't be there, my podcast won't be there,
images of us at marches, images of our families
having community won't be there, because they
might say, oh, that's woke. Black family reunions are

(34:28):
too woke for the internet. We really have to be careful.
We really have to sit up and take notice
of these ways that we're being censored,
and that they're trying to rewrite history and silence us.

Speaker 1 (34:42):
Because if they silence us in this

Speaker 2 (34:43):
technology, we're in a really dangerous spot for the history
being told. And as we know, history always repeats itself,
especially if that narrative is not there. I mean,
look at how long America went
without understanding the history

(35:04):
and the plight of Native Americans, the Indigenous people
of America.

Speaker 1 (35:10):
They should want it back in blood.

Speaker 2 (35:13):
Honestly, it's pretty sick that we have, you know, sports
teams that are like.

Speaker 1 (35:18):
The Chiefs, the Indians. Like, they should want it back
in blood. And that's what happens, that's what will happen
to whole populations.

Speaker 2 (35:27):
And some people who are part of Trump's base might
be like, oh, well, I don't care, because that isn't me.

Speaker 1 (35:35):
It's gonna be you too.

Speaker 3 (35:36):
Yeah.

Speaker 2 (35:36):
You live in Appalachia, and you think that
they want to hear about your struggles?

Speaker 1 (35:41):
No, they don't want to hear about your struggles either.
They don't want to hear that you can't afford to
get groceries for your family this week, that
you need food stamps. They don't want to hear that either.
That's woke. We're all in danger. We are in danger,
and we really need to take notice. I mean, I
completely agree, you put that so well. And I do

(36:01):
think, when you look at the words in this executive
order through the lens that you just
laid out, it's clear that's exactly what they're doing.
They're right there, carving out any part of the experience
of being a human that they don't align with and
don't like. And what I find so fucked up is

(36:22):
that it relies on this idea that there is sort
of one objective truth. And who gets to decide that?
The Trump administration. So, exactly: even
if you're a white person in Appalachia who can't afford groceries,
the truth that the Trump administration wants to give is
that groceries have never been cheaper. So
get that right out of there. And yeah, I

(36:44):
mean, in this executive order, they
basically spell out that they get to decide what is
and is not objective, what is and is not truth.
They've come up with this term that they basically invented
out of whole cloth, "unbiased
AI principles," and essentially that means that they are only

(37:04):
supporting AI models that meet this definition of "unbiased" that
they have just decided themselves.

Speaker 2 (37:13):
Yeah. When you talk about anything that is man-made
and say that it is unbiased, it's just not possible.
There was a study done a while ago, I
don't remember what year it was, where they had
an AI model crawl the entire Internet and basically create
a personality based on what it found on the Internet.

(37:34):
And that model was racist, it was sexist, and it
was xenophobic. And that's just based off of what's
on the Internet. So it started out with nothing,
and they said, build from here, just crawl
the internet, read as much as you can. And it
crawled the entire Internet, and that's what the result was.

(37:56):
Everything that is man-made is biased. Everything. The
term "biased," I think people always think of it in
one direction. If I were to make something, it's gonna
be biased. You know, if I make
a product that I use for my hair, well,
my hair is a very specific type. Okay, I have afro

(38:18):
hair that I love, and that product is going to
work in my hair. Now, you go next door to the
white folks, and they might not enjoy that product, because I
made it and it was biased. Absolutely. And so this
whole idea of something being unbiased, it is just, again,

(38:40):
a ruse. It's just not possible; anything made by
man is biased. Now, you can go to great
lengths to make sure that lots of biases are represented
in the making of a product. So again,
if I'm using that same example, let's say I want
to make a hair product, I can say, okay, I'm
going to go talk to one of my

(39:03):
best friends, who's from the Dominican Republic. I'm going to
talk to another one of my best friends, who's Black
but has a really loose curl pattern.

Speaker 1 (39:09):
I'm going to talk to one of my friends who
is white.

Speaker 2 (39:14):
I'm going to talk to one of my friends who
has alopecia and figure out, like, what are you looking
for in products? That is a way to take a
lot of biases into.

Speaker 1 (39:25):
Account when you're creating something.

Speaker 2 (39:26):
But to say something has no bias, it almost does
not make sense to me at all.

Speaker 3 (39:35):
More after a quick break, let's get right back into it.

Speaker 1 (39:50):
And I think one of the most frustrating things about
this EO is that it specifically names that it
does not want any kind of quote DEI
in AI. Which, already, I mean, I hate
when Trump makes me think through the logic
of the things they put out. I'm like, what do
they mean by that, exactly? But in the real world,

(40:12):
having more voices at the table, having a diversity of
folks educated on AI, working in AI, building models, training models,
all of that can only help us. One of
the tools in our toolbox to combat bias in AI,
the harmful AI that doesn't recognize us as the
multifaceted people that we are, is having a

(40:34):
plurality of people and perspectives at the table. And this
executive order says, they say, DEI in AI can
lead to discriminatory outcomes, distort and manipulate AI model outputs
in regard to race and sex, and incorporate concepts like critical
race theory, transgenderism, unconscious bias, intersectionality, and systemic racism. DEI

(40:54):
displaces the commitment to truth in favor of preferred outcomes, and,
as recent history illustrates, it poses an existential threat to
reliable AI. And that's what I think kills me:
we know AI can be biased, can be sexist,
can be racist, all of this stuff. We know
that because it's trained on us, and those

(41:17):
are all pitfalls that humans unfortunately struggle with. And so
this idea that, oh, the way that we make AI
good is making sure that there's not even a whiff
of inclusion around AI... I shouldn't
even try to find any kind of logic in the
things that they put out, but it
just doesn't make any sense. And I think when

(41:39):
we're talking about something as critical as AI, when we're
all having this big conversation about how AI is going
to change all of our lives, right? These are
not serious people. There are serious
conversations to be had, but in lieu of that,
we're putting out nonsense that doesn't even make any fucking

(41:59):
sense. Exactly.

Speaker 2 (42:01):
And that just goes back to what
I was saying earlier, where they will
try and make you intellectualize things that are at their
core not intellectual at all.

Speaker 1 (42:12):
It's just racist.

Speaker 2 (42:13):
Like when you think of the history of racism, there's
this really brilliant researcher, her name is Angela Saini, who
does a lot of research on race and the history
of racism, and I learned so much from her books.

Speaker 1 (42:29):
But racism is a made-up thing.

Speaker 2 (42:34):
We all know that. And racism was a "science"
at one point, like the research that went into trying
to prove that there was a superior race. But
all of that research was racist and not true,
not rooted in anything that was true. And so I
just feel like, the more we don't call this what

(42:59):
it is when we see it, the more it makes it an
intellectual conversation, and it's truly not. And so then, when
people like you and I start to try and talk
about it, it makes it really difficult. That's
why it's hard to have conversations with people like that,
because you just

Speaker 1 (43:18):
don't even know what to say. Because it's just like...

Speaker 2 (43:21):
If someone says I believe in unicorns, and you're like,
have you ever seen one?

Speaker 1 (43:26):
No, but they're real. It's like, okay, but you've
seen horses, right? Yes, but unicorns are the best.
Where are they? Does it matter? You tell me
where they are!

Speaker 3 (43:40):
What?

Speaker 2 (43:42):
These types of conversations are just fruitless, and so
it's always just so frustrating. There's a quote from Toni
Morrison that Zakiya says all the time when it comes
to talking to racists, and it goes: somebody says
you have no language, and so you spend twenty years
proving that you do. Somebody says your head isn't
shaped properly, so you have scientists working on the fact

(44:02):
that it is. Someone says you have no art, so
you dredge that up. Somebody says you have no kingdoms,
and so you dredge that up. None of that is necessary.
There will always be one more thing.

Speaker 1 (44:15):
That's a quote that I find a lot
of wisdom in and keep returning to. And I guess
that's my main point: the distraction of it all.
And that's the thing. It's like, from listening to your
episode on AI: AI is coming, but
there are lots of reasons to be critical of
AI and skeptical of AI, and to absolutely talk about the

(44:36):
ways that it's flawed and biased. I am right there,
but when the dominant conversation that the
president puts out is something that is such a distraction,
such a culture war, just
stoking the flames of division, around

(44:57):
something that is so important... I mean,
I should be used to it from
this administration by now, but I just think, what a miss.
And it takes me back to when we did an episode
with Dr. Joy Buolamwini, who was one of the people
who worked on the Biden administration's executive order on AI.
Right. That EO wasn't perfect, but it definitely was like,

(45:18):
let's put a few guardrails on this, let's get a
sense of how this should be developed, let's make
sure that the voices who are talking about this and
thinking about this are a little more diverse. The fact
that we went from that, certainly not perfect,
but someone who had some goddamn sense and was talking
fucking sense on the issue, to this, in just a

(45:40):
few months. It really makes my head

Speaker 2 (45:42):
spin, honestly. And I think that that's part of the tactic:
to make us so tired that we will just
basically lie down and be like, enough,
just let whatever happens happen.

Speaker 1 (45:56):
And so, I think... We recently interviewed Chelsea

Speaker 2 (46:00):
Clinton on the show, and that was one of her
parting messages that she left with us: that
we should not stop, because they're trying
to disorient us, they're trying to fatigue us, they're trying
to make us be quiet, and we should keep calling
things out. We should keep researching and asking questions,
asking hard questions, and pushing our local lawmakers and our

(46:24):
state lawmakers and the president and his cabinet, and making
sure that we don't let these diversion tactics,
these tactics that they're using to make us be quiet, work.
Because in the first thirty days of this administration,
everything was just coming so fast and furious, and I

(46:45):
was just like, oh my gosh. In
the morning, I would wake up and
be like, maybe I should just not look at my
phone for an hour, have a cup of coffee,
pretend like all is right in the world.

Speaker 1 (46:58):
But now I'm like, no, I need.

Speaker 2 (47:01):
I might give myself a little bit of time,
but I'm getting right back into it, because being on
shows like this, talking about these things and putting more
out there, is so, so important. And that's why I
absolutely love the work that you do, Bridget, because you
shine a light on so many
areas. I feel like people might not have

(47:21):
the words, but they come to you to find the words,
and you help them think through some of.

Speaker 1 (47:27):
A lot of these things that they might be.

Speaker 2 (47:30):
like too tired to do the research on their own,
or feel like they aren't smart enough
to be able to get to the facts on their own.
You create these safe spaces for people to find facts
and to you know, laugh a little bit, but also
be like, Okay, now I know what I gotta do,
because you come to every single episode with this certain

(47:52):
energy that makes me be like, turn on my microphone,
I'm ready to go. You're so inspirational to so many
other podcasters out there. I know that's
not what we're talking about right now, but I just
want to say it, because it came to my mind, how
important you are to this space.

Speaker 1 (48:09):
You are not nobody. You are somebody to a lot
of people. You are somebody to me. And I
hope you never, ever, ever stop. I've been
following you for so long, and what you do
is amazing. I can only aspire to it. I'm gonna cry.

(48:29):
You're so important. You're so, so important to the culture.
I feel the same way about what you do at Dope Labs,
and I just think it's so important. I mean,
how many people that listen to Dope Labs tell you, oh,
it's so nice to have a show about science that
makes me feel included and seen and welcomed into
the conversation? What you're doing, you

(48:51):
do that so well. That's what I want to
do here on the podcast, and that's absolutely what you're doing.
You have no idea.

Speaker 2 (48:58):
I mean... You are perfect, like, honestly,
oh my gosh.

Speaker 1 (49:06):
And I mean, I think it just goes to show
that, for so long, when it comes to STEM,
whether you're doing a show about science or tech,
it's just very easy for people to feel unseen and
left out. And when you feel unseen and left out,
you're checked out. And so when the president goes on
TV and is talking about something AI-related, there

(49:28):
is nobody who can hear the sound of
my voice right now, nobody who's listening to this
podcast, who is not smarter than Trump,
full stop, end of sentence, but also specifically when it
comes to tech and AI. If you are listening to
this podcast, you know more than him. End of sentence,
full stop, I guarantee you, absolutely. It's just this very...

(49:49):
I just hate this culture that can be so alienating
and can prompt people to do exactly what you
just talked about: just check out and be like, okay,
well, they're gonna do what they're gonna do. I'm not
gonna try to arm myself with information; what
difference does it make? Right? You're going to pass
executive orders, you're gonna sneak whatever into the privacy policy,

(50:12):
I'm gonna use it anyway. And I really want
folks to feel a culture shift around that. Like, no,
you can arm yourself with information. You don't have to
be afraid, you don't have to be
trepidatious about being part of
the conversation, because science and technology really
does impact all of

Speaker 2 (50:28):
our lives. And it's so true. That's
one of the things that Zakiya and I always
try and do with the show:
create a space where people don't feel like we're preaching
to them. We are always learning with
the people who are listening to our show. And it's
so important for folks to know the scientific process,
and we go through the scientific process in every single episode.

(50:51):
You do it here on your podcast as well. The
amount of research that goes into it, that is the
scientific method. You get to a conclusion, and that
is not you just pulling anything, you know,

Speaker 1 (51:04):
Out of your head.

Speaker 2 (51:04):
You do a lot of research to get to the
conclusions that you do; that is scientific. You are a scientist,
a cultural scientist. And people need to start feeling
empowered to do that, to take the extra
step, to do the research, ask an additional question,
really not just read the headline but read the full article,

(51:25):
see what the sources are, see what that study was. Oh, they

Speaker 1 (51:28):
only talked to ten people? That

Speaker 2 (51:30):
might not be enough to say, oh, ninety percent of
people do this or do that; you probably need a
larger sample size. Those are the things that you should
be doing, and that's what we're trying to teach people.
We don't want anybody to feel like, oh, I'm not
smart enough, or I can't do this or that.

Speaker 1 (51:45):
We want to.

Speaker 2 (51:45):
empower folks to ask those questions and know that there
is no such thing as a silly question. And you know,
in some instances, like when we were talking about the
vaccine back in twenty twenty, we really sounded the
alarm on why people are vaccine hesitant, why people of
color are vaccine hesitant. And it's not because they're stupid.

Speaker 1 (52:07):
It's because it's rooted in real.

Speaker 2 (52:10):
history, like medical history, where Black folks were experimented on
and sterilized and all these things like that. So the
distrust of the medical community is not baseless, and there's
a certain level of care you
need to come into those conversations with, because people
of color are not stupid; they're operating

(52:31):
within the framework of our history in this country, and
so when we talk about the vaccine, we say, listen,
I get it, but these are the things that we
know about vaccines. We break the vaccine down, we talk
about the different elements of it and why
they should feel like they can trust it. You know,
the creator of one of the vaccines is a Black woman,
and we talked to her. And so these are the

(52:53):
things that we just try and arm people with:
the tools necessary in order to get to what
the facts are. And then you can decide what you
want to do; what you do with the facts, that's
up to you. But I just want you to know
what the facts are. Because what I always
say is, truth is subjective. Your truth, my truth,
that person's truth, it's all your perspective. But fact, it's

(53:14):
the same no matter who's looking at it. Do you
feel that, in twenty twenty five? I don't know.

Speaker 1 (53:21):
I feel that we've gotten to a place where we
are so quick to just read the headline, or throw
it into Google and then read the AI summary. And
I just worry that we're losing the art of it.
Fundamentally, the reason why I do what I do is because, at
my core, I am a nosy bitch and I want

(53:43):
to find out what I want to find out. So
when I call you a scientist... Not a nosy bitch?
Listen, I think it's one and the same. Because
when you read something and
you're like, huh, that's interesting, let me read more, let
me look into that, let me read the study.
Oh, the PI on the study? Let me google them.

(54:04):
I'm just a nosy bitch. Yeah, I just want to know.
And I wonder, are we losing the art of
just wanting to know? I worry that in twenty
twenty five, it's so trendy to say media literacy
is dead. But you truly can
go on the internet and say anything and people will

(54:26):
not check. People will not check. You could say anything
and people won't say anything.

Speaker 2 (54:29):
Listen, I'm not gonna throw nobody under the bus, but
let me tell you, there's a lot of things happening
on WhatsApp, yes, with some of these older people.

Speaker 1 (54:38):
That are parent age.

Speaker 2 (54:42):
And I'm just like, hey, where did you
find this information? Like, did you try and
do some research about it? Like, they're telling you never
drink ice water.

Speaker 1 (54:52):
Why? I don't think that's how... They're like, oh,
it'll freeze your gut.

Speaker 2 (54:57):
I don't think that's how ice water works, truly.
It's just crazy. But I mean, yes, I do
agree with you.

Speaker 1 (55:05):
I think that.

Speaker 2 (55:06):
I think that with the way the Internet is now,
because we're carrying it around with us constantly, it's
always right there, people are losing their ability to wonder.
Like if I say, how
tall do you think LeBron James is? Somebody's just gonna go, Siri,

(55:30):
how tall is LeBron James? Like, nobody's gonna say, hm, well,
he was standing next to... did you hear that?

Speaker 1 (55:39):
Six feet nine inches? Siri, please, I'm recording an episode
with the Bridget Todd. Thank you. Oh my god. I
didn't even mean to be talking to her. I didn't
even press the button. I guess she heard her name.

Speaker 2 (55:50):
But like, you won't say, oh, I saw him standing
next to this thing, and that's probably about this height.
Nobody wonders anymore, and I think that that is part
of the problem, and we just
go straight to our phones. And then because we're not wondering,
we're not creating these pathways

(56:11):
in our brain that are reinforced. So LeBron James is
six foot nine. Siri just told me; I didn't do
any work to get there. So that piece of information,
I'm gonna forget it.

Speaker 3 (56:21):
Now.

Speaker 2 (56:21):
If I did a lot of research and took time
and tried to figure out how tall he is exactly
and did measurements, I won't forget, because that's just
how our brains are set up.

Speaker 1 (56:32):
You know.

Speaker 2 (56:32):
There are some people who are very, very intelligent who
only have to see things one time or hear
things one time and they'll remember forever. But the way
that the average brain works is that
information needs reinforcement in order
for it to stick. And part of that is the
wondering and thinking about something for a long time, and

(56:53):
that's what helps us to remember and to recall. So
without the ability to wonder and have curiosity and
try to get to the facts, without just doing the
Google or talking to Siri, yeah, it affects
your attention span, it affects your ability to remember things,
it affects your ability to connect dots

(57:14):
in other aspects of your life, and you know, everything
becomes very, very difficult. I mean, yeah, I'm guilty
of that myself. Like I'll be like, okay, I'm gonna
go to the grocery store. It's like that Sesame Street
thing: a stick of butter, milk, some bread,
and then I get there. I'm not like that little
black girl.

Speaker 1 (57:33):
I'm like, you don't get the little flash on top
of her head that's like the butter, the milk, the bread. No,
I'm like, did I need... I think I need
chicken nuggets. That's what I came out here for.

Speaker 2 (57:46):
And then I get back home and I'm like, I
did need butter and I did not get it. Like,
I'm guilty of it too, because I use my phone
as a crutch for a lot of things. But exercising
your brain and exercising those muscles is so important to
being able to understand what we're looking at and
not just reading the headlines, because when you exercise those muscles,

(58:07):
the headline will never be enough.

Speaker 1 (58:09):
It won't be, a thousand percent. And I mean,
when I hear about young people in college who are
really using ChatGPT to get them through college, on
the one hand, I get it, because it's like, you
have a million classes, you have a lot of coursework,
da da da da. But I think back to when
I was in college a million years ago, and it's like,
I definitely had nights where, if ChatGPT existed,

(58:31):
I would have totally used ChatGPT to
help me get through coursework. However, I also had nights
of discovery where I learned something that turned something on
in my brain or ignited a curiosity in me that
I still have today, years later. And you know, I
don't think that we should be thinking about all of
this as just tasks to be offloaded or automated, because

(58:56):
I mean, I cannot even
express how many times I would encounter something and it
unlocked a lifelong passion or interest or curiosity. And in
our hyper-connected world, where it's
so easy to offload things to technology, we do have
to make room for our brain getting the tingles

(59:18):
because we read something cool and we want to learn
more about it. That stuff happens by happenstance, and
I worry that people are robbing themselves of
the experience of that happenstance, of encountering something that turns
something on for you or unlocks something for you.

Speaker 2 (59:35):
Yeah, I mean, I totally agree. And I think
that folks can do that in ChatGPT or Claude or
Gemini or Copilot, whatever AI model
you want to use, because the output.

Speaker 1 (59:50):
Is only as good as the input.

Speaker 2 (59:51):
If you're curious and you can ask the
right questions and follow-up questions, and you're learning
as the AI agent or whoever is giving
you this information, you can still discover. Like, ChatGPT
has deep research, and sometimes I'll put something in there, it'll

(01:00:12):
spit out like a twenty-page thing. Now, this is
not something I'm copy-pasting. That is something that I
use to research. Like, I'm reading it and I'm like,
oh my gosh, and I'm highlighting things. I'm like, oh,
what's this source? I'll click on it, it'll take me
to whatever that source is. I'll read that and I'll say, huh,
well, based on that, do you think that? And
what about if you were to blah blah blah blah blah?
Like, you can still get there with AI.

Speaker 1 (01:00:35):
It's like AI is... It's like going to the library,
but instead of having to get the little card. Do
they still do that with the little cards and everything?
I hope so. I doubt it. Oh.

Speaker 2 (01:00:46):
Man. Well, it's like going to the library, and instead
of having to check out fifty books, AI can tell
you what the books are. It can help you
get to the exact page that you would need that
information from, and then you can build from there.
Like, sometimes people are like, oh, well,

(01:01:07):
I put in ChatGPT, tell me, I'm gonna be
on a show with Bridget Todd, what questions
do you think she's gonna ask me? And it gave me
all the wrong answers. I'm like, well, you could have also.

Speaker 1 (01:01:19):
Said, Bridget Todd is a cultural scientist, AKA she loves some drama,
loves some tea, and just wants to know. And you
know, she is talking about current events. She is
very funny. I would also like to be funny on
this show. Can you get me up to speed

(01:01:42):
with what's going on in current events that she might
be talking about? And it will give you all of
these different responses and help you understand things more deeply.
Honestly, just like I was saying with the calculator,
a calculator is only as good as you are.

Speaker 2 (01:01:59):
Like, somebody can give you the most advanced calculator, the
most advanced software. They can give you the most advanced
coding software. If you don't know how to use it,
it's useless. Like, you could put stuff in there, you'd
be like, one plus one is two. I already knew that. Okay, well,
do you know how to do the area under a curve?
Can you do differential

(01:02:20):
equations? Like, you have to know how
to use these things in order to maximize your
potential when you're using it. And so that's what I
always say to people. I'm like, if you're interested in
using AI to advance your life, really sit down
and spend time going back and forth with it,
and try not to be too much of a

(01:02:41):
cynic, and see how it can help you, because a
lot of jobs now.

Speaker 1 (01:02:47):
So Morgan DeBaun, who's the owner of Blavity,
she had made a post where she said that
she was looking through resumes for folks to hire,
and she was like, it's a shame because a lot
of these folks are not saying that they are proficient
in ChatGPT, in using AI, and that is

(01:03:07):
mandatory to work at my company. And that's the path
we're on now.

Speaker 2 (01:03:12):
And so for folks to be like, I will never?
Like, know where we're headed before you say I will never,
because you really don't want to be left behind when
it comes to knowing how to use this technology. Like,
if you work in corporate America,
we all know an old person who is still
struggling with emails and turning their camera on when they're

(01:03:35):
on Zoom. And you're just like, just
learn this stuff. Don't be that person when you still
have the ability to learn how to use these things
at your job.

Speaker 1 (01:03:45):
And it's interesting that you say this, because one of
the things I was looking at to prepare for this
conversation was this study in Harvard Business Review about how
women are a bit slower to adopt AI in their
work than men, and they actually gave a lot
of interesting explanations for why that might be:
women caring more than men about some of the ethical
considerations of AI. Well, that makes sense. But also, one

(01:04:07):
of the things they pointed out was women feeling
like they're under more scrutiny for using AI and feeling
they might face more judgment than men if they do
use AI in their work. And I really felt like
that was so interesting, because yeah, on
the one hand, I totally agree, but in twenty twenty five,
if you want to make yourself seem like a competitive candidate,

(01:04:29):
you should be able to speak articulately about AI. Absolutely. Also,
I wonder, are women who are
self-reporting like, well, I don't want to use it
too much because I don't want people looking at me crazy? Right.
What do you say to somebody who feels that way?
I would say, don't box yourself out in that way,
because the men are using it, and they're using it proudly, honestly,

(01:04:54):
And if you are in an industry where you feel
like it will help you be more efficient, you should
use it, and you should use it proudly, because even
in job interviews, they'll give you a scenario,
and if you don't say, I would use AI for
this part of it, they're like, that's too slow,
you're gonna be way too slow.

Speaker 2 (01:05:12):
Like, oh, I would research? No, say, I would plug
that into ChatGPT, ChatGPT will tell me this, and
then from there I'll be able to pinpoint the
right people within the organization to blah blah blah. Like,
you don't want to box yourself out.

Speaker 1 (01:05:27):
Don't let these men, don't let
them be putting that they are
prompt engineers on their resumes, like, I know how to ask ChatGPT.
I have seen it. I can confirm it with my
own eyes.

Speaker 2 (01:05:43):
And I can't remember what book I read
it in, but it was like, when a man applies
for a job, if there are five qualifications and he
only has one, he's more likely to apply for
it than a woman who has three or four of
those qualifications. Like, she will deny herself if
she has only four of the five qualifications.

Speaker 1 (01:06:02):
When a man will apply when he just has one.
We need to move with the.

Speaker 2 (01:06:07):
Boldness of unqualified white men.

Speaker 1 (01:06:13):
Yes, and I've said this on the show before:
my first ever job in podcasting was back
in like twenty twelve, before Serial was a thing, before
anybody knew what a podcast was. I applied. I was
just somebody who listened to podcasts and needed to
work on a podcast, and I put on my
resume, or I think I said in the interview, that

(01:06:33):
I knew how to do Final Cut Pro. In reality,
I did not know how to do
Final Cut Pro. This was just a
whole fucking lie. And then I got the job, and
then I learned on YouTube. So have confidence. I am
an advocate for people lying themselves into the job that
they want. Why not? You know, like, you can
learn it.

Speaker 2 (01:06:54):
And that's the thing: in any job, you
can learn. You can learn any of the stuff if
you have enough of the qualifications, you're a fast
learner, and you feel like you could pick stuff up.

Speaker 1 (01:07:03):
Just do what you gotta do to get in there.
You learn when you get there. Okay, geez, I have
one last question.

Speaker 3 (01:07:08):
For you.

Speaker 1 (01:07:09):
Yeah, you know, in your episode about AI,
you really... I will put it in the show notes, because
people really should hear it. I am often asked... so,
I feel similarly to you, that people should
be testing out how AI might fit into their own
lives and their own work. People often say, what about
the environmental and ethical considerations, which are real? I

(01:07:31):
gave a talk recently about the use of AI in
podcasting and how it actually does show up in some
podcasts, even though people don't want to admit it. What
do you say to that? People are like, well, how
do you square that with all the ethical and environmental considerations?

Speaker 2 (01:07:43):
I always say that there are ethical and environmental considerations
for all things, and if you want to hyper-focus
on AI, then sure. But I don't know if that's
the hill you want to die on, because if folks
want to, like, if you can...

Speaker 1 (01:08:00):
So sometimes people talk about water.

Speaker 2 (01:08:02):
To make a serving of rice, the amount of water
that's consumed is seventy-two liters. One hamburger patty is
two thousand, four hundred liters. And it's not just to
cook it. It's like, from the cow, the slaughter, all
the way into your Happy Meal: twenty-four hundred liters
of water. And one ChatGPT query is point zero five liters. Like,

(01:08:27):
there are a lot of things. People have talked about
almonds and how much water they use, rice and how
much water it uses. It's like, I don't want to
discourage people from taking AI to task and saying, hey,
we need to make sure that as we're developing this
technology that we are considering the environment. We all got
to live here, and I don't want this earth to

(01:08:49):
burn up.
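
[Editor's note: a rough back-of-the-envelope check of the comparison above, taking the figures as quoted in this conversation and not independently verified (about 0.05 liters per ChatGPT query and about 2,400 liters per hamburger patty): 2,400 / 0.05 = 48,000, so one patty's water footprint would be on the order of 48,000 queries. Per-query water estimates vary widely across studies and data centers, so treat the ratio as illustrative.]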

Speaker 1 (01:08:49):
I want it to last for a long time. I
love you, Earth. But we also have.

Speaker 2 (01:08:55):
To think about things holistically, and the costs for a
lot of other things as well. Like, we also need to...
we haven't figured out the whole emissions thing, we
haven't figured out the fossil fuels thing. But it's like
people aren't talking about it anymore, and I'm just like,
everybody's hyper-focused on AI, which, you know, when

(01:09:19):
you think about some of the awful things that have
been done with some of these data centers, and people,
you know, losing access to clean air and things
like that, yes. But there are so many things where
we're just not there yet, and we've kind of lost
the plot a little bit and we've lost focus. And
I think that we have to just continue to beat

(01:09:39):
the drum about doing things in a sustainable way and
doing things that don't harm our environment, but also advancing
ourselves because AI might be able to get us to
a place where we could find alternatives to fossil fuels.
There are scientists that are actively working to make sure
that AI is more sustainable. But I'll tell you this

(01:10:01):
right now, the fossil fuel folks, I won't say
no names, I won't be specific.

Speaker 1 (01:10:06):
They're not trying to do that. No, they're not trying
to do that.

Speaker 2 (01:10:13):
But I know that the folks that are
researching and trying to get us to a better spot
when it comes to AI and its environmental costs, they're
absolutely taking it into consideration, and it's
gonna take some time and it's gonna take some innovation.
But yeah, I think that folks really have to
ask themselves other questions as well when they're thinking about

(01:10:36):
the impacts. I'm like, it's not like, oh, everything is bad,
so might as well just do it.

Speaker 1 (01:10:41):
It's like, yes, there are some aspects of AI
that are bad for the environment. Let's tackle that.
And let's also not forget about fossil fuels. And let's
also not forget about fast fashion. Like, let's also not
forget about the landfill. Let's also not forget about the

(01:11:03):
trash that's floating in the ocean. Let's also not forget
about vaping. Hello, let's also not forget about.

Speaker 2 (01:11:11):
Like, a lot of these things are, you know, taking our
attention away. I just want us to always
remember that there are so many things that we have
to tackle as

Speaker 1 (01:11:24):
A global community, and we can't lose sight of that.
Titi, beautiful closing words. Thank you so much for
being here. You're such a breath of fresh air. Oh
my goodness, I'm so happy to be here. Are you kidding?
I mean, this is like a mutual podcast love affair. Honestly,
I'm gassed. Where can folks listen to the podcast?

(01:11:47):
What should they know about the podcast? Where can folks
follow you? Give us all the deets.

Speaker 2 (01:11:50):
Yes, so you can listen to Dope Labs wherever you
get your podcasts, so you just look for the Dope
Labs podcast. We have a really cool black and yellow
logo, it's so good. You can find us on Instagram,
mainly at Dope Labs Podcast. You can find me at
d r underscore t sho, and if you want to
follow my co-host Zakiya, you can find her at

(01:12:12):
z said so. You can also go to dopelabspodcast
dot com, where we put a whole bunch of
fun stuff, and you can subscribe to our newsletter,
where we put a lot of little tidbits about ourselves
and what we're up to and the things that
we're into.

Speaker 1 (01:12:25):
We will hopefully be hearing Zakiya on the pod very
soon as well. Thank you so much for being here.
I mean, it's been a pleasure, and thank you for
the work that you're doing. Ever since
the first time I heard about the concept for Dope
Labs, I was like, this is it. We need
this in the world. I'm so glad that
you all made this beautiful podcast and that it exists

(01:12:45):
in the world. Thank you. Thank you so much
for being a shining example for us to follow. I mean,
it's hard out here in these podcast streets, but seeing
you do this, and do this so well for so long,
it's always so encouraging, and it's really an honor to
be here. Like, when the email came in, I was like,

(01:13:06):
this clearly is a phishing scam. I'm not clicking whatever link is
in this.

Speaker 2 (01:13:12):
I know it's phishing, because what is going on? I
cannot... oh my gosh. Well, thank you so much.
We'd love to have you come back.

Speaker 1 (01:13:20):
Anytime. I hope to be back.

Speaker 2 (01:13:22):
Please have me back whenever. I'm ready.

Speaker 1 (01:13:30):
Got a story about an interesting thing in tech, or
just want to say hi? You can email us at
hello at tangoti dot com. You can also find transcripts
for today's episode at tangoti dot com. There Are No
Girls on the Internet was created by me, Bridget Todd.
It's a production of iHeartRadio and Unbossed Creative. Jonathan Strickland
is our executive producer. Terry Harrison is our producer and
sound engineer. Michael Almato is our contributing producer. I'm your host,

(01:13:52):
Bridget Todd. If you want to help us grow, rate
and review us

Speaker 3 (01:13:55):
On Apple Podcasts.

Speaker 1 (01:13:57):
For more podcasts from iHeartRadio, check out the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.