Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
I'm TT and I'm Zakiya, and this is Dope Labs.
Welcome to Dope Labs, a weekly podcast that mixes hardcore
science with pop culture and a healthy dose of friendship.
We don't really show our faces a lot or at
(00:25):
all with the podcast. We don't do that because I know
a lot of podcasts now are like, oh, you can
come and watch us, Yeah, you can come and watch
us on YouTube or whatever, and we don't do that
because we don't feel like we need.
Speaker 2 (00:40):
To do that.
Speaker 3 (00:40):
But what I will say is, this is one of the
things that I did do in preparation for this lab:
I asked ChatGPT to make a photo of us,
and so I uploaded some reference photos and I just
sent you what it made, and I want your initial reactions.
Speaker 1 (01:03):
That's the first draft, right? Because this is giving AI,
you know it is, it is. And I'm confused, because
I see other people that are like, I generated this
using AI, and it doesn't look like it; it looks very realistic.
Speaker 2 (01:17):
Uh huh.
Speaker 1 (01:18):
There's this black woman, her name is Jessica E. Boyd,
so she's on Instagram at Jessica E Boyd, and she
does these incredible AI images. She does a great job,
and it looks like real afro-textured hair, you know.
But this one you created, no shade, but how many
fingers do I have?
Speaker 3 (01:37):
It looks like you've either got fourteen or six.
Speaker 2 (01:41):
I can't really tell.
Speaker 3 (01:42):
All the fingers are melting together, and I'm like, this
is fun.
Speaker 1 (01:47):
I see people that have been making themselves, like not barbies,
but like action figures in the plastic.
Speaker 2 (01:51):
I've been seeing that. I've been seeing that.
Speaker 1 (01:54):
I've seen people do themselves in animation.
Speaker 2 (01:56):
Yeah, the Studio Ghibli ones. Those are very cute.
Speaker 1 (01:59):
But I think like, no, don't get me wrong. I
love all the latest technology, and you know, I've been
out here trying everything okay, but sometimes I have to
stop and wonder, like, who's doing all this stuff? Whoever's
doing it, or wherever it's being done, it's just so far
away from us, and it makes me think, like, what
is the cost to generate these outputs?
Speaker 2 (02:19):
Honestly?
Speaker 3 (02:20):
And I think that's something that has been talked about
a lot on social media. Most recently, I've been seeing
people saying it takes sixty-seven trillion gallons of water
just to say hello to ChatGPT. And I'm like,
is that how that works?
Speaker 2 (02:36):
I don't know if that's how that works.
Speaker 1 (02:38):
I don't think that's how that works. But it's time
to pull back the curtain.
Speaker 2 (02:42):
Let's jump into the recitation.
Speaker 1 (02:46):
So today we're diving into AI. We're gonna talk a
little bit about automation and large language models like ChatGPT,
and their impacts, specifically their environmental impacts. So let's start
with what we know, TT.
Speaker 2 (03:02):
So we know what automation is.
Speaker 3 (03:04):
That's the use of technology to perform tasks without human intervention.
It's often rule-based, like if-then: if
I do this, then it'll do that, or do this
until some condition is met. And it's also repetitive, so it's able to keep
going over and over and over again without the need
for a human.
Speaker 1 (03:25):
But I think a step above that or beyond that
is artificial intelligence, and sometimes we see people conflating those two.
Artificial intelligence is when you have systems that are designed
to mimic human intelligence and so unlike automation, which is
just a rule and it just every time it encounters
this one thing, it does whatever you told it to,
artificial intelligence is capable of learning and problem solving, and
(03:49):
so it'll say, okay, if you put this input, you
tend to like this, so the next time I see
that input, I'll bring it up. Even if that's not
a rule, it's learned. It's able to expand its
knowledge based on the inputs you give it.
Speaker 3 (04:02):
Right, So like if you had a button factory and said,
every time this piece of plastic comes here, poke three
holes in it, and it does that, and it does
it over and over again, that's automation.
Speaker 2 (04:13):
Now if you had a machine that was able.
Speaker 3 (04:16):
To say, based on fashion trends, I'm going to make
this button out of wood instead of plastic, or I'm
gonna make this button bigger because people started liking bigger buttons.
Speaker 2 (04:24):
Now that would be AI.
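To make that distinction concrete, here is a rough illustrative sketch in code; the button-factory framing comes from the episode, but the functions and data below are invented for the example, not a real factory system.

```python
from collections import Counter

# Automation: a fixed rule. The same input always gets the same treatment.
def punch_holes(button):
    button["holes"] = 3  # always three holes, no matter what
    return button

# Adaptive, "AI-style" behavior: the decision changes based on observed data.
def choose_button_size(recent_orders):
    # recent_orders: sizes customers actually bought, e.g. ["large", "small", ...]
    counts = Counter(recent_orders)
    most_popular, _ = counts.most_common(1)[0]
    return most_popular

print(punch_holes({"material": "plastic"}))                      # {'material': 'plastic', 'holes': 3}
print(choose_button_size(["small", "large", "large", "large"]))  # 'large'
```

The first function never changes its behavior; the second picks a different answer as the data it has seen changes, which is the rough difference the hosts are describing.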
Speaker 1 (04:27):
Then you have large language models, and those are advanced AI
systems that are trained on vast amounts of data, particularly
text data, to understand and generate human-like language. So
that's ChatGPT, Gemini, all of that.
Speaker 3 (04:41):
Yeah, they're crawling the Internet and gathering all of this
information and creating context around all of that information so
that they're able to learn. The next thing is proprietary models,
which are the specialized models that individual companies develop.
Speaker 2 (04:57):
So OpenAI, they have
Speaker 3 (05:00):
the GPT variants. We know OpenAI most commonly
for ChatGPT, and ChatGPT, that's just
the big umbrella; there are multiple types within it. So
they have 4o; they have, I think it's, 4
and 4 mini. They also have Deep Research now, so
each one is tailored specifically for a different task.
Speaker 1 (05:23):
So now that we all have kind of the same
understanding of what these terms mean, let's talk about what
we want to know.
Speaker 3 (05:30):
I want to know how the energy consumption of training
these AI models and running these AI models compares to
other technologies, like is it one to one or is
AI the big bad wolf?
Speaker 1 (05:44):
And then like what are the implications? So like water
usage is one that I've seen people talking about on
social media, so about cooling data centers and all of
this stuff. I was like, I thought we were all
in the water cycle. So all the word is here
is here, but I don't know we did an expert
for that. I also want to know the mechanics. You know,
my background isn't engineering all of those things that it's
(06:06):
just interesting to me.
Speaker 2 (06:07):
So what processes occur when you're.
Speaker 3 (06:11):
using a large language model like ChatGPT or Gemini?
Speaker 1 (06:15):
We also want to future cast, in the same way
that we looked at what policies are around new technology.
We thought about this in our Space episode. You know,
we say, like, all right, now that all these people
are going to space, who's making the rules, who's setting
the speed limit? It's the same thing for AI. You know,
who's making the rules, who's saying how much is too much?
Speaker 2 (06:35):
What do you use?
Speaker 1 (06:36):
What's the cost? You know, there's a cost and
a benefit for all these things, and so what is it
for AI? It's out of sight, out of mind, really.
Speaker 3 (06:43):
Right, Well, I think that sets the stage perfectly for
us to jump into the dissection.
Speaker 1 (06:53):
For today's dissection, we are talking to Dr. Shaolei Ren.
Dr. Ren, we wanted to start by kind of setting
the stage with some measurement. You know, I
think for all tech and innovation, there's a cost and
a benefit, and often those costs are immediately presented in
price and the things you buy to maintain the technology. Sometimes
(07:13):
it's subscriptions for software. Like, I remember when I had
a Game Boy, it was the batteries. Okay, yeah, I
was burning through batteries. For cars, it's gas, it's maintenance,
it's inspections for emissions. But when we're physically removed from
the technology, it can be harder to have a good
grasp of what it costs to run these things.
(07:33):
How do we understand and appreciate costs now?
Speaker 4 (07:37):
So this is a huge energy consumption. And, I mean,
by the way, using energy itself, I think, is a
great thing, because it just shows that we have more productivity.
You know, usually energy consumption is an indicator of
economic activity, but the consequences of using energy are
not always good. For example, the energy produces heat and
(07:58):
we need water to take away the heat, and that
consumes our natural resources. And also, how we generate the
electricity, how we generate the energy: usually,
even in the United States, a large fraction of the
energy is coming from fossil fuels, and that comes with
the carbon emission problem. Right. So this, you know, it
(08:19):
has this long-term impact on climate change, as
many research studies have shown.
Speaker 1 (08:25):
I think that's a great point. What Dr. Ren is
saying is there's a cost even before we get to water;
there's just energy. This is something I'm always talking about
with people who drive electric cars. I'm like, okay, you're
not using gas, but how do you
think we get the energy that's coming to you?
Speaker 3 (08:40):
Right, That is one of the laws of thermodynamics. Energy
cannot be created or destroyed. So we're not creating this
energy out of nowhere and you can't get rid of it.
Speaker 2 (08:53):
It all goes somewhere.
Speaker 3 (08:54):
It's a closed loop like energy is flowing from one
thing to another.
Speaker 1 (08:59):
And I think the other part that he mentioned that
I didn't really think about for these data centers is
the air quality. So PM two point five that means
particular matter that is two point five micrometers or smaller
in diameter. Now, biologically, when you're that small, you can
get very deep into the lungs, okay, and so whatever
(09:22):
you have can get into the lungs and it can
potentially enter the bloodstream. Not always, but potential is enough
for me to be like humhm, right, that's enough for me.
Speaker 2 (09:34):
That sounds scary.
Speaker 4 (09:35):
So the air pollutants include particulate matter 2.5,
PM 2.5, and also nitrogen dioxide and sulfur dioxide.
These types of air pollutants can travel hundreds of miles.
So even though we don't live next to a data
center or next to some power plants, the
air pollutants that we breathe in could be coming from
(09:57):
those places, and so this has a direct
toll on people's health. They can cause asthma, heart attacks,
lung cancer, and even premature deaths. And there have been many,
many research studies showing the causal relationship between these air
pollutants and people's health.
Speaker 3 (10:18):
But what about the water that's associated with running these
AI models?
Speaker 2 (10:23):
Is it really as bad as folks are saying.
Speaker 4 (10:26):
Our study finds, based on the very limited data
in the public domain, that if you have a
conversation with a large language model, you likely consume about
five hundred milliliters of water for ten to fifty queries,
depending on where you run the model. And that's for the
(10:48):
inference. And also, when you train the model, it
consumes quite a bit of water too.
Speaker 1 (11:07):
Okay, so we're talking to Dr. Shaolei Ren about AI
and water usage, and Dr. Ren said, basically, with ten
to fifty queries, you use about five hundred milliliters of water,
which is sixteen point nine ounces, or a Deer Park
water bottle. Now, I think there are two things to
unpack there, TT: the water usage and the note that
(11:28):
he made about it depending on where you are when
you run the model. Let's start with the water usage.
Speaker 3 (11:33):
Okay, so before we get too deep into the water consumption,
I really want to take a step back and
set the stage for everybody about what actually happens when
you type something into ChatGPT. When you put a
prompt into ChatGPT, it travels through the Internet to a
data center where that model is hosted. Remember we talked about those
specific models associated with different organizations, so OpenAI
(11:56):
has the GPT models. So it goes to the data center
where that model is hosted, and then the servers
process your input by running it through that model, which
consists of layers of neural networks, and then the servers
generate a response and send it back to your device,
so your phone or your laptop or however you're using
that AI model.
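A minimal sketch of that round trip, assuming a hypothetical hosted-model HTTP endpoint; the URL, model name, and response fields below are made up for illustration and are not any provider's actual API.

```python
# Illustrative only: the endpoint, model name, and JSON fields are hypothetical.
import requests

def ask_model(prompt: str) -> str:
    # 1. The prompt leaves your device and travels over the internet
    #    to the data center where the model is hosted.
    response = requests.post(
        "https://example-llm-provider.com/v1/chat",     # hypothetical endpoint
        json={"model": "example-model", "prompt": prompt},
        timeout=30,
    )
    response.raise_for_status()
    # 2. Servers in that data center ran the prompt through the model's
    #    layers of neural networks and produced a reply.
    # 3. The reply travels back over the internet to your phone or laptop.
    return response.json()["reply"]                     # hypothetical field

if __name__ == "__main__":
    print(ask_model("Write a two-line podcast teaser about AI and water."))
```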
Speaker 2 (12:17):
And so it's a very fast process.
Speaker 3 (12:20):
If you've ever used any of these AI tools, you
know it happens in seconds. So when we're talking about
water consumption, I think it's important to know where that water comes in,
because it's like, what are we talking about here? Why
is water associated with the Internet? Data centers use a
significant amount of water to cool those servers down, because
(12:41):
they generate a lot of heat.
Speaker 2 (12:43):
We all know that.
Speaker 3 (12:43):
This technology gets hot when we're on our laptops for
a long time, on a computer for a long time,
when we're on our phones for a long time, they
start to feel a little bit warm, right, And that's
because you know, all of the technology inside is working
and it's generating heat to do that work. And so
what they do is they're running this cooling system which
uses water to cool down these servers.
Speaker 4 (13:06):
When we cool down the data center facility, it
typically involves two stages. The first stage is moving
the heat from the servers to a heat exchanger in
the facility. This process either uses air or uses
a closed loop with some special liquids, so there's no
water loss, and there shouldn't be any water loss unless
(13:26):
there's a leak, okay. And the second stage is moving
the heat from the heat exchanger to the outside environment,
to the atmosphere, to the sky. And for this process, of course,
there are different options for moving the heat, but one
common approach is using water evaporation. And with water evaporation,
(13:47):
the water is temporarily lost into the sky,
and that's what we call water consumption. So of course
the water is still within our planet. It doesn't go
to the sun or whatever, somewhere else. Yeah, yeah, it stays within our globe.
But it's just, you know, when the water comes back
(14:09):
and where it will come back is highly uncertain, especially in
those drought regions. And also, even when the water comes back,
it may not be in the freshwater state.
Speaker 1 (14:22):
But I think, in the same way that they're using
water to cool down a server, isn't that basically the
same way air conditioning and coolers and stuff work?
Speaker 3 (14:31):
And the other thing is, like, the Internet has been using servers;
AI using servers, that's not new. That's not new technology.
We've always had servers. AI is just advancing and using
those same exact servers that are already being used for
the Internet, like the same types of servers.
Speaker 2 (14:49):
They just have different models in them.
Speaker 3 (14:50):
But when you type in www.dopelabspodcast.com, that
information has to go to a server in order
to load.
Speaker 1 (14:58):
I'm gonna type it in right now.
Speaker 2 (15:00):
She typed it in right now and clicked on stuff. Oh
my goodness. Yeah, they're all doing the same things.
Speaker 1 (15:09):
I think one of the things that was really interesting
to me that Dr. Ren explained is that sometimes people are
comparing apples and oranges, because the water that those servers
are using is being recycled.
Speaker 2 (15:21):
It's being used.
Speaker 1 (15:22):
Over and over again. And they were comparing it to,
like, eating a hamburger, but they were looking at the
water to raise the cow, the water to water the grass.
Speaker 4 (15:31):
Like a few weeks ago, the CEO of a leading
AI company said, you know, the study on AI's water
use, it was something made up by the anti-AI crowd.
But this, I would say, is a really misleading comparison,
because the hamburger's water usage they quote
includes the rainwater in the soils to grow
(15:54):
the grasses, to feed the cattle, to make the patty.
So that's really life-cycle water usage, and it's mostly
water which we call green water in the technical community.
But the AI water consumption is only for the operational
stage, when we actually run the model. We're not talking about
the water used for mining the rare earths or making
the AI chips or recycling the servers. So it's not
(16:17):
life-cycle water for AI, and so it's a
different scope of water comparison, a different type of water. And
the AI water is mostly blue water; that's the water
in the rivers, lakes, and groundwater sources that humans can
directly use. So I think those types of comparisons are
just showing that they either do not have the necessary
(16:41):
knowledge in this space or they have some other agendas.
So yeah, those are not really scientifically correct comparisons.
Speaker 2 (16:52):
Oh okay, I think I understand.
Speaker 3 (16:54):
So the water consumption is the water that doesn't make
it into the drain and into our sewage. So in
the washing-your-hands example, that would be the water that's
being absorbed into your skin a little bit,
and then also absorbed into, like, a paper towel or a
dish towel, or however you wipe your hands off after
(17:14):
you wash your hands. I hope y'all are washing your hands.
Speaker 4 (17:17):
Yeah. So technically, that's called water withdrawal, and the water
that goes into the sewage is called
water discharge. The difference between water withdrawal and water discharge
for washing your hands is very minimal, and that
difference is called water consumption.
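In other words, a quick illustrative formula based on those definitions; the numbers below are invented just to show the idea, not measurements from the study.

```python
# Water consumption = water withdrawal - water discharge.
# All numbers are made up purely to illustrate the definition.
def water_consumption(withdrawal_liters: float, discharge_liters: float) -> float:
    return withdrawal_liters - discharge_liters

# Washing your hands: nearly all the withdrawn water goes back down the drain.
print(water_consumption(2.0, 1.95))  # ~0.05 L actually "consumed"

# Evaporative cooling: much of the withdrawn water leaves as vapor,
# so far less of it comes back as discharge.
print(water_consumption(2.0, 0.4))   # ~1.6 L consumed
```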
Speaker 3 (17:35):
So Dr. Ren did a study about how AI and human
work compare to each other, about the efficiency and
the energy required to perform a task. And what they
found is that, of course, you know, in a
lot of situations, AI is more efficient. It can work faster,
(17:58):
it can work longer. Where humans require, you know,
lunch breaks and sleep, AI doesn't; it's constantly running. But
the energy consumption of AI is a lot higher, you know.
So there are pluses and minuses to both.
Speaker 4 (18:13):
Our study is not trying to say AI is bad; that's
not our purpose. AI is great. It's not that
it's bad; it's just using some resources, just like anything else, right?
So we need to have an understanding
of the cost. And it turns out the water cost is
just one part of the cost, and there are
also other costs, like the climate, the energy, the public health.
(18:37):
So we need to look at this and evaluate the cost
more holistically, and think about what is the best way
to support the technology. That's our purpose. Our research goal
is not like, hey, please stop using it now. No,
we should use AI for good purposes and, you
(18:59):
know, in a more responsible way.
Speaker 3 (19:01):
Can you talk about that a little bit more, the
cost to the environment and public health, so that people can
understand what you mean by that?
Speaker 4 (19:10):
So because this is a huge energy consumption. And, I mean,
by the way, using energy itself is not... I
think it is a great thing, because, I mean, it
just shows that we have more productivity. You know, usually
energy consumption is an indicator of economic activity.
But the consequences of using energy are probably not always good.
(19:31):
For example, the energy produces heat, and we need water
to take away the heat, and that consumes our natural resources.
And also, how we generate the electricity,
how we generate the energy: usually, even in the United States,
a large fraction of the energy is coming from fossil fuels,
and that comes with the carbon emission problem. Right. So this,
(19:55):
you know, it has this long-term impact on
climate change, as many research studies have shown.
Speaker 1 (20:14):
I feel like so much of the conversation has been framed
as all or nothing: either it has no cost,
or don't use AI at all. And I'm like, what people don't think about
is how many people are using Zoom, Google Meet, Skype
if that's still operational, I don't know, Microsoft Teams. And
you know, when the pandemic hit, Zoom saw a leap from
(20:36):
ten million daily users to over three hundred million daily users.
And there was a study from Purdue University; a group
there said that an hour on Zoom generates one hundred
and fifty to one thousand grams of CO2. Nobody
was thinking about that when we were all trying to
stay connected and stay sane. Now, turning off your camera
can help with that, is what the study says.
(20:58):
But I think us considering, you know, there's a cost
to all of these things that we often don't think about,
and so I think it just becomes important to consider, like, hey,
everything here has a cost.
Speaker 3 (21:11):
Right, and the cost varies depending on where you are.
So the amount of water required for an AI chatbot
using ChatGPT-4 to generate a one-hundred-word email
varies by location. The amount goes up or down depending on
where you are. Like in Washington State, it'll require almost
(21:31):
fifteen hundred milliliters of water, while in Texas it's
only two hundred and thirty-five. So we can't just say, oh,
all of it requires all this water; that's not
necessarily true.
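A quick back-of-the-envelope check using the figures quoted in this episode (roughly 500 mL per 10 to 50 queries, and the 100-word-email estimates for Washington State versus Texas). This is just the episode's numbers run through simple arithmetic, not new measurements.

```python
# Rough arithmetic with the figures quoted in the episode.
ML_PER_BOTTLE = 500  # ~16.9 oz, the "Deer Park water bottle" comparison

# About 500 mL for 10-50 queries => a per-query range.
per_query_low = 500 / 50   # 10 mL per query
per_query_high = 500 / 10  # 50 mL per query
print(f"Per query: roughly {per_query_low:.0f}-{per_query_high:.0f} mL")

# Water per 100-word email, by location (episode figures).
email_ml = {"Washington State": 1500, "Texas": 235}
for place, ml in email_ml.items():
    print(f"{place}: {ml} mL, about {ml / ML_PER_BOTTLE:.1f} water bottles")
```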
Speaker 2 (21:45):
Is it using water?
Speaker 3 (21:46):
Yes, in a closed loop, because the water we have
is the water we've got. But other forms of
technology are also using water as well. It takes about
six thousand nine hundred and ninety-two liters of water to produce one
pound of beef. One pound of beef, almost seven thousand liters.
(22:07):
Training GPT-3 had the same water cost
as producing one hundred pounds of beef.
Speaker 1 (22:15):
So if one hundred people do Meatless Mondays, yeah, then...
Speaker 3 (22:19):
It's nearly double the amount of beef the average American eats in
a year. So the amount of water you're using to
eat beef is way more than you're using to run
a ChatGPT model. And another comparison is, it takes nine
hundred and fifty-six liters of water to produce one
pound of rice, because rice is grown completely submerged in water.
Speaker 2 (22:39):
Requires a lot of water to grow rice.
Speaker 3 (22:42):
But we haven't been hearing nothing about that, or maybe
people been talking about it, and I just don't know that.
Speaker 1 (22:47):
I know that I'm.
Speaker 3 (22:49):
People been talking about AI and how it uses all this water
and all these things like that, and I'm just like, man,
we really got to think about everything else that requires water.
I remember we talked about it when I said I
was drinking almond milk, and they were like, you know
how much water it takes to grow almonds? And I was like, damn.
Speaker 1 (23:07):
You know how much water it takes. But if I
have a reaction, you know, I'm lactose...
Speaker 2 (23:12):
Intolerant. I know.
Speaker 1 (23:16):
People, you know, I just think, really, when it
comes down to it, I think we're
bringing in so much information, but we're not grounding it
in anything. Really, I blame the media, and
not our media, but like Hollywood, because in everything you see
represented about AI, nothing is really talking about this kind
(23:39):
of stuff. I don't think
there's a lot of grounding of what the cost is against
the benefit and convenience of things, and so when people
start talking costs, it's like, oh my goodness. I'm sure
people thought the same thing when the radio hit the
market or when the calculator started being used. I mean,
I feel like the way people talk about AI
(23:59):
in the education system, they always just put it in
this thing where it's like, they're cheating, they're cheating, they're cheating.
Speaker 2 (24:05):
I'm like, first of all, that's a thing of the past.
Speaker 3 (24:07):
That was something that was an issue with early AI,
early ChatGPT.
Speaker 2 (24:11):
That's no longer an issue.
Speaker 1 (24:13):
I think the main thing I would want our listeners
to consider is all of this in the grand scope
of the natural resources that we have available. Think about
it when you turn on your ring light when you
are live streaming but not even looking at your device,
when you are...
Speaker 3 (24:35):
Like scrolling on TikTok for hours and hours and hours
and falling asleep with the TikTok running and the TikTok
keeps looping and looping.
Speaker 1 (24:43):
Yes, I think there is a cost to all of
the things that are entertaining, convenient and brought to us
through our devices, and I think we should be considering
that for everything. I think one of the key things
that I'm interested in is how public policy begins to take shape.
Like, are we going to have... now this may be
a little dystopian, but I'm like, are we gonna have AI tokens?
(25:05):
You know, where you can only use a certain amount? But
who knows?
Speaker 3 (25:11):
Yeah, I mean, I think that that is what's up next.
Agentic AI is the next big thing, where, you know,
you can make your own personal assistant. You can create
these AI models that do a task for you, that
can, you know, read emails and populate your calendar and
really work as a personal assistant for you. And that's
where a lot of industries are moving. A
(25:33):
lot of these corporations want agentic AI to be
a part of everyone's daily life. So as soon as
you wake up and log into your computer, your agent
can tell you, hey, this is what you missed, this
is what you need to do, this is what you
should be prioritizing.
Speaker 1 (25:48):
No, this is what you missed.
Speaker 2 (25:49):
I was sleep.
Speaker 1 (25:53):
I think that is definitely already in the works, so
I think it'll just be the next wave of adoption
that we see next. Yeah.
Speaker 3 (26:01):
And the other thing that I think people should be
thinking about and, like, keeping an eye on is the
fact that a lot of regulations don't exist yet, you know.
And we've already seen it with, like, the actors' strike,
because of folks being able to use their faces and
(26:22):
make them do things. And it's a
really big issue when it comes to creativity, because now,
you know, artists are feeling like, okay, art is going
to be lost if you can just take Michael B.
Jordan's face and body and get him to do whatever
you want him to do. Like, he should
(26:46):
have a say in that. Like, there should be laws
that say that that is illegal, and the laws have
not caught up yet.
Speaker 2 (26:52):
The regulations have not caught up yet, So keep trying.
Speaker 1 (26:56):
There yeah. Yeah.
Speaker 3 (27:05):
You can find us on X and Instagram at Dope
Labs Podcast. TT
Speaker 1 (27:10):
is on X and Instagram at D R underscore T Sho.
Speaker 2 (27:13):
And you can find Zakiya at Z Said So.
Speaker 1 (27:16):
Dope Labs is a production of Lemonada Media.
Speaker 3 (27:19):
Our senior supervising producer is Kristen Lepore and our associate
producer is Isara Savez.
Speaker 1 (27:26):
Dope Labs is sound designed, edited, and mixed by James Farber.
Lemonada Media's Vice President of Partnerships and Production is Jackie Danziger.
Executive producer from iHeart Podcasts is Katrina Norvell. Marketing lead
is Alison Kanter. Original music composed and produced by Taka
Yatsuzawa and Alex Sugiura, with additional music by Elijah Harvey.
(27:50):
Dope Labs is executive produced by us, Titi Shodiya
and Zakiya Whatley.