Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Alrighty. And this is The Daily... this is The Daily
Aus. Oh, now it makes sense.
Speaker 2 (00:14):
Good morning, and welcome to The Daily Aus. It's Monday,
the tenth of February. I'm Emma, and I'm Achol.
Artificial intelligence or AI has become increasingly ingrained in our
lives in recent years, and we've seen this technology used
for both good and for bad, from life-changing medical
discoveries to the rise of explicit deepfakes. But during
(00:36):
the recent AI boom, we've heard a lot of conversations
around ethics and safety. As those conversations have become more intense,
there is a lesser-discussed concern that is now gaining attention,
and that's all about the environmental impact of AI.
Speaker 1 (00:53):
AI requires immense quantities of resources, and this includes electricity, water,
and finite minerals. Now, with the growing use of AI,
its carbon footprint is only expected to rise in the coming decades. Today,
we're going to explore the early red flags environmental experts
are raising and what they mean for the sustainability of AI.
Speaker 2 (01:15):
Achol, you have gone deep on this topic for us,
because I think we've seen a lot of headlines floating
around lately about AI not being great for the environment,
but it's a confusing space to start with, let alone
before we get into the specifics of this. It does
feel like a really new advancement, but I was surprised
when you told me AI is a term that was
(01:36):
first coined in nineteen fifty-five by an American computer scientist.
His name was John McCarthy. But these days, when we're
talking about AI, I think most people are probably thinking
of what's called generative AI. So that's like OpenAI's
ChatGPT, right? So those chatbot-based systems which are
built on large language models, and that means they use
(02:00):
these vast databases of online text and images to generate
new content. Now that ChatGPT has become so incredibly popular, though,
people are turning to this discussion about its demands. So, Achol,
what does it actually take to power generative AI like
ChatGPT?
Speaker 1 (02:19):
Yeah, so popular might just be an understatement. Since
launching in November twenty twenty two, ChatGPT has amassed
around three hundred million weekly active users worldwide. Wow, I know.
So every time someone asks ChatGPT to complete a task,
it uses around two point nine watt-hours of energy. Now,
(02:39):
that's according to the International Energy Agency. To really put
this into perspective, a recent study found that ChatGPT
consumes enough power annually to charge over three million electric
cars or about fifty million iPhones.
Speaker 2 (02:53):
Wow. I've got to be honest with you, I didn't
really know what watt-hours of energy meant, so that
definitely helps put it into perspective.
Speaker 1 (03:01):
Yeah, absolutely. Now, ChatGPT has dominated the AI space
for the past three years that it's been running. However,
this year we witnessed a new generative AI called DeepSeek
disrupt global financial markets, and you probably saw headlines left,
right and center earlier this year. Now, the Chinese model
introduced itself as a cheaper, more energy-efficient alternative to
(03:23):
its American competitors. Just like OpenAI's ChatGPT, DeepSeek
can summarize texts, answer questions, and generate writing based
on prompts. Now, what's really fascinating about DeepSeek is
that it performs just as well as the leading American
AI systems, but at a fraction of the cost,
and it claims it can use up to forty times
(03:45):
less energy.
Speaker 2 (03:46):
Yeah, this is a big claim from DeepSeek, and
I think we're all kind of waiting to hear a
bit more about how it plans on achieving that, or
if it will really be able to offer the same
service as ChatGPT. But when we think about
the environmental impact of generative AI, it's not just about
this massive strain on energy grids. There are other strains
(04:09):
on resources, right?
Speaker 1 (04:11):
Exactly right. The resources needed to run, support and train
generative AI are housed in these facilities called data centers. Now,
while the exact figures are still being debated, it's estimated
that AI data centers account for up to one point
five percent of global electricity usage. Now, that might seem small,
but it means that a single data center could consume
(04:33):
enough energy to heat fifty thousand homes for a whole year. Okay.
These facilities are expected to use as much power annually
as countries like Japan and Russia by next year. That's
according to a recent study from MIT. So this raises
concerns amongst experts, who fear it could potentially put additional
strain on global electricity grids, including in countries like
(04:57):
Australia, where we are regularly experiencing blackouts.
Speaker 2 (05:01):
Yeah, it's hard to imagine, I guess, in places like
Australia where there is that strain on grids and uncertainty
about, you know, what the long-term sustainability of those
grids looks like, and then this huge extra strain. It sounds
like a lot of work. So, Achol, you've got these
massive facilities that you've talked us through that store this
(05:22):
large infrastructure and machinery. So we've learned that, you know,
AI puts a strain on electricity demands because of the
actual process of asking it questions. Then there's also the
physical places where the computing lives. I can only imagine
how hot those rooms could get. I mean think about
like when you have a laptop, a small laptop on
(05:43):
your lap, the fan is going into overdrive. It gets
really hot on your lap. That's just a small laptop
that generates that kind of heat. I mean, if
you've got this much machinery in these facilities, I'm sure
it must get really warm.
Speaker 1 (05:56):
Oh, exactly. I mean, AI data centers need advanced cooling systems
to basically keep the technology from overheating. Traditionally, data centers
rely on air cooling to manage rising temperatures caused by
the heat emitted from the hardware. However, this isn't quite
sufficient for AI technology. So what that means is modern
data centers are using liquid cooling systems that rely on
(06:18):
water to keep temperatures within the ideal range of twenty-one
to twenty-four degrees.
Speaker 2 (06:24):
Okay, I'm trying to wrap my head around this concept
because the idea of water and computers pretty much goes
against everything we've ever been told.
Speaker 1 (06:31):
Right, exactly. But I wouldn't recommend throwing a glass of
water at your laptop right now. Okay, noted. So researchers
in the US predict that by twenty twenty seven, up
to six point six billion cubic meters of water will
be needed annually to meet global AI demands. That's equivalent
to half of the UK's yearly water consumption. Okay, a
(06:54):
lot of water. A lot, exactly. These statistics raise concerns
for climate experts, who basically say that in a country like
Australia, which experiences droughts regularly, this would be detrimental.
Speaker 2 (07:06):
With droughts, I mean, if we look at the climate science,
droughts are expected to intensify, those kinds of long
periods without rain, and this water usage, I can imagine,
for those climate experts kind of sounds an alarm. Exactly.
Speaker 1 (07:21):
Now, water consumption isn't the only environmental issue linked to
data centers. The greenhouse gas emissions released from these facilities
are also sounding the alarm in the fight against global warming.
Exact figures on AI's contribution to global emissions still remain unclear.
The International Energy Agency estimates that data centers account for
(07:42):
zero point six percent of annual emissions, while Science and
Technology Australia say this figure has already hit one percent.
Speaker 2 (07:50):
So some kind of differences in the scale there, but
anywhere between kind of half to one percent of emissions.
Speaker 1 (07:59):
Exactly. Now, recent reports warn that if AI adoption continues
at the current pace it's going, data centers could account for
fourteen percent of yearly emissions by twenty forty. Wow.
Speaker 2 (08:10):
Predictions like that, I mean, might just become a reality.
If we look to the reports of rising emissions among
tech giants using AI, they have been transparent about this,
and I guess that's kind of why we've taken note
and why we're talking about it today. What do we
know from those tech giants about their emissions?
Speaker 1 (08:30):
Yeah, So, in the case of Microsoft, in its latest
Sustainability Report, it attributed a thirty percent increase in its
carbon emissions since twenty twenty to the AI models and
services that it provides.
Speaker 2 (08:42):
Wow, so Microsoft is saying that its emissions, like the
whole of Microsoft, increased by thirty percent because of what
it takes to run AI.
Speaker 1 (08:51):
Exactly that. Now, despite this, we know that tech giants
have no plans to scale back their AI programs. So
basically this means greater demand for infrastructure and resources, which
brings us to mining. So lithium is one of the
key materials used to produce the rechargeable batteries that power
AI technology, and Australia is one of the largest producers
(09:13):
of the mineral.
Speaker 2 (09:14):
We have heard a lot about lithium in recent years.
Of course, it's something that goes in phones, computers, batteries
of all kinds. And you're right, Australia is the world's
largest lithium producer. But we might not be forever.
Speaker 1 (09:30):
No, not exactly. So a twenty twenty study from a
German university found that global lithium deposits could be depleted
sometime within the next seventy-five years. So with the
rapid uptake of AI services, some experts predict that lithium
shortages could occur as soon as twenty forty. Now, additionally,
(09:50):
the amount of e-waste generated through the frequent maintenance of
AI equipment poses a significant challenge to Australia's waste and
recycling systems.
Speaker 2 (09:59):
Wow, there are so many aspects to the sustainability concerns here.
We've heard about the power it takes to ask ChatGPT
a question, the electricity and water it takes for
these computers to be stored, and then we've also got
this e-waste and the mining of lithium to think about.
You're right that AI isn't going anywhere. We are only
(10:22):
hearing more and more about its advancements and how we
can live alongside AI or integrate it into our life,
into our work. But what are the experts saying about
how we ensure that this technology doesn't set us back
environmentally even while it might kind of bring us forward technologically?
Speaker 1 (10:42):
Well, that's exactly what global leaders are trying to figure
out with the help of these environmental and technological experts.
So last year in Australia, the Federal Senate launched an
inquiry into AI to explore both its opportunities and its impacts.
So the inquiry held six public hearings and received submissions
from two hundred and forty five experts, academics, business leaders
(11:06):
and members of the public.
Speaker 2 (11:08):
That's a lot of submissions. Were there any kind of
common themes that came up.
Speaker 1 (11:13):
Yeah, so the environmental impacts of AI were a major
concern basically across all submissions, and in its submission, the UNSW AI
Institute noted that the impacts of AI are currently difficult
to quantify due to there being few standards for reporting them.
Speaker 2 (11:29):
Okay, so I think that's kind of been reflected in
some of the numbers we've talked about today.
Speaker 1 (11:33):
Exactly.
Speaker 2 (11:34):
There's a bit of a range in scope. There's not kind
of a definitive regulatory body that says this is what
AI is doing and these are the emissions that it's
contributing and that kind of thing. So I guess that
makes it difficult to get a real sense of what's
going on, right?
Speaker 1 (11:47):
So, Science and Technology Australia, which is the peak body
for Australia's science and technology sector, called on governments to
ensure that renewable energy policy and net zero investments play
a key role in developing digital infrastructure to support AI
use in Australia. Now, what's interesting is that recent innovations
show that AI can actually help us tackle environmental challenges.
Speaker 2 (12:12):
Interesting, I know.
Speaker 1 (12:13):
So the Federal Department of Industry, Science and Resources showed
that AI could help address some of the world's most
pressing climate change issues. The Australian Human Rights Commission said
that AI has the potential to positively impact the environment
in several ways, including by improving energy efficiency and enhancing
sustainable practices.
Speaker 2 (12:34):
One of the interesting ones that the Federal government flagged,
I remember, is how AI could be used for firefighting technologies.
They developed an AI technology that could detect small
fires and predict fire behavior patterns, which is, of course,
you know, beneficial to the environment on the other side
of this coin. Exactly.
Speaker 1 (12:53):
But despite that, the Australian Human Rights Commission did warn
that AI poses significant risks. Now, ultimately, knowledge is power,
so by increasing transparency around the potential environmental impacts of AI,
those risks may be mitigated.
Speaker 2 (13:09):
Achol, thank you so much for breaking that down for us.
A very, very big, complicated story, but you've made sense
of it for us, so we thank you for joining
us on the podcast today. And thank you for listening.
If you liked today's episode, if you learned something, feel
free to pass it on to a friend. Don't forget
to follow or subscribe wherever you listen to The Daily
(13:29):
Aus, or if you're watching us over on our YouTube.
We will be back a little bit later today with
the evening headlines. Until then, have a great day.
Speaker 1 (13:41):
My name is Lily Madden and I'm a proud Arrernte,
Bundjalung and Kalkadoon woman from Gadigal Country. The Daily Aus
acknowledges that this podcast is recorded on the lands of
the Gadigal people and pays respect to all Aboriginal and
Torres Strait Islander nations. We pay our respects to
the First Peoples of these countries, both past and present.