Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Okay, let's unpack this. For the last few years, I mean,
we've all been obsessed with the AI revolution, and you know,
the algorithms, the large language models, the promise of a
truly digital future, right? But beneath all that digital hype,
beneath the code and the claims, there is this massive
physical and often kind of silent infrastructure boom happening right now.
Speaker 2 (00:22):
Exactly. We are talking about an unprecedented level of investment
in well, concrete and steel. These aren't just server closets.
These are colossal data centers. Huge, yeah, often spanning the
size of multiple aircraft hangars, and they are rapidly becoming
the unseen backbone of the well, the global economy.
Speaker 1 (00:40):
Really, and when we talk about investment, we mean staggering,
world changing money, don't we? Absolutely. Giants like Microsoft, Google, Amazon, and Meta are pouring hundreds of billions of dollars into facilities that are just sprouting up everywhere. You see them in the sun baked deserts of Arizona, the industrial zones
of Virginia.
Speaker 2 (00:57):
The scale is absolutely critical to understand here because it
explains the resulting strain on physical resources. The US Department of Energy noted that back in twenty twenty the country hosted about, what, twenty seven hundred hyperscale data centers. Okay. The current projection is that this number will surge to over five thousand by twenty twenty seven.
Speaker 1 (01:17):
Wow, almost double in just a few years exactly.
Speaker 2 (01:19):
And globally, AI capacity in these centers is expected to
triple by the end of the decade. The US is very much at the forefront of that, well, that construction frenzy.
Speaker 1 (01:29):
Sounds like pure, unadulterated technological progress on the surface, right,
new facilities mean breakthroughs in medicine, logistics, science.
Speaker 2 (01:37):
It certainly holds that promise.
Speaker 1 (01:38):
But you've made it very clear that this digital promise
might be built on a hidden cost structure, a kind
of darker reality that is rapidly emerging.
Speaker 2 (01:48):
That is precisely our mission for this deep dive. We
are here to provide a balanced examination of those shadow costs,
the impacts that could fundamentally reshape society both environmentally and
you know, economically, if we leave them unchecked and unregulated.
We really need to look past the shiny innovation promises
and assess the physical damage being done. Our comprehensive source
(02:09):
material highlights basically four core areas of concern.
Speaker 1 (02:13):
Okay, what are they?
Speaker 2 (02:14):
First, the massive accelerating resource consumption, Second, the environmental degradation
that follows, including reliance on fossil fuels. Third, the growing
social inequities imposed on the communities that host these things.
And finally, the unprecedented concentration of economic power.
Speaker 1 (02:31):
That sounds like a heavy bill to pay for a
digital revolution. Let's start where the immediate conflict is maybe
most visible, most shocking. Water. These huge facilities are sometimes
called the thirsty behemoths.
Speaker 2 (02:45):
Yeah, and when you look at the raw numbers, that
moniker is well earned. People are often stunned, genuinely stunned
to learn that a single large data center can evaporate up to five million gallons of water a day.
Speaker 1 (02:57):
Really? Five million gallons a day? A day.
Speaker 2 (03:00):
That is the equivalent of the total daily usage of
a small US city, just poof vanishing into the atmosphere.
Speaker 1 (03:05):
So what is it about the mechanics? Why does training
these vast language models like GPT four create this incredible thirst.
Speaker 2 (03:16):
It all boils down to intense heat generation. Training LLMs requires these really powerful interconnected GPU systems that push computation limits far, far beyond traditional computing. The resulting thermal output is immense. I saw one study put it like boiling thousands of kettles simultaneously in one building.
Speaker 1 (03:37):
That's a lot of heat.
Speaker 2 (03:38):
It is. So to keep those very expensive servers from, well, frying, massive cooling systems are required, and the most common and frankly cheapest method currently remains evaporative cooling towers.
Speaker 1 (03:50):
And these evaporative cooling systems, they're the source of the
high water loss. The sources we have here are pretty
damning on the efficiency front. They paint a picture of
these cooling systems being incredibly wasteful, right? They are.
Speaker 2 (04:00):
I mean, while evaporative cooling is thermally superb, it does
lead to huge water loss. A detailed twenty twenty four
analysis, I think it was from UC Riverside, showed that for every single liter of water actually evaporated to cool the servers, up to one point eight additional liters are lost due to other factors.
Speaker 1 (04:16):
Wait, hang on. One point eight liters extra lost for every liter used for cooling? How? Where does that extra water go?
Speaker 2 (04:24):
It's not just evaporating, right, It's not just the core evaporation.
That excess loss comes primarily from two processes. One's called
blowdown and the other is drift.
Speaker 1 (04:34):
Okay, blowdown and drift. Yeah.
Speaker 2 (04:35):
Blowdown is basically the intentional draining of water from the
cooling system. As water evaporates, minerals and sediments get concentrated, right? It makes sense. If they aren't purged, they cause scaling and corrosion, which damages the equipment, so they have to
regularly flush some water out.
Speaker 1 (04:52):
That's blowdown okay, got it. And drift.
Speaker 2 (04:54):
Drift is the fine mist, little water droplets that get carried away by the airflow in the cooling tower itself. Think
of it like spray escaping.
Speaker 1 (05:03):
Oh okay.
Speaker 2 (05:04):
Both processes are essential for maintaining the system, but they
contribute massively to the overall waste rate. That's why the
total consumption figures are just so alarming.
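A quick worked example makes that water math concrete. This is a rough sketch in Python, assuming only the roughly one-to-one-point-eight ratio of evaporation to blowdown-and-drift losses cited above; the per-day evaporation figure is a made-up round number for illustration.

```python
# Rough water balance for an evaporative cooling tower, using the ratio cited above.
# Assumption: for every liter evaporated for cooling, up to 1.8 additional liters
# are lost to blowdown and drift (the upper bound from the analysis discussed).

EXTRA_LOSS_RATIO = 1.8  # blowdown + drift, liters lost per liter evaporated

def total_water_loss(evaporated_liters: float, extra_ratio: float = EXTRA_LOSS_RATIO) -> float:
    """Total liters effectively lost, counting evaporation plus blowdown and drift."""
    return evaporated_liters * (1 + extra_ratio)

# Hypothetical example: a facility evaporating 1.8 million liters in a day.
evaporated = 1_800_000
print(f"Evaporated: {evaporated:,} L")
print(f"Total lost (evaporation + blowdown + drift): {total_water_loss(evaporated):,.0f} L")  # 5,040,000 L
```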
Speaker 1 (05:12):
And this issue it moves from being just you know,
theoretically worrying to critically urgent when we look at where
these facilities are physically being built. Our sources show that
forty percent, four zero, of US data centers are
located in areas already designated as water stressed by the
Department of Energy.
Speaker 2 (05:31):
Exactly, that's just pouring gasoline on the fire. Really, this
creates immediate dire regional conflicts. Consider the situation in Phoenix, Arizona.
Speaker 1 (05:39):
Right the Mega drought area.
Speaker 2 (05:40):
Precisely that region is grappling with a decades long mega drought,
yet it is a major hub for data center construction.
The contrast is stark. Google's facilities in Mesa, for example,
they withdrew over one billion gallons of water in twenty
twenty two, a billion gallons, and Microsoft's facilities in that
same general area consumed an even higher figure two point
(06:03):
five billion gallons in twenty twenty one. That's according to
data from their own sustainability reports.
Speaker 1 (06:07):
By the way, the tension must be palpable. I mean
imagine local residents being told, you know, stricter water use,
let your lawns go brown, and then they see these
massive corporate structures siphoning billions of gallons, often using subsidized
municipal services.
Speaker 2 (06:22):
It creates a sense of profound injustice, and you see
it elsewhere too. Another clear flashpoint is The Dalles, Oregon. Ah, yeah,
local residents, environmental groups, they've protested Google's massive expansion plans there.
They're deeply concerned about depleting flows in the nearby Columbia River. Understandable,
and what often makes it worse kind of adds insult
(06:44):
to injury, is that these corporations are frequently given generous
tax breaks by the local municipality, which allows them access
to subsidized city water supplies, while they avoid contributing fairly
to the tax base that actually maintains that local infrastructure.
Speaker 1 (07:00):
And this isn't just a Western US issue anymore, is it?
Where water scarcity is, you know, the obvious headline. The growth is now straining resources on the East Coast as well.
Speaker 2 (07:09):
Absolutely, it's becoming a national issue. Look at South Carolina.
For instance, Microsoft has proposed a new one million square
foot facility in Mount Pleasant. Estimates suggest that single facility
could consume somewhere between one and two million gallons.
Speaker 1 (07:23):
Of water daily. One to two million daily?
Speaker 2 (07:26):
In South Carolina. That level of industrial consumption, it strains
aquifers that are already heavily relied upon by the rapidly
growing local population and the agricultural sector in the region.
It sets up a conflict.
Speaker 1 (07:39):
This brings us directly to this concept of environmental injustice.
It sounds like, you know, low income or minority heavy
communities often end up bearing the true brunt of this
massive resource extraction.
Speaker 2 (07:51):
That's the pattern we see playing out repeatedly. Take Loudoun County, Virginia, famously nicknamed Data Center
Speaker 1 (07:57):
Alley. Right near DC.
Speaker 2 (07:58):
Yeah, it's estimated nearly seventy percent of the world's Internet
traffic passes through this really concentrated hub. These facilities draw immense quantities of water from the Potomac River. But the surrounding communities, often lower income, what do they see? Rising utility bills and declining local resources. The financial benefits flow primarily to Silicon Valley and the immediate specialized tech sector.
(08:21):
How about the costs? Well, the environmental and infrastructure costs are effectively socialized, borne by the public.
Speaker 1 (08:28):
And this feeds into that cynical term I read about
water washing. What does that concept actually describe?
Speaker 2 (08:34):
Yeah, water washing critics like Greenpeace define it as the
process where tech giants selectively highlight their maybe small scale
sustainability efforts like a pilot program for water reuse.
Speaker 1 (08:45):
Somewhere Okay, the positive pr exactly.
Speaker 2 (08:48):
While they simultaneously rely on massive taxpayer subsidies and frankly
government inaction that allows them to continue unchecked large scale
resource extraction. It allows them to maintain a green
public image while offloading the true environmental costs onto the
public sector.
Speaker 1 (09:05):
Clever, but deceptive. Pretty much. So are there any genuine technological fixes on the horizon? We hear buzzwords like immersion cooling. Is there hope there?
Speaker 2 (09:14):
There are genuine exciting innovations, absolutely, but the problem is
scaling them quickly enough. Immersion cooling, which, yeah, involves submerging servers in these non conductive
Speaker 1 (09:22):
Fluids, sounds futuristic.
Speaker 2 (09:24):
It is pretty cool, and it's being tested by companies
like Nvidia and Equinix. It really does have the potential
to reduce water usage by maybe up to ninety percent.
Speaker 1 (09:33):
Ninety percent, that's huge.
Speaker 2 (09:35):
It is huge. However, retrofitting the thousands of existing hyper
scale centers it's immensely expensive and technically very challenging. So
these solutions are realistically years away from widespread adoption across
the massive existing infrastructure base.
Speaker 1 (09:51):
And the high profile promises we see, like Meta's goal
for net zero water by twenty thirty, should we take
those with a grain of salt?
Speaker 2 (09:59):
They often lack teeth. Unfortunately, these corporate pledges are frequently
non binding, and critically, they are very difficult for outside
regulatory bodies or even journalists to actually verify independently. The
core problem remains the exponential nature of AI growth itself.
AI model parameters, they're doubling every few months, some estimates say. Incredible speed. Which means water demands are accelerating at
(10:22):
a pace that voluntary frameworks just cannot possibly contain without
strong federally mandated efficiency standards, which, you know, the Biden administration has so far avoided, preferring these voluntary frameworks. Right. The US risks turning critical water stressed areas into server farms
that actively accelerate regional droughts. It's a dangerous trajectory.
Speaker 1 (10:45):
It's clear the physical toll starts with water, but once
these thirsty behemoths are sated, the next resource they devour is electricity, gallons and gallons of it, metaphorically speaking. Exactly.
This brings us right into the heart of the gridlock
on the grid. Why? What is the lifeblood? Electricity. It's the enormous heartbeat of AI, and these facilities are absolute,
(11:06):
undeniable energy hogs.
Speaker 2 (11:08):
We have to start by just quantifying the difference AI
makes compared to normal Internet use. We know a standard
Google search draws.
Speaker 1 (11:14):
Some electricity, right, sure, a tiny bit.
Speaker 2 (11:16):
But a single query to a large language model like
ChatGPT, it uses ten times the electricity of that standard search. That's according to a pivotal twenty twenty three Cornell study.
Speaker 1 (11:26):
Ten times for one question?
Speaker 2 (11:29):
Ten times. Now multiply that by potentially billions of users
making queries every day, the consumption becomes staggering really fast.
Speaker 1 (11:37):
Okay, give us a sense of the power required just
to teach the machine in the first place.
Speaker 2 (11:41):
The training cost, right, the initial training for these large
models is like an energy sprint. Training GPT three alone consumed, uh, one thousand two hundred and eighty seven megawatt hours, MWh.
Speaker 1 (11:54):
Okay, twelve hundred and eighty seven megawatt hours. What does that
mean in real terms?
Speaker 2 (11:58):
To put that into perspective, that's enough power to
run about one hundred and twenty typical US homes.
Speaker 1 (12:03):
For a full year? For one model training run?
Speaker 2 (12:05):
One training run, and the leap to the next iteration
GPT four, Leaked estimates from industry insiders suggest that consumption
exceeded ten thousand millwaters ten thousand, Well, that's the cost
just to build the intelligence before it's even deployed for
general use.
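As a sanity check on the homes-for-a-year comparison, here is a minimal sketch. It assumes the roughly 1,287 MWh figure widely reported for the GPT-3 training run and an average US household consumption of about 10.7 MWh per year; the household figure is an outside assumption, not something stated in the episode.

```python
# Back-of-the-envelope check of the "homes for a year" comparison.
# Assumptions: ~1,287 MWh for the GPT-3 training run (the widely reported figure);
# ~10.7 MWh per year for an average US household (an outside EIA-style average).

TRAINING_MWH = 1_287
HOME_MWH_PER_YEAR = 10.7

homes_for_a_year = TRAINING_MWH / HOME_MWH_PER_YEAR
print(f"Roughly {homes_for_a_year:.0f} average US homes powered for a full year")  # ~120
```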
Speaker 1 (12:19):
So what does this aggressive scale up, this massive demand
mean for our national grid infrastructure? Is it keeping up?
Speaker 2 (12:26):
Well, the total share is growing exponentially. Data centers accounted for
about two point five percent of total US electricity consumption
in twenty twenty two.
Speaker 1 (12:33):
Okay, two point five percent. But the Electric
Speaker 2 (12:35):
Power Research Institute EPRI projects that number will nearly quadruple.
It's expected to hit eight percent by twenty thirty.
Speaker 1 (12:42):
Eight percent of all US electricity just for data centers?
Speaker 2 (12:45):
By twenty thirty. Yes, and AI is the singular key
accelerator of this. McKinsey estimates that AI alone could add
somewhere between three hundred and five hundred terawatt hours annually
to the total demand by twenty twenty seven.
Speaker 1 (12:58):
Three to five hundred TWh. That's, what, comparable to
entire states.
Speaker 2 (13:03):
That level of consumption rivals the current total electricity usage
of entire large states like Pennsylvania or Ohio. Just for AI.
Speaker 1 (13:11):
That kind of surging, nonstop, non negotiable demand must be
putting an incredible strain on our aging regional infrastructure, especially
in those energy dense regions like Virginia.
Speaker 2 (13:20):
It absolutely is, and the grid operators are sounding really
urgent alarms. In Virginia, home to data center Alley, Yeah,
Dominion Energy has issued a stark warning. They've said that
if the current growth rate persists, they might need to
build as many as fifteen new natural gas power plants
by twenty forty.
Speaker 1 (13:36):
Fifteen new gas plants.
Speaker 2 (13:38):
Purely to meet the demands of the data centers in
their service area.
Speaker 1 (13:41):
That seems to fly directly in the face of national
efforts to decarbonize, doesn't.
Speaker 2 (13:45):
It does. And that's the central paradox here. Look at the PJM Interconnection. They operate the grid in thirteen
states plus DC, a huge area.
Speaker 1 (13:52):
Right.
Speaker 2 (13:52):
PJM reported in twenty twenty four that the massive new
demand from data centers is actually forcing them to delay
the scheduled retirements for old, inefficient coal fired power plants.
Speaker 1 (14:03):
Wait, why why would new demand require keeping old coal
plants online? That seems backwards.
Speaker 2 (14:09):
It's about quick ramping capacity. Natural gas and yes, coal
plants provide a reliable, always on baseline power and critically,
they can ramp up output very quickly to meet sudden
spikes in demand.
Speaker 1 (14:22):
Ah, the spikes from AI workloads.
Speaker 2 (14:24):
Exactly. Renewables like solar and wind are intermittent. They fluctuate.
The grid operators need what they call firm capacity power
that can respond instantly to the massive, sometimes unpredictable power
surges created by these AI workloads.
Speaker 1 (14:38):
And right now the system often relies on old coal
and gas infrastructure to provide that instant stability.
Speaker 2 (14:44):
That's often the case. Unfortunately, this growth is also necessitating billions,
literally billions of dollars in transmission line upgrades, all designed
primarily to accommodate this tech boom.
Speaker 1 (14:54):
But hang on. The tech giants constantly tell us they're
running one hundred percent renewable energy. They buy huge, huge
solar and wind power purchase agreements PPAs. How can they
make that claim if the grid is still leaning on
fossil fuels.
Speaker 2 (15:06):
Absolutely, they buy the PPAs, often in massive volumes. And you
know they deserve credit for driving investment in renewables. That
part is true, okay. However, the reality of the immediate
real time power mix powering their facility at any given
moment is different. A twenty twenty four analysis by Bloomberg
NEF found that despite all the PPA buying, about sixty
percent of data center power still comes from conventional coal
(15:29):
and natural gas.
Speaker 1 (15:29):
Sixty percent. So the PPAs don't cover everything in real time?
Speaker 2 (15:33):
No, because of intermittency. When the sun sets or the
wind dies down, those data centers must immediately draw power
from whatever is available on the grid mix at that moment,
and in most regions that mix remains dominated by fossil fuels,
especially for providing that baseline and peak power.
Speaker 1 (15:49):
So when they say one hundred percent renewable, we should
really be reading that as carbon neutral on paper thanks
to some clever accounting.
Speaker 2 (15:57):
Precisely. This is what some Princeton research has flat out called greenwashing. Their one hundred percent renewable claims rely heavily on purchasing renewable energy credits, or RECs. The credits, right. A REC signifies one megawatt hour of clean energy generated
somewhere on the grid. By buying these credits, a company
(16:17):
can offset its consumption on paper. But crucially, it does
not mean the energy the data center is consuming in
real time at its physical location is actually coming from
a clean source at that moment.
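A simplified sketch of how REC-based accounting can diverge from the physical grid mix. All numbers below are invented for illustration; they are not drawn from any company's filings.

```python
# Simplified contrast between REC-based accounting and the physical grid mix.
# Every figure here is invented for illustration only.

annual_consumption_mwh = 1_000_000   # electricity the facility actually drew from its local grid
recs_purchased_mwh = 1_000_000       # credits for clean energy generated somewhere else

# On paper: consumption fully "matched" by purchased credits.
paper_renewable_share = recs_purchased_mwh / annual_consumption_mwh
print(f"Paper renewable share: {paper_renewable_share:.0%}")  # 100%

# Physically: the hour-by-hour mix of the grid actually serving the facility.
local_grid_mix = {"natural_gas": 0.45, "coal": 0.15, "nuclear": 0.20, "renewables": 0.20}
fossil_share = local_grid_mix["natural_gas"] + local_grid_mix["coal"]
print(f"Fossil-fueled MWh actually consumed: {annual_consumption_mwh * fossil_share:,.0f}")  # 600,000
```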
Speaker 1 (16:29):
It masks the reliance on the dirtier, fossil fueled baseline grid.
The power is the actual region where the data center
sits exactly.
Speaker 2 (16:37):
It's an accounting mechanism, not always a reflection of physical
reality second by second.
Speaker 1 (16:41):
Beyond just the cost and the source of the power,
this surge introduces severe operational risks, doesn't it threatening basic
grid stability?
Speaker 2 (16:49):
Absolutely. We saw a near miss scenario play out in
northern Virginia during that intense heat wave back in twenty
twenty two.
Speaker 1 (16:55):
I remember hearing about that.
Speaker 2 (16:57):
Yeah, the local grid was nearly overwhelmed. It actually forced
data centers to voluntarily reduce their operations what's called curtailment.
Speaker 1 (17:05):
Okay, so they throttled back. They did, but
Speaker 2 (17:07):
that voluntary system probably won't work much longer. Gartner projects
AI inference, that's the real time use of AI tools
by everybody, to grow one hundred times by twenty thirty.
Speaker 1 (17:17):
One hundredfold increase in usage.
Speaker 2 (17:19):
That's the projection. If the infrastructure cannot handle that kind
of load reliably, we face not just economic damage from outages,
but genuine national security risks and potentially widespread power disruptions
for everyone.
Speaker 1 (17:32):
I understand some European nations have already hit a wall
on this. They've actually had to cap data center growth.
Speaker 2 (17:38):
They have Ireland is the key example. It became an early,
very popular hub for data centers because of tax advantages
and climate. But they had to cap the growth of
new facilities back in twenty twenty one after the concentration
of energy demand was deemed a demonstrable threat to the
reliability of the entire national electricity supply.
Speaker 1 (17:56):
Wow, So the US has no equivalent federal policy or
capacity cap like that.
Speaker 2 (18:00):
No, it's basically state by state allowing this exponential growth
to continue largely unrestrained at the national level.
Speaker 1 (18:07):
And all of these massive infrastructure upgrades, the new power lines,
the substations, maybe those fifteen potential natural gas plants in Virginia.
Who ends up footing the bill for this massive AI
infrastructure boom? Is it the tech companies?
Speaker 2 (18:22):
Well, largely the cost is socialized. It falls to the
ordinary ratepayer. Look at Georgia for example. Georgia Power's recent
infrastructure rate hikes, which are explicitly necessary to fund the
upgrades driven largely by new data center demand, are adding
somewhere between ten to twenty dollars monthly to average household
utility bills in their territory.
Speaker 1 (18:42):
Ten to twenty dollars extra per month for regular folks.
Speaker 2 (18:44):
Correct, So the economic benefit accrues to a handful of
giant tech companies, who, remember, may also be receiving massive
local tax incentives, while the cost of providing the necessary
energy stability is borne by the average citizen and local
businesses through higher bills.
Speaker 1 (19:00):
The dark side here seems pretty clear, then. This explosion of
data center construction risks reviving the fossil fuel sector almost
under the guise of supporting innovative AI, which directly undermines
major climate initiatives like the Inflation Reduction Act.
Speaker 2 (19:17):
It puts them in direct conflict that energy consumption naturally
leads us to the next inevitable consequence, the carbon footprint.
Data centers aren't just energy hogs. They are rapidly becoming
major atmospheric polluters on a global scale.
Speaker 1 (19:31):
Let's put that into global context first, right, how much
CO two are we actually talking about from data centers worldwide?
Speaker 2 (19:37):
Okay, globally, data centers collectively emitted somewhere between two hundred
and three hundred million tons of CO two in twenty
twenty two. That's according to the International Energy Agency the IEA.
Speaker 1 (19:47):
Two to three hundred million tons.
Speaker 2 (19:48):
And the US share of that is immense, around one
hundred million tons annually.
Speaker 1 (19:52):
One hundred million tons just from US data centers. How
does that compare to other sectors?
Speaker 2 (19:56):
To provide a memorable comparison, that US total is actually
more than what the entire pre COVID global airline industry
was emitting.
Speaker 1 (20:03):
Per year more than airlines.
Speaker 2 (20:05):
Wow, and AI is driving a compounding growth in the
sector's emissions, maybe twenty to thirty percent year over year.
It's accelerating fast.
Speaker 1 (20:13):
It's often easy to forget, isn't it that a purely
digital process has a direct physical emissions cost. Can you
give us a specific, maybe visceral example of the model
training cost in terms of CO two.
Speaker 2 (20:25):
Yeah, the emissions tied to initial training are immense. A key study, I think refined by researchers at UMass Amherst, found that training a single large model, specifically they looked at the BLOOM model with one hundred and seventy six billion parameters, emits approximately six hundred and twenty six thousand pounds of CO two.
Speaker 1 (20:44):
Six hundred thousand pounds of CO two for one software
Speaker 2 (20:48):
Training run? For a single software development effort that might only last a few weeks or months, it's a staggering footprint.
Speaker 1 (20:54):
And then there's the daily usage the inference we talked
about earlier, which scales globally across millions or billions of users.
Speaker 2 (20:59):
Right, the cumulative effect of inference is vast simply because
of mass adoption. If we project, say just one billion
daily users for a tool like ChatGPT, the collective emissions
from all those individual queries, you know, the tiny digital
carbon footprint of each answer generated, could be equivalent to
the yearly emissions of about fifty thousand passenger cars, fifty
(21:20):
thousand cars worth of emissions just from people using one
AI tool. If it reaches that scale. Yes, it shows
how rapid widespread adoption quickly translates tiny individual power uses
into a significant global climate burden.
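To show how tiny per-query footprints compound at that scale, here is an order-of-magnitude sketch. Every input is an assumption chosen only for illustration; with these particular numbers the result lands in the high tens of thousands of cars, the same broad range as the figure cited, and it swings with every assumed input.

```python
# Order-of-magnitude sketch of cumulative inference emissions.
# All inputs are assumptions for illustration; the episode's fifty-thousand-car
# estimate presumably used its own inputs.

QUERIES_PER_DAY = 1_000_000_000   # the one-billion-daily-users scenario
WH_PER_QUERY = 3.0                # assumed energy per LLM query, watt-hours
GRID_KG_CO2_PER_KWH = 0.4         # assumed average grid carbon intensity
CAR_TONNES_CO2_PER_YEAR = 4.6     # EPA's typical passenger-car figure, metric tons per year

annual_kwh = QUERIES_PER_DAY * 365 * WH_PER_QUERY / 1_000
annual_tonnes_co2 = annual_kwh * GRID_KG_CO2_PER_KWH / 1_000
print(f"Annual emissions: about {annual_tonnes_co2:,.0f} tonnes of CO2")
print(f"Equivalent passenger cars: about {annual_tonnes_co2 / CAR_TONNES_CO2_PER_YEAR:,.0f}")
```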
Speaker 1 (21:34):
Now, the tech giants they love to highlight their long
standing commitments. Google claims it has been carbon neutral since two
thousand and seven. Microsoft pledges carbon negativity by twenty thirty.
Are these decades old commitments actually holding up now that
they are feeding this AI beast?
Speaker 2 (21:49):
Well, the pledges are definitely under massive strain. A twenty
twenty four investigation into Microsoft's environmental record found that their
overall emissions have actually risen by thirty percent since twenty twenty.
Speaker 1 (21:58):
Risen? Despite the pledges.
Speaker 2 (22:01):
Risen, and this rise is driven almost entirely by the
explosive growth of their cloud services to host AI, despite
their continued use of carbon offsets. And Amazon's much publicized
climate pledge also faces harsh criticism because their own filings
reveal they exclude crucial Scope three emissions.
Speaker 1 (22:20):
Okay, let's clarify Scope three emissions again, that sounds like
a potentially massive loophole if they're excluded. It is.
Speaker 2 (22:26):
Scope three emissions encompass the huge amount of emissions generated
across a company's entire value chain that they don't.
Speaker 1 (22:33):
Directly control. Like what, specifically?
Speaker 2 (22:35):
A company like Amazon or Microsoft or Google, this includes
the vast amount of carbon emitted from manufacturing all the
servers and network equipment they buy, the construction of the buildings,
the transportation and shipping involved employee commuting, and eventually the
disposal of all that electronic waste.
Speaker 1 (22:52):
So basically most of the upstream and downstream impact? Pretty much.
Speaker 2 (22:56):
By excluding Scope three, which many companies do in their
primary reporting, they might only be reporting on a small fraction
of their true total climate impact. You simply cannot claim
genuine environmental leadership while ignoring the most significant sources of
your pollution footprint.
Speaker 1 (23:11):
The geographical hypocrisy we touched on with power sources seems
like a major component of this climate contradiction too. Placing
centers where power is cheap, even if it's dirty power
from coal.
Speaker 2 (23:22):
It's often a calculated business move. Yes, many centers are
strategically placed where land and power are cheapest, which frequently
means grids that rely heavily on coal. Meta, for example,
has major facilities in West Virginia, drawing from that PJM
grid mix we discussed earlier. Google has established huge centers
in Iowa and other coal heavy grid states, and even
(23:43):
in grids that appear cleaner on average, like California's, the
massive instantaneous power surges required for AI workloads mean that
utilities like PG&E often have to rely heavily
on quick firing natural gas peaker plants to handle the
sudden demand spikes. So even clean grids have dirty backup.
Speaker 1 (24:02):
We see the leaders of these AI companies, people like
Sam Altman, talking about grand futuristic solutions like nuclear power.
Speaker 2 (24:09):
Right fusion, even.
Speaker 1 (24:10):
While their day to day operations are still fundamentally locked
into fossil fuels right now.
Speaker 2 (24:14):
It is a massive disconnect between the long term vision
and the present reality. Sam Altman, head of OpenAI,
is a very vocal champion of nuclear energy for future
AI needs. It makes sense on paper, sure. Yet his company's current models, which are hosted on Microsoft's Azure cloud,
rely entirely on the current power mix of Microsoft's existing
(24:34):
largely fossil backed data centers right now, today. Today. Similarly, Elon Musk's xAI announced its big supercluster compute facility in Memphis in twenty twenty four. It's tapping into the Tennessee Valley Authority's, the TVA's,
Speaker 1 (24:48):
Power grid. And TVA's mix is?
Speaker 2 (24:51):
It's a mix of coal, nuclear, and hydro, so not
entirely clean. The long term vision might sound green, but
the immediate implementation is decidedly brown and well dirty in
many cases.
Speaker 1 (25:01):
Okay. So if technology is constantly becoming more efficient, we know, for example, Nvidia GPUs get maybe thirty percent more efficient year over year, shouldn't that efficiency eventually solve the energy and emissions problem on its own? Won't things just get cleaner automatically?
Speaker 2 (25:13):
Ah, this is where we run straight into one of
the most kind of chilling ideas in environmental economics, the
Jevons paradox.
Speaker 1 (25:21):
Jevons paradox? Okay, what's that?
Speaker 2 (25:23):
We certainly have incredible efficiency gains in computing, absolutely year
after year. But the paradox, first observed with coal use
in the nineteenth century, states that these technological savings, these
efficiency gains, aren't typically used to reduce overall consumption.
Speaker 1 (25:39):
They're not.
Speaker 2 (25:39):
What happens then they are immediately used to enable the
creation of even larger, more complex, and ultimately more power
intensive AI models and workloads. The efficiency makes new, bigger
things possible and affordable.
Speaker 1 (25:54):
Wait, so you're saying that the harder Nvidia works to make their chips efficient, the faster the planet burns? Potentially, yeah,
that sounds completely counterintuitive. We need to unpack that.
Speaker 2 (26:03):
That is the tragic or at least paradoxical implication. Yes,
because AI companies are locked in this intense race to
build the largest, most capable models.
Speaker 1 (26:12):
Right, the parameter wars exactly.
Speaker 2 (26:13):
Any efficiency gain simply lowers the marginal cost of compute power.
That lowered cost then encourages them to scale up the
next model even faster and bigger, using more parameters, more
training data.
Speaker 1 (26:24):
So the efficiency is effectively swallowed up or even overwhelmed
by the explosion of scale it enables precisely.
Speaker 2 (26:30):
For instance, new more efficient training techniques might save, say
twenty percent of the energy needed per parameter, but the
resulting model might have five times the total parameters, requiring
maybe four times the total energy of the previous, less
efficient model. The net environmental benefit is nullified or even reversed.
Efficiency gains fuel greater overall consumption.
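The rebound arithmetic in that example can be checked directly; a minimal sketch using the hypothetical figures just mentioned:

```python
# Jevons-style rebound check using the hypothetical figures from the discussion.
efficiency_gain = 0.20   # 20 percent less energy needed per parameter
scale_up = 5             # the next model has five times the parameters

relative_total_energy = scale_up * (1 - efficiency_gain)
print(f"New model uses {relative_total_energy:.0f}x the energy of the old one")  # 4x
```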
Speaker 1 (26:52):
That is a really dire warning. What is the ultimate
prediction, then, if this runaway growth, fueled by efficiency gains, goes entirely unchecked?
Speaker 2 (27:01):
Well, a critical twenty twenty five paper in the journal
Nature outlined the most concerning prediction. It warned that if
the AI sector continues its current growth trajectory without some
kind of severe regulatory intervention on energy and emissions, yeah,
it could cumulatively add between one and two gigatons of
CO two to the atmosphere by twenty thirty.
Speaker 1 (27:20):
Two gigatons just from AI.
Speaker 2 (27:21):
Just from the AI sector's expansion. To truly grasp
that scale, that volume of additional emissions growth could effectively
erase all the hard won environmental gains achieved globally through
things like the widespread adoption of electric vehicles and the
transition to renewable energy sources over the same period.
Speaker 1 (27:41):
So we risk trading a digital future for a massive
global climate setback.
Speaker 2 (27:47):
That's the potential trade off if we don't manage the
physical footprint. Yes, we are essentially building a potentially planet
heating machine.
Speaker 1 (27:54):
The impacts aren't just ecological or atmospheric, though, they are
profoundly social, hitting specific communities and workers directly. Let's pivot
now to the human costs, the unseen labor force and
the communities bearing the localized burden of this boom.
Speaker 2 (28:08):
It's absolutely essential to remember the physical foundations of the cloud.
These digital fortresses, they don't build themselves. They are built
by an army of over one hundred thousand construction workers,
many of whom are low wage or migrant laborers.
Speaker 1 (28:20):
Right the actual construction crews.
Speaker 2 (28:22):
And a powerful twenty twenty three ProPublica exposé focused
on the massive data center construction sites in Virginia. It
revealed a pretty grim reality. Subcontractors under intense pressure to
finish projects quickly and cheaply, yeah, were found to be
cutting corners, leading to repeated OSHA safety violations, rampant wage
theft from workers, and even workplace.
Speaker 1 (28:43):
Fatalities. Fatalities on site? That's harrowing. When we visualize the
clean cool server racks inside, we often forget the dangerous,
often low wage human labor that actually builds the physical cloud.
And once the centers are actually built and running, the
jobs created aren't necessarily the high skill, high wage tech
jobs that towns often expect when these companies move in.
Speaker 2 (29:06):
Are they? No, not typically, for the ongoing operations. While
the initial promise from developers is often massive job creation, sure,
the long term operational sector mostly involves lower wage, often
non unionized, technical roles. These jobs focus on things like monitoring servers, maintaining cooling systems, security. Data from the Bureau of Labor Statistics shows the median wage for these data
(29:27):
center operational roles often falls significantly below the national average
for tech jobs, and it stands in striking contrast to
the high skill jobs that AI itself is rapidly automating
away in other sectors of the economy.
Speaker 1 (29:39):
So it's not necessarily a ticket to broad local prosperity.
Let's discuss the specific, sometimes destructive impact on local community economics,
especially housing.
Speaker 2 (29:49):
One textbook example often cited is Quincy, Washington, Okay. Quincy,
the arrival of massive Microsoft data centers there quickly displaced
existing farms and frankly destabilized the local economy. Local reports documented that housing costs in Quincy skyrocketed by fifty percent
in the immediate aftermath of the construction boom.
Speaker 1 (30:09):
Fifty percent increase in housing costs.
Speaker 2 (30:11):
Yeah, this pattern is pretty common. The tech influx drives
up property values and property taxes, raises rents, and can
end up forcing out long term residents and smaller local
businesses that can no longer afford to operate there.
Speaker 1 (30:23):
And beyond the purely economic impacts, there are quality of life issues and genuine health concerns for people living near these massive industrialized facilities. Oh definitely.
Speaker 2 (30:34):
The environmental degradation is localized in some really painful ways.
The constant noise pollution from the massive cooling fans is
a major issue that communities complain about.
Speaker 1 (30:44):
How loud are we talking?
Speaker 2 (30:45):
These facilities can generate noise levels constantly between seventy and
ninety decibels. That is like living next to a highway, constant, disruptive industrial noise far exceeding standard residential
Speaker 1 (30:57):
limits. All day, all night? All day, all night.
Speaker 2 (30:59):
Furthermore, a twenty twenty four EPA draft report drew a
critical link between clusters of data centers and spikes in
localized asthma rates.
Speaker 1 (31:07):
Asthma? Why?
Speaker 2 (31:09):
To the emissions from the pollutant heavy diesel backup generators.
These are required by the hundreds at large sites to
ensure continuous operation during grid failures or fluctuations. They test
them regularly, emitting pollutants.
Speaker 1 (31:22):
So the backup systems themselves are a health hazard.
Speaker 2 (31:24):
They can be especially when clustered densely near residential areas.
Speaker 1 (31:28):
And the very identity of these rural or semi rural
areas must change permanently when these giants arrive and take
over huge tracts of land.
Speaker 2 (31:38):
The change is profound and often irreversible. Ashburn, Virginia Data
Center Alley is the prime example. It's now effectively an
industrial zone. Data centers cover something like twenty five percent
of its total land area.
Speaker 1 (31:49):
A quarter of the county?
Speaker 2 (31:50):
Roughly. While the area gained temporary construction jobs and some
tax revenue, residents report a profound loss of local rural
identity as agricultural acreage and green space are just paved
over for these vast, anonymous, windowless industrial campuses.
Speaker 1 (32:08):
This feeds directly into the larger theme we discussed, inequality
and resource extraction. The financial benefits seem to flow primarily
to Silicon Valley shareholders and executives, while local areas shoulder
the resource strain and the social costs.
Speaker 2 (32:20):
And this imbalance is actively cemented, even encouraged by state
policy through the extensive use of tax breaks. The criticism
here is very loud because state and local governments grant
over ten billion dollars yearly in tax abatements and subsidies
to these companies. That's according to research from the watchdog
group Good Jobs First.
Speaker 1 (32:37):
Ten billion a year in breaks?
Speaker 2 (32:40):
Yeah, we see examples like Nevada granting Tesla for its
battery and data operations a one billion dollar tax break
package for a facility which critically Siphon's revenue that would
otherwise have funded local schools, roads, or vital community services.
Speaker 1 (32:56):
So citizens are essentially forced to subsidize the very corporate
entities that are draining their local resources and straining their infrastructure.
Speaker 2 (33:04):
That's the core critique.
Speaker 1 (33:05):
Yes, and often the most vulnerable populations see their sacred
or essential resources strained without proper consultation or legal recourse,
don't they.
Speaker 2 (33:13):
The indigenous impacts are a particularly painful and recurring example
of this extraction model. In Arizona, for instance, the massive
data centers being built near Navajo Lands are straining sacred
water resources, resources crucial for survival in that arid environment,
and often this happens without adequate consultation or consent from
tribal leaders, following historical patterns. It really illustrates how the
(33:36):
drive for rapid infrastructure development can consistently bypass democratic processes
and cultural considerations, viewing land and resources purely as inputs
for corporate assets.
Speaker 1 (33:47):
Okay, so, moving beyond the direct environmental and social fallout,
we really must address the issue of economic control and
the profound concentration of power that's being embedded in this
new AI infrastructure.
Speaker 2 (34:01):
Yes, this is a huge piece of the puzzle.
Speaker 1 (34:03):
The cloud market was already heavily concentrated before the current
AI explosion, wasn't it. How are these new incredibly expensive
data centers actively entrenching that existing oligopoly.
Speaker 2 (34:14):
The market control is already stunning. Currently, just four massive
companies, Amazon Web Services, Microsoft Azure, Google Cloud, and Meta's infrastructure, control approximately seventy percent of the entire global cloud market. That's according to Synergy Research Group.
Speaker 1 (34:27):
Seventy percent held by four companies?
Speaker 2 (34:29):
Yeah, and because building an AI ready data center is
so intensely capital intensive billions upon billions, only these existing
hyperscalers can realistically afford to build and scale at the
necessary pace, so.
Speaker 1 (34:43):
It creates an enormous barrier to entry.
Speaker 2 (34:45):
Exactly, this massive investment barrier reinforces their dominance, making it
almost impossible for smaller competitors or startups to emerge in
the foundational infrastructure layer. This leads directly to acute antitrust
concerns like.
Speaker 1 (34:59):
The DOJ lawsuit against Google.
Speaker 2 (35:00):
Right, the Department of Justice lawsuit against Google in twenty
twenty four highlighted data dominance as a core antitrust issue,
and the physical infrastructure required for AI only amplifies that dominance.
Whoever owns the compute owns the future in a way.
Speaker 1 (35:14):
The centralization of so much global data and just a
few US based facilities also raises massive geopolitical and security
risks for everyone involved.
Speaker 2 (35:22):
Doesn't it that centralization is definitely a dual threat. First,
When vast amounts of global data, everything from personal emails
and photos to sensitive corporate R and D and government information,
are centralized in a few physical hubs, primarily in the US,
it makes them incredibly high value targets for espionage and
cyber attack from foreign adversaries or sophisticated criminal groups.
Speaker 1 (35:45):
A single point of failure almost.
Speaker 2 (35:47):
In some ways. Yes, this also feeds into the global
concern over data sovereignty, the idea that countries want control
over their citizens' data, which we see playing out in
political battles over platforms like TikTok and data localization laws.
Speaker 1 (36:02):
And the supply chain risk.
Speaker 2 (36:03):
The supply chain vulnerability is also massive. The entire industry's
hyperdependence on Taiwan's TSMC for manufacturing the most cutting edge AI
chips means any geopolitical disruption in that region.
Speaker 1 (36:15):
Like a conflict over Taiwan.
Speaker 2 (36:17):
Could immediately and catastrophically halt data center construction and expansion
timelines worldwide. It's a huge single point of failure for
the physical hardware.
Speaker 1 (36:25):
And what about foreign investment in these critical physical infrastructure
assets here in the US. Does that pose a national
security concern?
Speaker 2 (36:32):
It absolutely does, and it's getting more attention. When foreign entities,
particularly those potentially backed by state actors like China or
say the UAE, invest heavily in US data centers, it
raises significant questions about who ultimately controls the physical levers
of our future technological infrastructure and what access or influence
(36:54):
they might demand in return for their capital.
Speaker 1 (36:56):
So the overall effect is.
Speaker 2 (36:58):
The overall effect is arguably an erosion of democratic controls over vital global infrastructure. We risk moving towards a
form of technological feudalism, where a tiny handful of corporations,
primarily American, control the essential infrastructure underpinning all modern commerce
and communication globally.
Speaker 1 (37:17):
And this entire seemingly unregulated structure is allowed to flourish
in part because of what our sources describe as a
regulatory vacuum. Who is actually supposed to be overseeing this explosive,
resource guzzling growth. Is anyone in charge?
Speaker 2 (37:32):
Well, that's the problem. Currently there's a fundamental lack of
unified federal oversight, specifically targeting the environmental and resource impacts.
Speaker 1 (37:39):
Of data centers, so agencies like the EPA or FERC
aren't stepping in.
Speaker 2 (37:44):
Federal agencies like the FCC for communications or FERC for energy transmission largely defer responsibility for siting and environmental review to state and local governments, yeah. But state and local governments are often simply ill equipped, under resourced, and frankly sometimes outmatched when it comes to handling the scale and technical complexity of
the demand and the companies involved.
Speaker 1 (38:05):
And the White House any action there.
Speaker 2 (38:07):
Crucially, the Biden Administration's big twenty twenty three AI Executive Order,
while important, focused very heavily on AI safety, bias, and ethical deployment. It barely addressed the environmental or physical infrastructure
footprint supporting it all. That was a notable omission.
Speaker 1 (38:24):
And the states, rather than collaborating on sensible regulations, are
actively competing against each other to attract these facilities, which
makes meaningful regulation almost impossible.
Speaker 2 (38:35):
Right, exactly. The states are locked in this sort of vicious incentive war. They offer massive tax breaks, remember that ten billion dollar yearly figure we cited, as leverage to lure data center investments. A race to the bottom. It creates a regulatory race to the bottom, because any state that dares to impose meaningful environmental mandates, say on water efficiency or emissions, immediately fears the company will just pack up
(38:58):
and move to the neighboring state that's offering a sweeter, cheaper,
less regulated deal.
Speaker 1 (39:03):
So companies can effectively engage in policy backed greenwashing, relying
on this fractured regulatory landscape to look good without making
systemic changes.
Speaker 2 (39:14):
Yes, precisely, they can loudly proclaim sustainable AI in their
public relations campaigns using those RECs and offsets we talked about, but this often masks the reality via accounting maneuvers and
Speaker 1 (39:26):
Loopholes, like excluding Scope three?
Speaker 2 (39:28):
Like excluding Scope three. Or take Apple, for instance. They famously
tout carbon neutral claims for their corporate operations, but then
explicitly exclude the massive emissions generated by the third party
data centers that actually host their cloud services, like iCloud. Ah.
Speaker 1 (39:42):
So it's not their data center, so it's not their
emissions on the books.
Speaker 2 (39:45):
It's a convenient accounting fiction. Critics would say that allows
them to avoid addressing the true massive physical cost of
their operations that rely on that infrastructure.
Speaker 1 (39:55):
Okay, so we've laid out the pretty staggering costs. Water wars,
energy crises, soaring emissions, the human toll on communities and workers,
the concentration of power. It's a bleak picture. But what
are the pathways forward? We can't just stifle innovation entirely,
can we? There has to be a better way.
Speaker 2 (40:15):
No, absolutely not. The goal isn't to eliminate progress. It's
about achieving mitigation and accountability. The consensus from most experts
and environmental policy groups is that the fundamental shift needed
is to move away from relying solely on voluntary corporate
pledges which aren't really working, which clearly aren't keeping pace.
We need to move towards implementing mandatory, federally enforced action.
(40:37):
We simply must mandate high water efficiency standards for cooling
systems and strict power usage effectiveness, or PUE, standards for all new large data centers.
Speaker 1 (40:46):
Similar to what Europe is starting to do?
Speaker 2 (40:49):
Similar to the path the European Union is currently beginning
to take with its energy efficiency directives for data centers. Yes,
it can be done.
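For reference, PUE, power usage effectiveness, is simply total facility energy divided by the energy delivered to the IT equipment, so a value of 1.0 would mean zero overhead for cooling and power distribution. A minimal sketch with made-up numbers:

```python
# Power Usage Effectiveness (PUE) = total facility energy / IT equipment energy.
# The numbers below are illustrative only.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

print(pue(15_000, 10_000))  # 1.5 -> 50% overhead, a fairly inefficient facility
print(pue(11_000, 10_000))  # 1.1 -> the kind of target a mandate might set
```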
Speaker 1 (40:57):
What specific fiscal tools could be used to balance the
ledger to make the companies actually pay the true costs
they are currently largely socializing?
Speaker 2 (41:07):
Fiscal tools are potentially highly effective for internalizing these external costs.
Obvious examples include things like taxing carbon emissions directly to
reflect the environmental damage.
Speaker 1 (41:18):
A carbon tax for data centers.
Speaker 2 (41:19):
Potentially yes, and then using that generated revenue specifically to
fund the crucial grid upgrades necessitated by AI demand, rather
than placing that burden solely on the average ratepayer.
Speaker 1 (41:31):
Okay, what else?
Speaker 2 (41:32):
Furthermore, we need mechanisms like community benefits agreements, or CBAs.
These are legally binding contracts negotiated before a project.
Speaker 1 (41:40):
Is approved. Between the company and the local community?
Speaker 2 (41:43):
Exactly, they ensure that the local populations who will bear
the resource strain and environmental stress, receive tangible, guaranteed compensation
That could be infrastructure upgrades funded by the company, dedicated
funds for local water reuse projects, guaranteed local hiring or
job training programs, real benefits, not just vague promises.
Speaker 1 (42:03):
We talked about centralization being a key geopolitical and resource risk.
Is decentralization a viable technological part of this solution moving
things away from these mega hubs.
Speaker 2 (42:14):
It is a critical long term investment for resilience and efficiency. Yes,
we must invest more heavily in edge Computing.
Speaker 1 (42:20):
Edge Computing explain that briefly.
Speaker 2 (42:22):
It basically involves shifting processing power away from these massive,
single location hyperscale centers towards smaller, more numerous localized facilities
that are physically closer to the end users.
Speaker 1 (42:32):
So processing data nearer to where it's generated or needed.
Speaker 2 (42:36):
Right, this would reduce the immense transmission energy loss that
occurs when electricity and data have to be shipped across
very long distances, and it would drastically reduce the concentrated
stress on single regional water and energy resources by distributing
the load.
Speaker 1 (42:51):
And finally, are there any immediate real world examples of
data centers acting as part of a genuine circular solution,
maybe turning their waste products like heat into a community
benefit instead of just venting it.
Speaker 2 (43:05):
Yes, absolutely, there are successful models of this kind of
industrial symbiosis or circularity that we really must emulate and
scale up. The most frequently cited example is Denmark.
Speaker 1 (43:16):
Denmark, what are they doing?
Speaker 2 (43:17):
Due to progressive government mandates and planning, they have successfully
implemented systems where the massive amounts of excess heat generated
by data centers.
Speaker 1 (43:25):
Is captured instead of just wasted.
Speaker 2 (43:26):
Instead of just vented into the atmosphere, yes. It's captured
and then routed through pipes to warm residential homes and
local district heating systems, providing low cost heat to the community.
Speaker 1 (43:36):
That's brilliant. Taking a massive waste product, the heat, and turning it into a necessary community utility. Exactly.
Speaker 2 (43:43):
That model is powerful because it demonstrates that smart regulation
can genuinely drive innovation that benefits both the industry and
the host community. It shows that we can integrate these
facilities better, creating social good not just resource drain.
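For a rough sense of the scale of reusable heat, here is a sketch with invented numbers; essentially all of the electricity a data center draws ends up as low-grade heat, so even partial capture is significant.

```python
# Rough district-heating sketch. All figures are invented for illustration.

facility_load_mw = 10          # assumed average electrical draw of a mid-size facility
heat_capture_fraction = 0.6    # assumed share of waste heat a recovery loop captures
home_heat_demand_kw = 1.5      # assumed average heating demand per connected home

recovered_heat_kw = facility_load_mw * 1_000 * heat_capture_fraction
homes_heated = recovered_heat_kw / home_heat_demand_kw
print(f"Roughly {homes_heated:,.0f} homes' average heating demand covered")  # ~4,000
```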
Speaker 1 (43:57):
Okay, Well, that brings us towards the end of our
deep dive today into the hidden physical infrastructure of AI. It
seems incredibly clear that America's AI data center explosion, Yeah,
is powering some really incredible technological progress, no doubt about it,
for sure. But that progress comes at a profound and
currently dangerously underpriced cost across our water resources, our energy stability,
(44:20):
global carbon emissions, and crucially, social equity in the communities affected.
Speaker 2 (44:24):
The summary is really this, The imbalance that we've detailed
today is untenable, and it's growing daily. Transparency is the
first step, but beyond that, mandatory regulation, particularly at the
federal level for consistency and committed mandated innovation, not just
voluntary pledges, are the only tools that can realistically balance
this ledger.
Speaker 1 (44:43):
So action is needed.
Speaker 2 (44:45):
Action is needed now. Policymakers, tech companies themselves, and frankly, citizens,
all of us must address this fundamental imbalance immediately before
the sheer physical scale of the servers literally overheats the
planet and bankrupts critical resources in the places they inhabit.
Speaker 1 (45:01):
All right, so here is the provocative thought we want
to leave you, the listener with today. The AI data
center explosion promises a truly revolutionary digital future, one filled
with incredible potential. But if left unchecked, driven only by
corporate profit and speed, we risk that physical infrastructure literally
overheating the planet and turning the dream of technological progress
(45:24):
into a resource nightmare, especially for the host communities bearing
the burden.
Speaker 2 (45:28):
So the question for you is what role will you
play in demanding accountability for the physical footprint of the cloud.