
April 15, 2024 54 mins

For years and years, utilities in the US haven't seen much growth in electricity demand. The economy is generally mature and has been able to grow even without needing much more electrical power. But all that's changing now and a big contributing factor is the boom in datacenter demand. It's particularly acute for AI datacenters, which need more power than traditional datacenters, and are growing like crazy ever since ChatGPT brought generative AI to everyone's collective consciousness. So how will utilities handle the sudden surge in load growth? On this episode, we speak with Brian Janous, co-founder and chief strategy officer at Cloverleaf Infrastructure. Brian spent 12 years at Microsoft, where he was the company's first ever energy-focused hire, so he has seen the rise of datacenter electricity consumption firsthand, and how AI is kicking it up even further. He now works alongside utilities to figure out how they'll meet this growing demand. We talk about how more gas plants are likely to be built, how datacenters and utilities can get more energy out of existing infrastructure, the politics of AI datacenters, and what this all means for the net-zero commitments of major tech companies.

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Bloomberg Audio Studios, Podcasts, Radio News.

Speaker 2 (00:17):
Hello and welcome to another episode of the Odd Lots podcast.
I'm Joe Weisenthal and.

Speaker 3 (00:22):
I'm Tracy Alloway.

Speaker 2 (00:23):
So, Tracy, a thing that keeps coming up is that
all these AI data centers are going to use a
lot of electricity. I keep hearing that.

Speaker 3 (00:32):
Yes. Also, I just realized every time you use
ChatGPT to write like a satirical song, you're diverting energy
away from someone turning on a light bulb or something
like that, potentially something like a zero sum game.

Speaker 4 (00:45):
Yeah, so be.

Speaker 2 (00:46):
Careful about your random ChatGPT queries. Although I
think the training is like the bigger part. Yeah, like I think,
but you know, so maybe your song there, maybe
it's okay. I don't think it's that bad.

Speaker 3 (00:58):
In the grand scheme of things, probably not. But there
is this overarching conversation about AI's energy use. So what
exactly is it? This is a big question I have.
How do you disaggregate AI servers from your run of
the mill software servers? How much is it going to consume?
How is that capacity going to get allocated and built out?

(01:20):
And I think there is this sense that we could
end up going in two very different, very extreme directions here.
So you could have this great situation that because AI
is a desirable activity, because it's profitable in many respects,
that big tech ends up accelerating the energy capacity build out,

(01:41):
maybe they even start building more green technology capabilities in
an ideal world. But then you have the polar opposite
scenario where you need all this power to develop this
technology, there isn't enough, and it's sort of a race
to the bottom where you have tech companies just trying
to get energy wherever they can, maybe they even start

(02:01):
using coal and things like that. So it feels like
there's two very different paths that we could be going
down here.

Speaker 2 (02:09):
Yeah, there's a lot here for us.

Speaker 4 (02:10):
You know.

Speaker 2 (02:11):
I remember Jigar Shah, when we interviewed him
at the Texas Tribune conference like over a year ago,
he brought this up. It's getting more and more attention.
It keeps coming up on the side in episodes. Steve
Eisman obviously recently talked about it. But I feel like
it's time to like sort of make it a central
part of the conversation and actually learn about numbers and

(02:31):
where this power is generated from and like, yeah, like
how much are we really talking about here. We know,
you know, the tech companies are highly aware of it.
There was a headline recently about Microsoft maybe wanting to
do something with on site nuclear development. They also, you know,
they did do something.

Speaker 3 (02:45):
So I saw the headline, well, didn't they buy a
data center maybe next to a nuclear power plant? The Susquehanna thing?
I thought they did.

Speaker 2 (02:53):
Yeah, I think you're right about that. But then the
other element too, is you mentioned that one solution here
is just fossil fuels and dirty energy, except that all
these tech companies are very like progressive minded and they
all have these net zero commitments by you know, we're
going to get it all from you know, windmills or sorry,
wind turbines and solar and batteries and storage.

Speaker 3 (03:13):
Windmills next to AI servers would be an interesting one.

Speaker 2 (03:17):
But like you know, at some point, the rubber's got
to hit the road with like how realistic are their
net zero commitments or how can they achieve them if
they're engaging in this investment activity that is highly energy intensive.

Speaker 4 (03:28):
No.

Speaker 3 (03:28):
Absolutely, And you are seeing a lot of this discussion
reflected in the conversation around AI investment at this point.
So I think a lot of people feel like they
missed that first wave of chips around Nvidia, so
everyone's looking for the sort of second order investment play
and a lot of people now are talking about energy
or cooling and HVAC, so we need to talk about it.

Speaker 4 (03:50):
Well.

Speaker 2 (03:50):
I am really excited because I do believe we have
the perfect guest for this topic, someone who spent twelve
years at Microsoft, way before it was hot to talk
about how tech companies needed all this electricity and energy.
He was the first energy hire at Microsoft, and he
recently left last year. He left to start his own

(04:13):
firm to work on this problem specifically. So we are
going to be speaking with Brian Janous. He is the
co founder and chief strategy officer at Cloverleaf Infrastructure, a
power development company that works closely with utilities on solving
this problem. So, Brian, thank you so much for coming
on Odd Lots.

Speaker 4 (04:29):
Thank you for having me. Really excited to be here.

Speaker 2 (04:31):
So you got hired by Microsoft twelve years ago to
do energy, and at the time, I don't think anyone
was talking about energy as being like a particularly important
aspect of these software companies or these big tech companies' strategies.
What was going on back then, or like, what did
they see when they felt like, hey, we need to

(04:52):
hire a VP of energy here.

Speaker 4 (04:54):
Yeah. It was actually funny because I had spent my
career prior to that working with mainly large energy consumers
who were the big ones, who you'd expect them to be,
the big industrial companies. And so when Microsoft came calling
and said, hey, we need to get a full time
energy person, I told them it sounded like a dead
end job to be the energy person at a tech company,
because why would they ever actually care about this issue.

(05:17):
And the person that was recruiting me said, Hey, I
think there is something to this, this whole cloud thing,
and I think energy is going to start to be
pretty central to what we're doing as a company. And
you fast forward a decade and I remember having a
conversation before I left the company. I was talking to
the head of corporate strategy and he said to me,
He's like, I don't think people quite realize the degree

(05:39):
to which Microsoft is really just an energy company. We
need power and we need silicon. We need chips. That's it,
that's the business. If we don't have one of those
two things, we're in a lot of trouble and so
it was remarkable to see the shift over
that decade plus of maybe one or two people at
the company starting to think that energy might be important

(05:59):
to us, to today, when energy is actually absolutely central
to everything the business does.

Speaker 3 (06:06):
So talk to us a little bit more about that
cultural shift, because Joe and I heard from someone else recently,
but they were saying that a bunch of the big
tech companies that you would be very familiar with had
representatives down in Houston for CERAWeek, so the annual
energy conference, which they were describing as kind of a
new development. But how familiar are tech companies nowadays with

(06:30):
energy usage or needs and how much expertise have they
actually built out in that capacity.

Speaker 4 (06:39):
Yeah, it's been a tremendous shift. And I mean if
you would have gone to CERAWeek even five years ago,
you would not have seen a whole lot of engagement
from the tech industry. But as their businesses have shifted
to the cloud, and as the business opportunities that sit
in front of them, particularly when it comes down to AI,

(07:00):
as those have arisen, there's been a recognition that power
really is central to what they're doing. And it was
a slow shift. If you go back to the advent
of the first cloud data centers. It really was about
being close to the network, and so that was the
driver of strategically, where do you put data centers, Well,

(07:20):
you put them where the biggest network hubs are. So
that's why we have lots of data centers in Northern Virginia.
That's why we have lots of data centers in Amsterdam.
Everyone was chasing network. Probably middle of last decade, there
was a shift and it started to go, actually, we
want to be close to eyeballs. So this started a
sort of a land grab of all the cloud data

(07:41):
centers starting to build lots of new data centers in new
countries because they wanted to be close to where the
customers were. And so from about fall of twenty nineteen
through probably spring of twenty twenty two, I think Microsoft
was adding close to a region a month in
terms of new data center regions they were establishing around
the world. And then in mid twenty twenty two that's

(08:04):
when the realization started to sink in that wait a minute,
this whole game is about power, because that's when we
were first starting to hear rumblings of what OpenAI
was working on and the scale of what
GPT-3 was going to be, which was sort of
the first big release where everyone was like, wait a minute,
this is this is kind of a big deal what

(08:26):
AI is doing and how fast it's moving. And then
when we had that release of ChatGPT
in the fall of twenty two, and then shortly thereafter,
three point five was released, and there was a massive
increase in capability in that release if you recall, you know,
in terms of what it could do on you know,

(08:46):
getting scores on various tests and things. And it was
that moment that I realized, this technology is moving way
faster than the utility industry is moving. If we can
make this much improvement in this technology in a six
month time horizon, we're in a lot of trouble because

(09:09):
the power industry does not move that fast.

Speaker 2 (09:12):
So I'm really fascinated by this idea that the release,
like you know, like you were at Microsoft and so
you had a front row seat to what OpenAI
was doing with GPT-1 and GPT-2, and there
were a lot of people aware of this, and I'm
sort of fascinated by this idea that it was that
commercialization or sort of making it easy and public. When

(09:32):
it became ChatGPT, they're like, oh, this is serious,
and then we saw everyone rushing to buy Nvidia
chips and all these VCs pivoting to AI, et cetera.
So talk to us about like the math there. It
feels like there has been this sort of level shift
up in sort of expectations of data center demand growth

(09:53):
basically as a function of all of the excitement for AI.

Speaker 4 (09:57):
Yeah, I think you're right. I mean, it's not
like we didn't know that Microsoft had a partnership with
OpenAI and that, you know, AI was going
to consume energy. I think everyone though, was a bit
surprised at just how quickly what ChatGPT could do

(10:18):
just captured the collective consciousness. Yeah, and I think, I
mean you probably remember when that was released. I
mean it really sort of surprised everyone, and it
became this thing where suddenly, even though we sort of
knew what we were working on, it wasn't until you
sort of put it out into the world that you
realize maybe what you've created. And I mean that's where

(10:39):
we realized we are running up this curve
of capability a lot faster than we thought, and
the number of applications that are getting built on this,
and the number of different ways that it's being used,
and how it's just become sort of common parlance. I mean,
everyone knows what ChatGPT is, and no one
knew what it was the month before that, right? So

(10:59):
there was a bit of, I think, a
surprise in terms of just how quickly it was going
to capture, you know, the collective consciousness and then you know,
obviously lead to everything that's sort of being created as
a result. And so we just moved up
that curve so quickly, and I think that's where
the industry maybe got, you know, certainly the utilities were

(11:20):
behind because as you may have seen there, a lot
of them are starting to restate their load growth expectations
and that was something that was not happening right before that.
And so we've had massive changes just in the last
two years of how utilities are running the numbers.
So you know, if you take a look at
a utility like Dominion in Virginia, so that's the largest

(11:44):
concentration of data centers in the United States, so they're
pretty good representative of what's happening. If you go back
to twenty twenty one, they were forecasting load growth over
a period of fifteen years of just a few percent.
I mean it was about it was single digit growth
over that entire period, so not yearly growth, but over
fifteen years, single digit growth. By twenty twenty three, they

(12:07):
were forecasting to grow two x over fifteen years. Wow.
Now keep in mind this is electric utility. They do
ten year planning cycles. So because they have very long
lead times for equipment, for getting rights of way for
transmission lines, they aren't companies that easily respond to a

(12:28):
two x order of magnitude, you know, growth change over
a period of fifteen years. I mean that is a
that is a massive change for electric utility, particularly given
the fact that the growth rate over the last fifteen
to twenty years has been close to zero, so there's
been relatively no load growth in fifteen to twenty years.

(12:48):
Now suddenly you have utilities having to pivot to doubling
the size of their system in that same horizon.
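
To put rough numbers on the shift he is describing, here is a quick back-of-the-envelope sketch (not from the episode, just illustrative arithmetic, with the 5% figure assumed as a stand-in for "single digit growth over fifteen years") of what the two Dominion forecasts imply for annual growth rates:

```python
# Illustrative only: compare the implied compound annual growth rates (CAGR)
# of the two forecasts described above. The 1.05 total-growth factor is an
# assumption standing in for "single digit growth over 15 years".
def implied_annual_growth(total_growth_factor: float, years: int) -> float:
    """Return the compound annual growth rate implied by total growth over `years`."""
    return total_growth_factor ** (1 / years) - 1

old_forecast = implied_annual_growth(1.05, 15)   # roughly 0.3% per year
new_forecast = implied_annual_growth(2.00, 15)   # roughly 4.7% per year

print(f"Old forecast: about {old_forecast:.1%} per year")
print(f"New forecast: about {new_forecast:.1%} per year")
```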

Speaker 3 (13:10):
I want to ask a very basic question, but I
think it will probably inform the rest of this conversation.
But when we say that AI consumes a lot of energy,
where is that consumption actually coming from? And Joe touched
on this in the intro, but is it you know,
the sheer scale of users on these platforms, is it

(13:31):
I imagine the training that you need in order to
develop these models, and then does that energy usage differ
in any way from more traditional technologies.

Speaker 4 (13:43):
Yeah. So whenever I think about the consumption of electricity
for AI or really any other application, I think you
have to start at sort of the core of what
we're talking about, which is really the human capacity for data.
Like whether it's AI or cloud. Humans have a massive
capacity to consume data. And if you think about where

(14:06):
we are in this curve, I mean we're on some
form of S curve right of human data consumption, which
then directly ties to data centers, devices, energy consumption ultimately,
because what we're doing is we're turning energy into data.
We take electrons, we convert them to light, we move
them around to your TV screens and your phones and

(14:29):
your laptops, etc. So that's the uber trend that we're
riding up right now. And so we're climbing this S curve.
I don't know that anyone has a good sense of
how steep or how long this curve will go. If
you go back to look at something like electricity, it
was roughly about a one hundred year S curve. It started in

(14:51):
the beginning of last century, and it really started to flatline,
as I mentioned before, towards the beginning of this century.
Now we have this new trajectory that we're riding, this
new S curve that we're entering, that's going to sort
of change that narrative. But you know that S curve
for electricity took about one hundred years. No one knows
where we are on that data curve today. So when
you inject something like AI, you create a whole new

(15:14):
opportunity for humans to consume data, to do new things
with data that we couldn't do before, and so you
accelerate us up this curve. Right, So, we were sitting
somewhere along this curve. AI comes along, and now we're
just moving up even further, And of course that means
more energy consumption because the energy intensity of running an

(15:35):
AI query versus a traditional search is much higher. Now
what you can do with AI obviously is also much
greater than what you can do with a traditional search,
So there is a positive return on that invested energy.
So that's you know when when oftentimes when this conversation
comes up, there's a lot of consternation and panic over well,

(15:58):
what are we going to do? You know, we're going
to run out of energy. The nice
thing about electricity is we can always make more. We're
never going to run out of electricity. Not to
say that there's not times where the grid is under
constraint and, you know, you have risks of brownouts
and blackouts. That's the reality. But we can invest
more in transmission lines, we can invest

(16:19):
more in power plants, and we can create enough electricity
to match that demand.

Speaker 2 (16:26):
Just to sort of clarify a point and adding on
to Tracy's question, you mentioned that doing an AI query
is more energy intensive than say, if I had just
done a Google search, or if I had done a
Bing search or something like that. Like, what is it
about the process of delivering these capabilities that makes it

(16:47):
more computationally intensive or energy intensive than the previous generation
of data usage or data querying online.

Speaker 4 (16:57):
There's two aspects to it, and we sort of alluded
to it earlier. But the first is the training. So
The first is the building of the large language model
that itself is very energy intensive. These are extraordinarily large
machines collections of machines that use very dense chips to

(17:18):
create these language models that ultimately then get queried when
you do an inference. So then you go to ChatGPT
and you ask it to give you a menu for a
dinner party you want to have this weekend. It's then
referencing that large language model and creating this response. And
of course that process is more computationally intensive because it's

(17:40):
doing a lot more things than a traditional search. A
traditional search just matched the words you put into a
database of knowledge that it put together. But these large
language models are much more complex, and therefore
the things you're asking it to do are more complex. So
it will, almost by definition, be a more energy
intensive process. Now, it's not to say that it can't

(18:03):
get more efficient, and it will. And Nvidia just last
week was releasing, you know, some data on some of
its next generation chips that are going to be significantly
more efficient than the prior generation. But one of the
things that we need to be careful of is to
think that because something becomes more efficient, then therefore we're

(18:24):
going to use less of the input resource, in this
case electricity. That's that's not how it works, because going
back to the concept of human capacity for consuming data,
all we do is we find more things to compute.
And this is you've probably heard of Jabon's paradox, and
this is the idea that, well, if we make more

(18:46):
efficient steam engines. He was an economist in the eighteen
hundreds and he said, well, if you make more efficient
steam engines, then we'll use less coal. And he's like, no,
that's not what's going to happen. We're going to use
more coal because we're going to mechanize more things. And
that's exactly what we do with data, because we've
had Moore's Law for years, and so chips
have become incredibly more efficient than they were decades ago,

(19:06):
but we didn't use less energy. We used much more
energy because we could put chips in everything. So that's
the trend line that we're on. It's still climbing that
curve of consumption, and so no amount of efficiency is
going to take us at this point at least, because
I don't believe we're anywhere close to the bend in

(19:27):
that s curve. No amount of efficiency is going to
take us off of continuing to consume more electricity,
at least in the near term.
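
As a purely illustrative sketch of the Jevons paradox point he is making, with numbers made up for the example rather than taken from the episode: even if efficiency per query improves sharply, total electricity use can still rise as long as usage grows faster.

```python
# Hypothetical numbers, for illustration only: a per-query efficiency gain
# can be swamped by growth in the number of queries (Jevons paradox).
energy_per_query_wh = 3.0        # assumed starting energy per query, in watt-hours
queries_per_day = 1_000_000      # assumed starting query volume

efficiency_gain = 0.5            # new chips cut energy per query in half
usage_growth = 4.0               # but query volume quadruples

before = energy_per_query_wh * queries_per_day
after = (energy_per_query_wh * efficiency_gain) * (queries_per_day * usage_growth)

print(f"Daily energy before: {before / 1e6:.1f} MWh")   # 3.0 MWh
print(f"Daily energy after:  {after / 1e6:.1f} MWh")    # 6.0 MWh
```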

Speaker 3 (19:36):
So I have another basic building block kind of question.
But when we say that technology companies are aware of
the importance of energy usage or availability, and that this
is something they have been working on, what exactly is the
process by which a tech company gets its energy? So

(19:56):
you know you have a big data center, I imagine
you have some sort of agreement with whatever utility is
in that area. But I also imagine that that agreement
looks very very different to like my household energy bill
or something like that.

Speaker 4 (20:14):
I'm certain it does, hopefully by significant orders of magnitude. Yes,
So there's two components. I mean, one is, if you're
building a data center, you have to plug it in somewhere.
You've got to plug it into the grid, right, So
there you're working with your local electric utility or transmission
company and doing planning for how big is the facility

(20:38):
going to be, how much power is it going to pull
off the grid at any given time, and then over
a period of time, because these facilities just tend to
grow forever, and so that's the physical nuts and bolts
of connecting to the grid. Now, the second piece, of course,
is there needs to be some generation source as well,
like where's the power going to come from? And so
those two things are related, but they could be somewhat disconnected.

(21:01):
And so this is where you see you know, these
especially the tech companies who've really been leaders in this space,
entering into all these power purchase agreements for wind energy
and for solar energy and in some cases nuclear. You
mentioned the project earlier that's actually an AWS project where
they sited it next to the Susquehanna nuclear plant, right?

(21:23):
So all of that is around where are the electrons
going to come from? And how can with that purchasing
power of being some of the largest energy consumers on
the planet, how can they begin to influence the mix
of generation on the grid? Right? And that's the critical issue,
is that you're trying to influence where that power is

(21:43):
being generated from. And one thing just to
keep in mind is that you know the electrons you get,
you know, whether it's at your house or the data
center down the street, they're all the same electrons. You're
all pulling from the same grid. But what you're trying
to do is influence how that generation is being created,
and that's where these purchase agreements come in for all

(22:05):
these different sources of energy.

Speaker 2 (22:07):
All right, now, let's bring the question back to, say,
the utility side, or say.

Speaker 4 (22:11):
The Dominion side.

Speaker 2 (22:12):
So the Dominion executives for decades have basically seen no growth,
and then suddenly in the span of the year, they're like, oh,
actually we're going to double. What do they do? What
are they doing right now, today, as we're recording this on
April tenth, twenty twenty four? What are they doing right
now to expand generation or expand the grid or whatever
it is to meet that doubling of demand.

Speaker 4 (22:35):
Well, this is where it gets a
little concerning, is that you have these tech companies that
have these really ambitious commitments to being carbon neutral, carbon negative,
having one hundred percent zero carbon energy one hundred percent
of the time, and you have to give them credit
for the work they've done. I mean, that industry has
done amazing work over the last decade to build absolutely

(22:59):
just gigawatts upon gigawatts of new renewable energy projects in
the United States, all over the world. They've been some
of the biggest drivers in the corporate focus on decarbonization,
and so you really have to give that industry credit
for all it's done, and all the big tech companies
have done some amazing work there. The challenge though, that

(23:22):
we have is the environment that they did that in
was that no growth environment we were talking about. They were
all growing, but they were starting from a relatively small
denominator ten or fifteen years ago. And there
was a lot of overhang in the utility system at
that time because the utilities had overbuilt ahead of that

(23:42):
sort of flatlining, so there was excess capacity on the system.
They were growing inside of a system that wasn't itself
growing on a net basis. Yeah, So everything they did,
every new wind project you brought on, every new solar
project you brought on, those were all incrementally reducing the
amount of carbon in the system. It was all net positive.

(24:06):
Now we get into this new world where their growth
rates are exceeding what the utilities have ever imagined in
terms of the absolute impact on the system. The utilities
response is the only thing we can do in the
time horizon that we have is basically build more gas
plants or keep online gas plants or coal plants that

(24:28):
we were planning on shuttering. And so now the commitments
that they have to zero carbon energy, to be carbon negative,
et cetera, are coming into contrast with the response that
the utilities are laying out in their what's called integrated
resource plans or IRPs. And we've seen this recently, just

(24:49):
last week in Georgia. We've seen it in Duke in
North Carolina, Dominion in Virginia. Every single one of those
utilities is saying, with all the demand that we're seeing
come into our system, we have to put more fossil
fuel resources on the grid. It's the only way that
we can manage it in the time horizon that we have.
Now there's a lot of debate about whether that is true,

(25:10):
but it is what's happening.

Speaker 3 (25:11):
So when push comes to shove, it seems like some
of the green priorities are getting superseded by existential pressures
on the business model. Perhaps, and we could debate how
transferable AI actually is at this point and how big
a moat you have over something like ChatGPT or

(25:32):
Claude or something like that, but there does seem to
be the sense of urgency among tech companies where if
you're not building something out right now and trying to
dominate the market and really produce the best thing possible,
well you're either losing, you know, billions of dollars or
you're going to be superseded by someone who does manage

(25:52):
to do that successfully.

Speaker 4 (25:55):
That's exactly right, and it's probably not billions of dollars.
It's probably trillions over time. Yes, yes. And that's where
the competitive pressure is coming in. And this is why
there's such a focus right now in this industry
on where is the power going to come from? Because
the ability to at least envision and on paper design

(26:16):
training models that are absolutely enormous, just orders of magnitude
bigger than anything that we've ever built in terms of
a data center are coming into stark contrast with
the reality of the power system: one, is that
power even available? And two, if it could be available,
is there a way to do it with a zero
carbon approach, which is again what these companies are committed to.

(26:39):
And that's the tension that we're in right now
of how do we quickly accelerate the delivery and growth
of the electric grid? And I think
I just want to make a quick aside on this: consuming
electricity, in the context we're talking about, is

(27:00):
a really great thing. I mean, this is something that
leads to economic growth, it leads to job creation. All
of this, I mean, this whole problem that we have
right now of electric utilities having to think about this
whole new era of growth. It's all because we're
onshoring manufacturing in the United States. We're building these data
centers and creating all sorts of amazing tools and creating

(27:23):
efficiency across all sorts of sectors. And in
the same vein, we're also electrifying transportation and heating. Like,
all of this is good. It's all goodness. And we
didn't even get to things like hydrogen production and other
ways that we're going to use electricity. The real rub
of this, though, is that we're in this situation right
now where again the electricity industry was somewhat surprised

(27:49):
by this. They weren't prepared for over a period of
a couple of years. Again going back to the case
of Dominion having to double their load forecast. Reflexively, they're
going to go to the one thing they know how
to do, which is build gas plants because they know
that works. That's the easy way out. There are other
things we can do, though. There are ways we can

(28:10):
leverage the existing system more effectively. We can use things
called grid enhancing technologies, where, through sensing, through better
dynamic rating of power lines, we can actually get more
out of the existing system we have. There's ways we
can use storage more effectively, because really what we're trying
to manage is just these system peaks. Most of the

(28:32):
time there's plenty of power. It's really just during the
hottest summer hours or the coldest winter hours that the
system gets constrained, and that's what's driving a lot of
the need for utilities to want to build this new capacity.
But we can manage it in other ways. And it's
really incumbent upon the data center industry to lean in
on this, to think through how can we be more

(28:55):
of a party to solving this problem. Because data centers
have lots of opportunities to be more flexible. They have
behind the meter generation, they have behind the meter storage.
They can actually be part of the solution, not just
part of the problem.
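
To make the "it's really just a few peak hours" point concrete, here is a small, purely illustrative sketch, with hypothetical numbers not taken from the episode, of how a data center that can shift or self-supply part of its load during the system peak reduces the capacity a utility would otherwise have to build:

```python
# Purely illustrative: hypothetical hourly system load (MW) on a hot day.
# Assume a data center can drop or self-supply 200 MW of its load from
# behind-the-meter storage/generation during the four highest-load hours.
system_load = [9000, 8800, 8700, 8600, 8700, 8900, 9300, 9800, 10300, 10800,
               11200, 11600, 11900, 12100, 12200, 12150, 11900, 11500, 11000,
               10600, 10200, 9800, 9400, 9100]  # 24 hours, MW

datacenter_flex_mw = 200   # assumed flexible load
peak_hours = 4             # assumed duration it can sustain that flexibility

peak = max(system_load)
# Shave the flexible load off the top `peak_hours` hours only.
shaved = sorted(system_load, reverse=True)
shaved = [h - datacenter_flex_mw for h in shaved[:peak_hours]] + shaved[peak_hours:]
new_peak = max(shaved)

print(f"Peak without flexibility: {peak} MW")
print(f"Peak with flexibility:    {new_peak} MW")
print(f"Capacity the utility avoids building: {peak - new_peak} MW")
```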

Speaker 3 (29:08):
I just want to press you on this point because
I know people will have questions about this, and I
take the point about in many respects we're talking about
increased energy usage as a result of new things that
are leading to you know, new jobs and new productive industry,
and also the idea that, well, we can produce more

(29:28):
electricity in different ways, or we can make the delivery
of electricity more efficient, and all those types of things.
But I think one of the reservations people might have
about this is the idea of you know, competition with
large tech companies that have a lot of money and
that potentially have a lot of influence over the utility companies,

(29:51):
and the idea that maybe you could get a situation
where I don't know, Amazon gets like one hundred percent
offtake from some power plant in whatever state, and
maybe other people are left with either you know, not
enough electricity or more likely much more expensive electricity. Can

(30:11):
you talk about that? I was being somewhat facetious
in the intro talking about a zero sum game. But there
is this idea of like competition and there might not
be enough to go around, at least at the precise
times that everyone might want it.

Speaker 4 (30:29):
That's right. And that's the big challenge that grid planners
have today is what loads do you say yes to
and what are the long term implications of that?
And we've seen this play out over the rest of
the globe where you've had these concentrations of data centers.
This is a story that we saw in Dublin, We've

(30:52):
seen it in Singapore, we've seen it in Amsterdam, and
these governments start to get really worried of wait a minute,
we have too many data centers as a sort of
percentage of overall energy consumption. And what inevitably happens is
a move towards putting either moratoriums on data center build
out or putting very tight restrictions on what they can

(31:14):
do and the scale at which they can do it.
And so, you know, we haven't yet seen that to
any material degree in the United States, but I do
think that's a real risk, and it's a risk that
the data center industry faces, I think somewhat uniquely in that,
you know, if you're the governor of a state and you
have a choice to give power to a, say, new

(31:35):
you know, EV car factory that's going to produce fifteen
hundred to two thousand jobs versus a data center that's
going to produce significantly less than that, you're going to
give it to the factory. Right, the data centers are
actually the ones that are going to face likely the
most constraints as governments, utilities, regulators start wrestling with this

(31:57):
trade off of Oh, we're going to have to say
no to somebody. And that's the real risk that I
think the AI and data center industry faces today is
that they are the easiest target because everyone loves what
data centers do, but no one particularly loves just having
a data center next door to their house. And so
that's a real challenge for the industry is that they

(32:22):
will start to get in the crosshairs of these regulators, leaders,
whoever's pulling the strings as these decisions start to get made.

Speaker 2 (32:47):
So I just want to make two random thoughts that
were in my head. I walked by a film set
in the East Village the other day. They were filming
this movie, and there are all these big,
thick electrical cables, you know, that are powering
the lights and all that stuff. And I thought
to myself, Oh, it would be so great when they
can just make all the movies on AI with Sora
or something like that, and then you know, we'll also

(33:09):
get electricity savings because we won't have to have human
actors with actual lights and stuff like that, so that'll
be exciting. I'm being a little facetious about the
end of human actors, but you know, in theory that
could be exciting. And then, you know, you
mentioned it was like, well, the utilities got surprised by
the load, you know, this spike in demand. But
it sounds to me like we can't really blame the

(33:29):
utilities too much because if even the people inside Microsoft
got a bit caught by surprise by the explosion of AI
interest in the fall of twenty twenty two, then I
guess like we can't really blame Dominion if they
were further away from the issue. You mentioned
peak demand, and this gets to like power, the type

(33:49):
of power, because people talk about this sort of need.
The problem with renewables as well, at least when we're
talking about solar and wind, is this intermittency problem. It's
not always sunny even when it's hot, when it's hot
it's not always windy, there's nighttime, et cetera. How much
does that constrain the ability of more renewables to be

(34:10):
sort of the solution to the utilities' problem?

Speaker 4 (34:14):
It's a real challenge because again, as you noted, we're
trying to manage peak demand. That's what all this growth
is about. So peak demand is about the certainty that
you're going to have power during those highest system peaks,
the hottest days, the coldest winter nights, and you can't
always guarantee that renewable generation will be online during those times.

(34:36):
And this is the role of the system planner is
to look at all these different resources and figure out
how can we assure that we have the sufficient reserve
margin to ensure that we're not going to have things
like rolling brownouts or blackouts. Now there's a lot
of tools though, that we have to help manage that uncertainty,
and we have increasingly, you know, month after month,

(35:00):
it seems like lower cost battery options which give us
more duration that we can deploy to solve some of
these issues. We have the ability of even the loads
through like virtual power plants to be more responsive during
these times of system peaks, right, So we have tools
that we can use to manage that uncertainty. The problem

(35:23):
is that it is a very complex problem. I mean,
you're talking about, you know, millions of different data points
that you're trying to manage, and the way that utilities
have historically managed these things has been fairly rudimentary in
terms of their sophistication, and so they're having to go
through this learning curve of how do we ensure that

(35:44):
we can achieve the load growth that all these industries
you know, are expecting and meet the reliability, cost availability
expectations of our customers. And that's where that's where the
challenge comes in. And this is where the whole problem,
it's frankly really interesting, is that there are lots of
levers that we have and we don't just have to

(36:05):
throw more fossil fuel plants at this problem. Does that
mean we're not going to build any new gas plants
in this country? We certainly will. I don't think there's
a way around this problem, at least in the short run,
without having some incremental addition of fossil based resources. But
there's also a lot of other things we could be

(36:26):
doing that would significantly reduce dependence on fossil based resources
to achieve the growth objectives that we have as a country.

Speaker 3 (36:36):
What are the levers specifically on the tech company or
the data center side, because I again, so much of
the focus of this conversation is on what can the
utilities do, what can we do in terms of enhancing
the grid managing supply more efficiently? But are there novel
or interesting things that the data centers themselves can do

(36:59):
here in terms of managing their own energy usage.

Speaker 4 (37:02):
Yes. There's a few things. I mean, one is,
data centers have substantial ability to be more flexible in
terms of the power that they're taking from the grid
at any given time. As I mentioned before, every data
center or nearly every data center has some form
of backup generation. They have some form of energy storage

(37:23):
built into it. So the way a data center
is designed, it's designed like a power plant with an
energy storage plant that just happens to be sitting next
to a room full of servers, right? And so when
you break it down to those components, you say, okay, well,
how can we better optimize this power plant to be
more of a grid resource? How can we
optimize the storage plant to be more of a grid resource?

(37:45):
And then in terms of even the servers themselves, how
can we optimize the way the software actually operates and
is architected to be more of a grid resource. And
that is that sort of thinking is what is being
forced on the industry. Frankly, we've always had this capability.
I mean, we were doing I mean we did a
project like twenty sixteen with a utility where we put

(38:07):
in flexible gas generators behind our meter because the utility
was going to have to build a new power plant
if we didn't have a way to be more flexible.
So we've always known that we can do this, but
the industry has never been pressured to really think innovatively
about how can we utilize all these assets that we

(38:28):
have inside of the data center plant itself to be
more part of the grid. Right, So that's I think
the most important thing is really thinking about how data
centers become more flexible. There's a whole nother line of thinking,
which is this idea of well, utility is not going
to be fast enough, so data centers just need to
build all their own power plants. And this is where

(38:49):
you start hearing about nuclear and SMRs and fusion, which
is interesting except it doesn't solve the problem this decade.
It doesn't solve the problem that we're facing right now
because none of that stuff is actually ready for prime time.
We don't have an SMR that we can build today predictably,

(39:10):
on time, on budget, So we are dependent on the
tools that we have today, which are things like batteries,
grid enhancing technologies, flexible load, reconductoring transmission lines to get
more power over existing rights of way. So there's a
number of things we can do with technologies we have

(39:32):
today that are going to be very meaningful this decade,
and we should keep investing in things that are going
to be really meaningful next decade. I'm very bullish on
what we can do with new forms of nuclear technology.
They're just not relevant in the time horizon of
problem we're talking about.

Speaker 2 (39:50):
At some point, we're going to do
an Odd Lots episode specifically on the promise of small
modular reactors and why we still don't have them despite
the seeming benefits. But do you have like a sort
of succinct answer for why this sort of seeming solution
of manufacturing them faster, et cetera like has not translated

(40:11):
into anything in production.

Speaker 4 (40:14):
Well, quite simply, we just forgot how to do it.
We used to be able to build nuclear in this country.
We did in the seventies, we did in the eighties,
but every person that was involved in any one of
those projects is either not alive or certainly not still
a project manager at a company that would be building
nuclear plants, right? I think we underestimate human capacity

(40:36):
to forget things, right? Just because we've done something in
the past doesn't mean that we necessarily can do it again.
We have to relearn these things. And as a country
like we do not have a supply chain, we don't
have a labor force, we don't have people that manage
construction projects that know how to do any of these things.
And so when you look at what South Korea is doing,

(40:58):
you look at what China's doing, you know, they are
building nuclear plants with regularity, they're doing it at
a very attractive cost, they're doing it on a predictable
time horizon. But they have actually built all of those
resources that we just simply don't have in this country
that we need and we need to rebuild that capability.

(41:18):
It just doesn't exist today. You know.

Speaker 2 (41:19):
One of the things, when we're talking
about utilities, they're like weird companies because they're not like
normal businesses. They're sort of natural monopolies. Their price setting,
in my understanding, is based on how much they invest,
and so they have to then petition some local regulator and say, look,
we had to invest this much, and that's why we
want to raise prices this much, et cetera. Are

(41:40):
there regulatory hurdles or things about the regulatory system right
now that are going to make that doubling of demand
more challenging than it needs to be?

Speaker 4 (41:50):
Absolutely, And so you go back to the era that
we've been in of relatively no load growth.

Speaker 2 (41:58):
Yeah.

Speaker 4 (41:58):
You know, if you're a utility regulator and a utility comes
and asks you for a billion dollars for new investment,
and you're used to saying no, used to saying, well,
wait a minute, why do you need this?
What is this for? How is this going to help
you know, manage again, reliability, cost, predictability, et cetera. Now

(42:19):
you're in this whole new world and going back to
this concept of like we easily forget things. No one
who's a regulator today or the head of a utility
today has ever lived through an environment where we've had
this massive expansion of the demand for electricity. So everyone
now, including the regulators, is having to relearn, okay, how

(42:40):
do we enable utility investment in a growth environment. It's
not something they've ever done before, and so they're having
to figure out, Okay, how do we create the sort
of the bandwidth for utilities to make these investments, because
one of the fundamental challenges that utilities have is that

(43:01):
they struggle to invest if there's no customer sitting there
asking for the request, right, so they can't sort
of invest. I mean, if I'm Nvidia and I'm
thinking about the world five years from now and think, wow,
how many chips do I want to sell in twenty thirty,
I can go out and build a new factory. I
can go out and invest capital, and I can go

(43:23):
do all this. I mean, I don't need to have
an order from a Microsoft or an Amazon or a
Meta to go do that. I can build speculatively. Utilities
can't really do that. They're basically waiting for the customer
to come ask for it. But when you have all
this demand show up at the same time, well, what happens?
The lead times start to extend and so instead of

(43:44):
saying yeah, I'll give you that power in a year
or two years, it's now like, I'll give it
to you in five to seven years. And so that's
an unsustainable way to run the electric utility grid. So
we do need regulators to adapt and evolve to this
new era of growth.

Speaker 3 (44:00):
This is actually exactly something that I wanted to ask you,
which is we're sort of used to at this point
when we talk about industrial policy, the importance of an
end buyer for whatever capacity that we're building out, and utilities,
you know, to some degree, have struggled with that in
recent decades, at least this idea that they have huge

(44:23):
investment requirements. And while there is clearly demand for electricity
and maybe new types of electricity, it's not always certain
and you're sort of managing these day to day cycles
and things like that. But if we know that AI
is booming, and we know this is a future area
of growth, and we see these headlines like AI servers

(44:45):
are going to require like one hundred terawatt hours per
year and things like that, does that potentially give utilities
more certainty or more confidence in the future investment outlook.

Speaker 4 (44:58):
I mean, I would say in some respects it does.
I mean, there certainly is. And I've been spending a lot
of time with utilities, well, for most of my career,
but even in the last several months, having this conversation
about how they're thinking about this future growth. And
you know, they're struggling a little bit because like,
all they know is what the customers, you know, show

(45:21):
up at their door and say that they want. Right,
they say, well, okay, I talked to XYZ
data center and this is what they say they want.
But they don't necessarily have view to the long term,
like what really is the demand behind that? Like I'm
getting a request because one data center bought one parcel
of land and they need five hundred megawatts of power,

(45:42):
and then they're trying to extrapolate from that, well, what
is that underlying demand for data? Right? How much more
growth should I expect after that? And that's where the
utilities I think are really struggling, is that they
can't see much beyond the requests that they have,
and so they're trying to then extrapolate, okay, what
are these trends, you know? And really

(46:03):
the only way to get a good sense
of the real demand for data and the trends
is you have to actually go back, probably,
to the Nvidias and the Intels of the world and
go what's the forecast for chip sales? Like, what's the
forecast for how many chips you're going to make? Not
I mean not even sales, but really how much they
produce because frankly, I think every chip they can produce,

(46:26):
it will get plugged into something. Someone will buy it
and it will get plugged in. So that's probably
the best estimate that you can come up with for
what utility load growth should look like, at least as
it relates to data centers, right? But you know, you
have thousands of utilities in the United States, so you
don't have you know, there's not even like a single
source you can go to to say, Okay, what's the

(46:46):
forecast next year for electricity loads? Like nobody has that.
I mean people, there's numbers out there, but they're not
really based on anything other than speculation. So this is
the challenge that utilities have, is that they don't have a
good view into what load growth really is going to
look like over the next five, seven, ten years.
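
As a rough, purely hypothetical sketch of the kind of back-of-the-envelope estimate he is describing, where none of the numbers come from the episode and all of them are assumptions, a chip shipment forecast could be translated into implied grid load something like this:

```python
# Hypothetical back-of-envelope: convert an assumed AI accelerator shipment
# forecast into implied data center power demand. Every number here is an
# assumption for illustration, not a real forecast.
chips_shipped_per_year = 2_000_000   # assumed accelerators shipped and deployed
watts_per_chip = 700                 # assumed average draw per accelerator, in watts
overhead_factor = 1.5                # assumed multiplier for cooling, networking, etc.

added_load_mw = chips_shipped_per_year * watts_per_chip * overhead_factor / 1e6
print(f"Implied new continuous load: about {added_load_mw:,.0f} MW "
      f"({added_load_mw / 1000:.1f} GW)")

# Annual energy if that load ran around the clock:
annual_twh = added_load_mw * 8760 / 1e6
print(f"Roughly {annual_twh:.0f} TWh per year if fully utilized")
```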

Speaker 2 (47:09):
Brian Janous, fascinating conversation. There's probably like ten more follow
ups that we could do specifically with you, and maybe
one day we'll do them. But in the meantime, thank
you so much for coming on, Odd Lots. This is
a great conversation that we definitely needed to get done,
so really appreciate you joining it.

Speaker 4 (47:26):
Thank you, Joe and Tracy. Really appreciate it.

Speaker 2 (47:41):
Tracy, I thought that was great. I think actually the
first thing that sort of stands out from my mind
sort of like working backwards through the conversation, is just
sort of exactly what you talked about, which is that
there is this weird situation where you have this very
unpredictable demand. No one knows like what the steady state
demand is going to be for this stuff, and yet

(48:02):
the utilities are sort of legally constricted in the degree
to which they can, say, overbuild now or sort of
operate or plan for that demand.

Speaker 4 (48:11):
No.

Speaker 3 (48:11):
Absolutely, And also, well, going back to the beginning of
the conversation with Brian, the idea of a mismatch between
just how fast technology is going at the moment in
terms of developing AI versus utilities and their you know,
ten year investment programs that they need to get regulatory
approval for and all of that stuff. Now, there was

(48:34):
so much to pick out from that conversation. I also
thought it was interesting. So I think there is a
sense among a lot of commentators that there is going
to be competition for power at least at certain times.
But I thought Brian's point about how, in some respects
data centers might be the easy target for politicians to

(48:55):
kind of ignore, I thought that was really interesting. And
again his example of, well, if you know, if
you're a governor or something, and there's a Tesla factory
that wants energy versus a data center that probably has
I don't know, like a handful of employees. Maybe that's
an exaggeration, then you're gonna go with the Tesla factory.

Speaker 2 (49:13):
Right, totally, so right, You're not going to shut down
the factory that employs people. You're not gonna politically tell
people to go without air conditioning on a hot day.
The data center is going to be the first target.
I thought that was interesting. You know, again, I do
think it's like striking, and I think this is not
even just in the energy context. But I'm still
sort of fascinated by this idea that like OpenAI

(49:36):
was this company, I think it was founded in twenty sixteen,
and people saw GPT-1 and GPT-2 and then
GPT-3, which came out before ChatGPT.

Speaker 3 (49:45):
But it was.

Speaker 2 (49:45):
Really like that day. I mean, it was like that
day that ChatGPT was announced. Even though like the
technology was in development, there were also theories and stuff.
It was like that day of the commercialization of the
productization of this technology where everyone woke up and all
these different companies were like, we're in like a totally different
new world and we have to revisit all of these
investment decisions, whether it's on chips or energy that we

(50:08):
had made maybe just a year ago.

Speaker 1 (50:10):
Yeah.

Speaker 3 (50:10):
It's almost like, bullwhip effect isn't the right term,
but I'm just thinking the utilities in some respects are
at the very end of that sort of demand cycle, right,
So even the tech companies woke up to it very
very suddenly, the boom in AI and how fast this
was all going to come about and all of that,
and the utilities are sort of the last ones to

(50:32):
know in that respect, and we're expecting them to react
very quickly to it. It's kind of funny.

Speaker 2 (50:37):
The other thing too, is like it'll be fascinating to
see if some of these net zero commitments just have
to give, yeah, or something is going to happen there.
It sounds like the rubber is going
to meet the road. But it does not sound like,
in the short term anyway, that there is a way
to accommodate this much increased demand with renewable energy. It

(50:58):
doesn't seem like it. And so like something is gonna
it seems like something's gonna have to give.

Speaker 3 (51:03):
I think we're back to the very start of this conversation,
which is the idea of we have these two very
different paths where in an ideal world, if everything goes perfectly,
you have all this new commercial interest in technology that
requires a lot of energy usage, and so some of
those dollars get diverted into building out additional capacity in
terms of energy and maybe even additional green capacity. But

(51:28):
the other path is kind of depressing, where you have
a bunch of big tech companies that feel existential pressure
to do whatever it takes to win the AI race,
and maybe whatever it takes includes getting energy through coal
or something like that.

Speaker 2 (51:42):
You know what I think is interesting and it's sort
of it hadn't really clicked to me, but Brian talked
about how, you know, after the early eighties, the US
basically stopped building nuclear and we're like, oh, you know,
it's like a big mistake. Why did we stop building nuclear?
But you could sort of understand it in the context
of very little growth, right, So why make these like
really big investments in anything when obviously at the time

(52:06):
there wasn't as much concern or awareness about climate change
and the effects of fossil fuels, and there just wasn't
much demand growth, So why make these big things? And
so you think about like South Korea and China having
never really slowed down on the nuclear construction, but they're
also because they're developing countries or poor countries becoming richer,
they never presumably had that sort of demand plateau just

(52:28):
by dint of having started from somewhere lower.

Speaker 3 (52:31):
You know what we need? We need ChatGPT
to design a small modular reactor, and then we need
a robot to build it. Yeah,
all right, well it sounds like we're probably far away
from that. Maybe one day, Okay, shall we leave it there?

Speaker 4 (52:46):
Let's leave it there.

Speaker 3 (52:47):
This has been another episode of the Odd Lots podcast.
I'm Tracy Alloway. You can follow me at Tracy Alloway and.

Speaker 2 (52:53):
I'm Joe Weisenthal. You can follow me at The Stalwart.
Follow our producers Carmen Rodriguez at Carman Erman, Dashiell
Bennett at Dashbot, and Kil Brooks at Kil Brooks. Thank you to
our producer Moses Onam. For more Odd Lots content, go to
Bloomberg dot com slash odd lots, where we have transcripts,
a blog, and the newsletter, and you can chat about
all of these things, including AI, energy and climate in

(53:14):
our chatroom, Discord dot gg slash odd lots, twenty
four seven with fellow listeners.

Speaker 3 (53:19):
And if you enjoy Odd Lots, if you like it
when we dive into the energy usage of AI, then
please leave us a positive review on your favorite podcast platform.
And remember, if you are a Bloomberg subscriber, you can
listen to all of our episodes absolutely ad free. All
you need to do is connect your Bloomberg account with
Apple Podcasts. Thanks for listening.