
March 5, 2025 • 36 mins

Energy-hungry data centers are on the rise. Power demand driven by artificial intelligence has been met by an increase in power purchase agreements (PPAs) for low-carbon energy. Meanwhile, DeepSeek has reduced demand through more efficient computations. So what is driving decision making at tech companies that work in the AI and data center space? At the 2025 BloombergNEF Summit San Francisco, Mark Daly, BNEF’s head of technology and innovation, moderated a panel titled “Data Center Dynamics.” This episode brings listeners that panel, which featured Steven Carlini, chief advocate of data centers and AI at Schneider Electric; Will Conkling, head of data center energy for the Americas and EMEA at Google; Kleber Costa, chief commercial officer at AES Corporation; and Darwesh Singh, founder and CEO at Bolt Graphics.

To learn more about BNEF’s Summits taking place around the world and to see recordings of BNEF Talks at previous Summits, head to https://about.bnef.com/summit/.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
This is Dana Perkins and you're listening to Switched On,
the BNEF podcast. Today we bring you a recording from
our BNEF Summit in San Francisco, which took place on
the fourth and fifth of February. The panel was titled
Data Center Dynamics. Now, certainly a hot topic in energy circles
has been the growth in data centers and how AI

(00:20):
has led to a rise in demand for power. In
order to meet this, big tech companies are looking for
solutions, and firms like Google are signing power purchase agreements
across an array of technologies like nuclear, long duration energy storage,
and even geothermal. While these data centers are undeniably energy intensive,
efficiencies can be found when it comes to reducing load requirements.

(00:42):
This is seen with the recent release of DeepSeek's AI
model in January twenty twenty five. On today's show, the
panelists discuss the increasing capital expenditure on data centers, as
well as what has been driving the decision making of
big tech companies when it comes to this spend, and
how potential bottlenecks might slow them down. The panelists include
Steven Carlini, chief advocate of data centers and AI at Schneider Electric,

(01:06):
Will Conkling, head of data center energy for the Americas
and EMEA at Google, Kleber Costa, chief commercial officer of
AES Corporation, and Darwesh Singh, the founder and CEO of
Bolt Graphics. The panel was moderated by Mark Daly, BNEF's
head of technology and innovation. For more information about BNEF
Summits taking place around the world, as well as our

(01:28):
upcoming event in New York on the twenty ninth and
thirtieth of April, and to view recordings from this and
other previous events, head to about.bnef.com/summit.
Right now, let's hear from our panel
regarding power demand and data centers.

Speaker 2 (01:52):
Thank you everyone very much for joining us here today.
We've heard an awful lot about the energy transition, like
at all BNEF events, and I imagine a lot
of events that everyone here goes to. It's a very
energy-focused group, a transport-focused group, but there are lots of big
transitions happening in the global economy. We're here today to
talk about the intersection of the energy transition with one

(02:14):
of the other really big transformations in the economy, maybe
the biggest one of our lifetime. If you believe some people.
So we're here to talk about data centers, energy demand,
artificial intelligence. I'm joined here today by Steven Carlini, chief
advocate of data centers and AI at Schneider Electric, Will Conkling,
head of data center energy for the Americas and EMEA at Google, Kleber Costa,

(02:38):
chief commercial officer at AES Corporation, and Darwesh Singh,
founder and CEO of Bolt Graphics. So the first thing
that I want to talk about is really why has
there been so much interest in data centers this year? Like,
let's just start from the beginning. What's causing it? I'll go
to you first, Darwesh. And if everyone can actually
just introduce themselves and kind of their view on the

(03:01):
industry in their first answer, that'd be great.

Speaker 3 (03:03):
Yeah. Thanks Mark.

Speaker 4 (03:05):
I'm Darwesh, founder and CEO of Bolt Graphics. We are a
semiconductor startup focusing on GPUs. My background is in building
data centers, so I share a lot of the pains
of the panel and of the industry in general. Mark,
I think, to your question, compute requirements have always been increasing.
This is, I think, not new to anyone. We

(03:25):
have phones now that are more powerful than PlayStation Fours
from twelve years ago. What is new, though, is the
demand for ten-years-in-the-future computing power right now.
And a lot of these AI companies, whether they're training models,
whether they're building data centers to host these models, or

(03:45):
whether they're making phones that can run inference on these things,
they want it right now. And so I think that
creates a lot of interest, a lot of hype. It
also creates an opportunity for new players in the market
to come in, whether they're DC builders that can build
data centers in six months instead of four years, small
modular nuclear reactor companies that are building these in seven

(04:07):
years instead of twenty five years. So I think it's
just a timeline thing: let's shift all of this left, I
want it right now. And what does that really enable?

Speaker 2 (04:14):
Okay. And so, Steven, could you give us a primer
on how Schneider Electric relates to this conversation, and when
did your job change in this new
hype cycle that we're going through?

Speaker 5 (04:27):
Yeah, Schneider Electric, if you're not familiar, is the largest
power and cooling solution provider for data centers in the world.
And as you said in an earlier panel, you know,
AI is not new. AI has been around for a while.
We've been talking to a lot of the hyperscalers. We
have staffs of people assigned to each hyperscaler, each large colo,
each large enterprise account. And we, you know, we noticed

(04:53):
five or six years ago, you know, these data centers
can't be built overnight. And five or six years ago,
a lot of the hyperscalers were coming to us,
not only talking to us about higher densities, but higher
densities at scales that we'd never heard of. You know,
twenty-megawatt data centers back then were
big data centers. They were talking one hundred, one hundred and
fifty, three hundred megawatt campuses back then, and we're like, wow,

(05:13):
this is a big change. So the densities,
you know, started to really change when Nvidia came out
with, you know, the A100s and the

Speaker 3 (05:24):
A100s.

Speaker 5 (05:26):
It was about three years ago, and the A100s
were kind of the first, you know, at-scale
deployments for a lot of the hyperscalers, and a
lot of those were air cooled. They were twenty five
kilowatts per rack. Then you saw the Grace Hoppers
last year, which were thirty six kilowatts a rack and seventy
two kilowatts a rack, and now we have the Blackwells. The

(05:48):
black Wells are one hundred and thirty two kilowats per rack,
which is really pushing the limit of what we can
get powered to these racks and cooling to these racks. Next,
they're working on Ruben, which is the next generation. They're
already we're already working on the designs for that at
two hundred and forty kilowats per rack. So it's just exponentially.

(06:08):
For years, it was just two socket x eighty six
pizza box servers in these data centers, and we were
doing ten kilo wats per rack. And in the last
four or five years, it's just exponentially, you know, increased,
and not just increased in small scale, but it's a
large scale that we talked about.
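The density figures Steven quotes compound quickly. A minimal sketch of the arithmetic (the generation labels and kilowatt figures are rounded from his remarks; the ten-megawatt hall is an illustrative assumption, not a real facility):

```python
# Approximate rack power densities mentioned on the panel (kW per rack).
densities_kw = {
    "x86 pizza-box era": 10,
    "A100": 25,
    "Grace Hopper": 72,
    "Blackwell": 132,
    "Rubin (planned)": 240,
}

DATA_HALL_MW = 10  # hypothetical 10 MW data hall, for scale

for gen, kw in densities_kw.items():
    racks = DATA_HALL_MW * 1000 / kw  # convert MW to kW, divide by per-rack draw
    print(f"{gen}: {kw} kW/rack -> ~{racks:.0f} racks for {DATA_HALL_MW} MW")
```

At Blackwell-class densities, the same ten megawatts that once filled a thousand-rack hall fits in roughly seventy-five racks, which is why the power and cooling plant grows even as the IT room shrinks.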

Speaker 2 (06:25):
Okay, so will you're you're coming from Google, the company
that started the all it was Google's R and D
Live a transformer and Google's obviously been a huge energy
procurement company for years and years though and when did
when did this translate into changes in your job? The
recent technology advances.

Speaker 6 (06:43):
Yeah, so the story for me is, I've been
at Google just over ten years, in some shape or form
buying energy for data centers, buying renewable energy
for data centers, and working with utilities and on energy supply.
And, you know, Google has been an AI-first company
since about, like, twenty sixteen, twenty seventeen. Our CEO has
been saying that for a long time, right? And we've
been infusing AI and machine learning into our products since

(07:06):
then in various ways that, you know, we probably
don't notice as users, but it's been there. And then
in the last, you know, eighteen months or so, we've
seen the emergence of, you know, sort of more consumer-facing,
pure ML and AI products. So,
like, I'd say mid twenty twenty three was kind of
a watershed moment for us, but we saw the green

(07:28):
shoots of this before that, when, you know, the grid
started to see demand growth, you know, generally across a
number of sectors and industries, right? From more manufacturing, and
more batteries in manufacturing, and more car manufacturing, and more
reshoring of a lot of stuff back to the US.
And that started to manifest itself, right, in utilities coming

(07:49):
to us and saying, you know, they have to build
out their transmission system in ways that they hadn't anticipated
in order to continue to serve load. And that's sort
of, you know, par for the course, business as
usual for us. You couple that sort of growth,
of going from, for the last twenty years through
twenty twenty three, zero point five
percent annual growth on the grid, to three, four,

(08:11):
now four and a half five percent annual growth from
the grid, and then you have an emergence of a
new sort of large you know, data center demand with
machine learning chips to what Steve has talked about with
with nvidious chips and then our own internal tens of
processing units. We saw in mid twenty twenty three the
sort of crossover point from you know, being able to

(08:32):
sort of generally source power where when we needed it
on a timescal that made sense to us, to an
acceleration of demand to what Darbush just said around you know,
needing it sooner, and then a push out of lead
times on some of the grid buildout and some of
and serving that demand by utilities, and it's a it
was kind of a collision of technology sort of product

(08:52):
development curves and infrastructure timelines that don't always mesh well.
And we've been kind of in that soup
ever since. And, I like to say, it's been
one of the more dynamic times in my career, and
I don't expect I'll see anything like it again. So.

Speaker 2 (09:08):
And clever, let's get the energy company perspective on this.
When did you start noticing a big change from what
was happening.

Speaker 7 (09:14):
Look, I think, if you hear my colleagues
here on the panel, it has been a lot
of changes over a very short period of time, a
lot of volatility. I would say, you know, I think,
I think you see these massive growth
projections over the past few years, and then one day you open

(09:38):
the papers and there's deep, deep seek there changing everything
and you don't know what is true what is not.
So everybody's trying to figure figure that out. Look the industry,
I'm with AES corporation. I've been at AS for for
about seven years there, but I've been in the energy
business for about twenty five years. I started my career
right when air Ron was the greatest thing on earth.

(10:00):
Everybody wanted to work for Enron. Not after
that; we all know what happened. And then
there's the boom and bust of the gas cycle. Haynesville
economics showed up with shale gas. A lot of
companies also went bankrupt, and volatility in the

(10:21):
energy sector disappeared for a long period of time, with
very small growth over the past five years or maybe
a little a little more, as Will said, is explosive demand.
I think it's been one of the most dynamics periods
of times of my career as well.

Speaker 3 (10:39):
But I guess the point here.

Speaker 7 (10:40):
Is that the energy the industry is not UH is
very familiar with challenges and how to overcome those challenges.
So I think my job really changed when when we
moved from being providers of projects of or or or
or or technology to partnering with some of these hyperscalers,

(11:05):
some of these large data center companies, and start putting
together solutions with them. And not only at AES.
We're one of the largest; according to BNEF, actually,
we're the largest provider of renewable energy to corporates over

(11:25):
the past three years, so we know how that works.
But we also own utility companies. We own utilities in
Ohio and Indiana, and we work very closely with those
hyperscalers and data center companies to meet their needs
at the utility level as well.

Speaker 3 (11:41):
But again, I think the biggest takeaway.

Speaker 7 (11:44):
It feels very, very uncertain in terms of demand projections.
We're going to talk about that later. But whatever projections
you look at, however you slice and dice it, the growth
is here, it's real, and I think the industry is,
as everybody said, set to meet those challenges and find solutions for

(12:09):
that growth.

Speaker 2 (12:10):
So this growth is real, we need to deal with it.
What's been a big question that I've been asked, actually,
a lot in my role, is: does this look different
than the data center growth that came before? With the
idea being, data centers have been sited close to populations
so that latency is low and your Netflix loads really quickly.
But that might not be the case for new

(12:32):
AI applications, where actually training doesn't need to take place
close to people. Maybe it'll take place in a far
away place. Darwesh, what's your sense of this kind of
regional dynamic?

Speaker 4 (12:45):
Yeah, definitely, AI is changing the requirements for where you
build data centers, where you can source power, and also
the conversation. Like, I think four years ago, if I
had a conversation with someone about sourcing, like, hydrogen power
for a data center, I'd be laughed at.

Speaker 3 (13:01):
And now I'd be laughed at, but like a lot less.

Speaker 4 (13:03):
I think I'd be pointed towards other areas maybe in
this direction.

Speaker 3 (13:08):
But definitely.

Speaker 4 (13:12):
The workload that's running in the data center does
impact really heavily what those requirements are, where I can
put that data center. And with training, let's
say, training is a batch workload, all right? HPC
supercomputing is also a batch workload. To maybe
use Jensen's comment, it can be in Antarctica somewhere, right?
It doesn't need to be very close to me. So, yeah, that

(13:35):
definitely does change. But I don't think it changes the
economics that much, because you still have to deliver a
lot of power to it. You still need a really
fat, multi-hundred-gigabit network link, and these requirements,
like, haven't really changed, I think. To be honest, like,
if you're going to build a lot of data centers,
you're already going in the direction of, hey, I need
to build these further and further away from large metropolitan

Speaker 3 (13:59):
So this was a problem, like, three, four years ago;
it's just worse now.

Speaker 2 (14:02):
Will, actually, I'm interested in you. You're the energy part of
the equation at Google. Can you give us a bit
of information: how does that factor into decision making around
where to put data centers? Is it decided where
it goes and then we need to find energy, or
is it part of the decision making process?

Speaker 3 (14:18):
Is that the first thing you decide on?

Speaker 6 (14:21):
Yes and yes. You know, historically, so before, you know,
before the growth in the grid that I talked about
kind of started, you know, it's not quite true, but
you could almost, like, throw a dart at a map,
and, you know, if you were somewhere within a reasonable
distance of a metropolitan area, you could probably find power,
and a utility would at least build something for you
in a couple of years, right? And that was

(14:41):
generally okay. You know, today, available grid and available generation
on that grid are more scarce, at least for
the, you know, short to medium term, right? And so
you have to be sort of smarter and better at
picking locations that have available power in the timescale
you're looking for, and sort of act quickly
to go and reserve it and use it, right? So,

(15:04):
so, yes, it's a more important factor
for us than maybe it was in the past. But,
I think, Darwesh, to your last point there,
it's like, if you go in the middle of nowhere,
like, there isn't power, there isn't fiber, there
aren't people to build things, right? These things
require a certain amount of infrastructure and civilization around them

(15:26):
to support them, right? And no one wants to live
next to it. You have to have people that work
there all the time, right? So there's
that factor. And then our products also, like,
don't want to be in the middle of nowhere if
they can help it. Because if you build a building
for a data center, and, so, I'll back up.
There are a few different
things that happen in an ML data center, right, or

(15:46):
a data center. One, you could be training, at Google
at least, an internal model. Two, you could have a
customer paying you to be able to train their model, right?
Or three, you could be serving a model, or
doing inference and serving AI to customers, right? Only that
first thing really is that flexible, because our customers still
want to be, like, within spitting distance of their
footprint on the cloud. And then for inference, we

(16:07):
still want to be having little latency service to our
to our consumers, right. And so the internal training, yes,
it's more flexible. But if you build a data in
the middle of nowhere just for that and then you
finish that job, you actually have a Strandard asset, right.
And so if you think about efficiency of capital and
how you want to be able to reuse your capital
and recycle it, you actually don't necessarily always want to
be just going far afield to to you know, the

(16:28):
antarcticas or the deserts. We're might to be power, but
no people or no no users or no customers. So
so we tend to still have to follow where our
products want to be. And and then within those you
know regions or those uber regions we have to we
have to go then you know, find power availability.

Speaker 2 (16:42):
Okay, great. And, Steven, you have great insight
into the entire kind of data center supply chain in
your role at Schneider. I'm interested, do you have any kind
of sense of how much of this new data center
build that we're seeing is specifically AI related?
Is that even something that makes sense? Is there an
AI data center versus something else, or is it just bits?

Speaker 5 (17:05):
Absolutely. The, you know, the servers are completely different, and,
you know, the power and the cooling is different. And
if you look at a data center, an AI
data center today, say it's a ten megawatt data center:
you know, five or six years ago, there'd be a
thousand IT racks and the data hall would be huge.
Now the data hall has seventy IT racks,
and you have all these chillers outside, you know, to
support the cooling. So it's much different. It's not

(17:27):
these Amazon warehouse type data centers anymore. They're
smaller, more confined IT rooms, with a lot
of power and cooling going through them. It's not a
place where, you know, it used to be you'd walk around
and, you know, place servers, and everyone had a
good time. Not anymore. It's a business.
But, you know, as we were saying, you know,
putting an asset where there's power,
everybody in the world right now is saying that

(17:52):
the data centers are going to go where.

Speaker 3 (17:54):
The power is.

Speaker 5 (17:54):
But what we're seeing, and the hyperscalers
are all doing this, as you just said, is
they're building these training clusters that are close to
where people are, with the intention of using those to
make money. There's not a lot of money being made,
you know, training a model, and the question is how
many more of these models are we going to need.

(18:15):
So the money is going to be made in deploying
them in the field. And we're seeing a big shift,
and this, in my opinion, is kind of the year
of inference. And we're starting to see, you know,
a lot of these training clusters that were originally deployed
just to train are now doing inference, either full time
or part time. And the other thing that we're seeing
is, you know, inference close to the users, optimized

(18:38):
for different applications. We're not seeing those yet, because AI
is still developing. We're still at the beginning of this.
We don't know, you know, what it's going to take
to do an AI agent and agentic AI. You
know, what's that going to take? It's going to be
multimodal. How much processing, how much of
the IT stack is going to be needed, and where,
as we start inputting more and more video.

(19:00):
right now everything's text to text, you know, multimodal, it's
going to be video, tech, text, image, it's going to
be all these different things. So we can't optimize the
data centers close to the users yet for inference, we're
still going to have to do those in data centers.
I think that's going to be the case for a
few years.

Speaker 2 (19:18):
Okay, so things are not really changing too dramatically. It's just
build close to the users, like always.

Speaker 3 (19:25):
So, Kleber.

Speaker 2 (19:25):
Actually, something I want to ask you about. I'm from
Ireland, and so when I hear data center, I think,
oh my god, it's destroying the power system. There's, like,
a moratorium on new data centers there, because it's such a
large share of the power system. And there's a couple
of regions in Europe where something like this has happened,
and now there have been conversations about whether this is going to
happen in the United States. Kind of seems like that's

(19:46):
calmed down in the last few months. But I'm interested to
hear your thoughts on how big a challenge this will
be for your business.

Speaker 7 (19:54):
It's hard to talk about that without politicizing things. But
I think, look, there will be parts of the country
where there would be some resistance to data
center deployment. We are seeing some of that in the
Southeast and in other parts. I think
the real answer to that is all going to

(20:15):
depend on the solutions that we bring to that
data center load growth.

Speaker 3 (20:21):
When you say we, do you mean as or do
you mean we.

Speaker 7 (20:24):
The providers of energy, together with the data center
operators and the hyperscalers. I think a
lot of what we talked about here is that there
was a time, not long ago, there
was this idea that, because of the LLM
training phase, a lot of data centers, large data centers,

(20:46):
will co-locate with generation in places where, as
Will was saying, there's no load, there's no infrastructure,
there's no fiber, so you have to make up
for the lack of all that with low cost of power.
But when you look at
the amount of capital that you need to deploy in

(21:07):
those data centers, you don't want to run the risk
of being stranded after the training phase of the
large language model ends. So you want to
use that for something else. So we talked a lot
about this, that you end up going back to
where the traditional data center markets are, and in some

(21:29):
of those markets we are seeing local resistance.

Speaker 7 (21:33):
We're also seeing local resistance to the development of power
plants to supply those data centers. The famous NIMBY
business: not in my backyard. So I think that is
a challenge that the industry, both the energy industry
and the data center industry, has to overcome. It is

(21:54):
a real challenge, uh. And I think it's one that
will be will be met with deployment of transmission, because
I think that there's more flexibility where you can deploy
data centers than there will be on where you deploy
the generation, whether it's gas or or renewable. I hopefully

(22:14):
we don't get to new coal to supply
this demand. And the bottleneck here is actually transmission, to
get from the generation sources to
the data centers.

Speaker 2 (22:27):
Okay. And, Will, I'm actually going to ask you about this. Google
has been quite active in trying to develop new sources
of clean energy since before this whole AI drive became
a thing, but you signed a couple of pretty
first-of-a-kind PPAs in the last couple of years.
It'd be interesting to hear how progress there is developing.

Speaker 3 (22:47):
Yeah.

Speaker 6 (22:48):
Sure. So the history of, you know, energy consumption and
energy generation at Google is a long one. But the short
story is this: since I've been there, the last, you know,
ten and a half years or so, our energy
consumption globally has grown between twenty and twenty five percent a year.
We're now approaching thirty-plus, maybe more, terawatt hours of
energy, you know, consumed every year. That's doubling every, you know,
(23:12):
five years or so. And we've also signed, you know,
twenty-plus gigawatts of renewable energy generation, you know,
contracts, with folks like AES and others, to
help supply energy to our facilities. We maintain
a strong commitment to our clean energy goals,

(23:32):
and we have an hourly carbon-free energy goal
by twenty thirty that we continue to chase, and
chase very vigorously. And to that end, you know,
the story there is you can get to about
seventy to eighty percent carbon-free energy supply, you know,
through wind and solar and batteries, and sort of, like,
the mix of things there. But that last twenty to

(23:55):
twenty five percent, that last mile, is harder, because
you have to start thinking about capacity and baseload and
reliability and this sort of stuff. And so we've spent
the last few years thinking about and working on, you know,
sort of, what are those next-gen technologies, after wind
and solar, that are going to be carbon free and
start to supply, you know, up to that nearly
one hundred percent carbon-free energy. And for us, it's

(24:17):
things like nuclear power, it's things like long-duration storage, it's
potentially hydrogen, don't laugh too hard, it's potentially, like, carbon
capture and storage, and geothermal. And we've done a couple
of these in the last couple of years. We did
a deal in Nevada between us and a company called Fervo,
who's a geothermal developer. They're using our cloud technology to

(24:39):
optimize how they drill wells and operate their wells and
operate their plants, to then build advanced geothermal plants to
sell that power to NV Energy, the utility, with whom
we designed a tariff, a new rate that the
regulator is approving, that's going to assign, or
allocate, the costs of that geothermal, above and beyond sort
of business as usual, to us, the customer, right, so

(25:01):
that we get the product we want,
the grid gets a clean baseload generation source, and other
ratepayers don't pay the cost. So that's something
we're really proud of. And that's a model, you know,
that sort of rate structure, that you can slot
different technologies into, is a model we're working on with
a lot of other utilities across the US. And then
the other is a partnership we recently signed with

(25:23):
a small modular reactor technology provider called Kairos Power. Kairos
is developing Gen IV small modular reactors. They have a
pilot planned for twenty twenty nine, for which we're going
to be a customer, in the Tennessee Valley. And in
addition to being a customer for the pilot, to help
get that commercialized and off the ground, we committed to
being a customer for their next five reactors. And what

(25:46):
we get out of that is, you know, some confidence
of having access to power, assuming that everything goes well
with their technology, but also the ability to sort of
partner with them to site those next reactors in places
that, you know, hopefully make sense for us and
our loads and our data centers, to then meet that
hourly carbon-free energy goal in places we have growing load.
And so we expect in the twenty thirties to be

(26:08):
deploying small modular reactors onto the grid. It's not going
to be, like, you know, an Antarctica small modular
reactor data center behind-the-meter microgrid thing. That's
not the plan. It's really meant to be a grid
participant, putting capacity on the grid to supply our needs.
And so we're excited about that as well.
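The growth rates quoted on the panel translate into doubling times via the standard compound-growth formula. A minimal sketch (the rates are rounded from the panelists' remarks; the formula itself is textbook arithmetic, not anything Google-specific):

```python
import math

def doubling_time(annual_growth: float) -> float:
    """Years for a quantity growing at a fixed annual rate to double."""
    return math.log(2) / math.log(1 + annual_growth)

# Rates mentioned on the panel: ~0.5% historical grid growth,
# ~5% grid growth now, and 20-25% for Google's own consumption.
for rate in (0.005, 0.05, 0.20, 0.25):
    print(f"{rate:.1%}/yr -> doubles in ~{doubling_time(rate):.1f} years")
```

At twenty to twenty-five percent a year, consumption doubles roughly every three to four years, in the ballpark of the doubling Will describes, while the grid's shift from half a percent to around five percent annual growth cuts its doubling time from about a hundred and forty years to about fourteen.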

Speaker 3 (26:25):
Yeah, okay, great.

Speaker 2 (26:27):
So something someone alluded to earlier was the topic of
energy efficiency, and the word DeepSeek, which has
come up in the room.
Does everyone know what I'm referring to when I
talk about DeepSeek? Yes? Okay, great. Darwesh, I'm going
to go to you here, because we were having a conversation about whether
this was something that everyone saw coming. Not necessarily the
idea that, like, DeepSeek was going to release a
model and this would be the exact market reaction, but
that there were going to be big gains in energy
that there was going to be big gains in energy

(26:56):
efficiency improvements. This has obviously been the history of data
centers for ages, that energy efficiency has improved. So, like,
why were people so surprised by this? And what's the
future of energy efficiency in artificial intelligence? Simple question, if

Speaker 3 (27:09):
you could just answer it. Great question.

Speaker 4 (27:15):
I think technology comes in waves, where, like, you make
really good hardware, and then you try to extract performance
out of the hardware as much as you can. When you
reach the limit of what you can do in software space,
you go back and make better hardware. And ideally that's,
like, a very quick cadence of, like, hey, I spent,
let's say, one to two years making hardware,
one to two years making software, and then I find

(27:35):
the holes in the hardware and I make it better, right?
So I think it's just mostly
timing, like, this happened now. I think, yeah, the
expectation is that, hey, I can only buy a
certain number of Nvidia GPUs. What can I do with this?

Speaker 3 (27:46):
Now?

Speaker 4 (27:48):
Let me hire highly specialized PTX programmers that don't write
CUDA code. They write a level below that, because CUDA doesn't
solve the problem that I need to solve. It's too abstracted,
it's too power hungry, right? So
I don't get enough control over the hardware, so
I need to go to a lower level. You keep
going down that stack, you get down to hardware, then
you go to TSMC, right? Then you

(28:09):
go down and you get to, like, minerals and things like that.

Speaker 3 (28:11):
So you can keep going down that stack.

Speaker 4 (28:14):
But definitely, I think, what DeepSeek proved, and if
you read the paper, on the last page
there are suggestions on how to improve Nvidia GPUs, which
is really interesting, because this isn't just how to use
them, it's how to redesign them. Yeah, the micro-architecture of Nvidia
GPUs is not optimal for what DeepSeek wants, basically,

(28:34):
which is interesting, right, because that's the cycle
we're going to go through. And so, not related, but,
you know, the GPU that we're designing solves those problems.
We did our own benchmarks three years ago and we
found out some of the problems that DeepSeek found
out as well. So there are ways to improve hardware. Honestly,
like, the vendor should do, you know, benchmarking and research

(28:55):
and make the GPUs better themselves. But yeah, it does
require some interactivity with an end user that's like, hey,
I want this to be better. Here's, like, four things
you can do to make it better. And that will
continue happening, right. People will make new GPUs, new accelerators,
people will make co-packaged optics, they'll do all sorts
of fancy stuff, and people will use it. But, like,
I actually don't like the way this is running. It's

(29:16):
actually too slow for my use case. Can you fix
this thing? So I think this is, like, normal, but
I think there's so much focus on the volume of
GPUs shipped that I think people forgot that there's still
optimization room. There's a lot of headroom in optimizing software
for that.
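The speaker's point about software headroom, that the same workload can cost very different amounts depending on how close the code sits to the hardware, can be illustrated with a toy sketch. The Python example below is an assumed stand-in chosen by the editor, not anything from the panel (real GPU optimization, CUDA versus hand-written PTX, operates at a far lower level): it computes the same dot product two ways, one fully in the interpreter and one pushing the loop into built-in machinery.

```python
import timeit

def dot_index(a, b):
    # High-overhead version: every index lookup, multiply, and add is
    # dispatched through the Python interpreter.
    total = 0.0
    for i in range(len(a)):
        total += a[i] * b[i]
    return total

def dot_zip(a, b):
    # Same arithmetic in the same order, but iteration is pushed down
    # into C-level builtins (zip/sum), cutting interpreter overhead.
    return sum(x * y for x, y in zip(a, b))

a = [float(i) for i in range(10_000)]
b = [float(i) for i in range(10_000)]
assert dot_index(a, b) == dot_zip(a, b)  # identical results

t_hi = timeit.timeit(lambda: dot_index(a, b), number=50)
t_lo = timeit.timeit(lambda: dot_zip(a, b), number=50)
print(f"interpreted loop: {t_hi:.4f}s, builtin loop: {t_lo:.4f}s")
```

The gap here is modest; on GPUs, where memory access patterns and instruction scheduling dominate, dropping a level of abstraction can buy far more, which is the headroom the speaker describes.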

Speaker 2 (29:29):
Okay, how does that kind of coordination
work between the software developers and Nvidia? Like, is there?
Do they have channels to work on this together? Does
Nvidia have their own internal research teams? Yeah, it's called
hardware-software co-design. Some companies do it much better
than others. We do it the best. Yeah, we do
a good job.

Speaker 4 (29:50):
But yeah, no, they have teams internally that are building
foundational models at Nvidia, train them on supercomputers, AI clusters,
and then they're finding these things. But I think it's
interesting that, like, your next question is, well, why didn't
they find this out, you know, last year when they
made the hardware? Well, it's the same thing. Why is
Blackwell one hundred and thirty-two kilowatts per rack
instead of one hundred and twenty-five? So there are,
like, there are fuzzy zones where you can miss things,

(30:13):
and I'm sure I'll miss things, and it happens. But
I think the magnitude of that, coming out so aggressively,
saying, hey, we don't need this much computing power, and
also there are things in the GPU at the microarchitecture
level, in the silicon, that I don't like that
I want you to change,

Speaker 3 (30:28):
is a step shift, because.

Speaker 4 (30:30):
Now you're expecting every customer to go down to that level.
And I think that's the race now: how
optimized can you get with one watt, one hundred watts,
or one gigawatt, or whatever?

Speaker 2 (30:41):
Do you have a kind of benchmark in your own
mind internally of, this is the energy standard that, like,
a query in ChatGPT was a year ago? How
much more energy efficient can we get than that? Is it
orders of magnitude, or.

Speaker 3 (30:54):
Orders of magnitude. Orders of magnitude.

Speaker 4 (30:56):
Absolutely, yeah. I think this conversation is good because everyone
in the panel is like, well, we're delivering power, that's great,
that's a problem. But also, like, I'm a chip guy,
like, we should make better chips that consume less power.
Perhaps maybe there's, like, a push-and-pull balance there
of, you know, like, the TPU is doing a good job, right?
That's orders of magnitude less power consumption, more efficient
than an Nvidia GPU. It is domain-specific in

(31:17):
that sense, but it does solve the problem and it's
more efficient. So you will see, like, chip startups,
AI startups, co-packaged optics startups, all these
companies competing and being able to deliver
orders of magnitude better efficiency.

Speaker 3 (31:31):
That's what we're doing.

Speaker 2 (31:32):
Okay. So for the three other panelists, I'm going to
ask you the same question: did you see this coming? Well, actually, sorry.
The first question is, how do you operate, when, like,
you need to make these big decisions about what to
build and what to allocate resources to, under this level
of uncertainty around energy efficiency improvements? But then I guess
the second part of the question is, does everyone who
works in the industry kind of assume there's gonna be

(31:54):
these energy efficiency improvements, and you're kind of making decisions around that?

Speaker 5 (31:57):
So, to you first, Steven. I think everyone expected
more efficiencies to be gained in the transformers
and the algorithms. But, you know, I think
we had a panelist earlier that said, you know, all
the models that have been trained have been
trained on the public data, and there's all this other,
you know, private data that's actually going to be, you know,

(32:18):
more beneficial. But all I can say about that
is that this is probably the most confusing topic, with
all the experts that are weighing in saying completely
different things. But we haven't seen any reaction, negative reaction,
from our customers as far as, you know,

Speaker 3 (32:36):
orders or forecasts. It's all, you know,
business as usual. There's been, like, no effect at all.

Speaker 6 (32:43):
Yeah, well, I think, you know, at Google, look,
training models and software and hardware are
by no means my own expertise, and so
I don't have a lot to offer here except
that I think we always expect efficiency gains. I think
we welcome them. I think we're seeing the same sort

(33:04):
of efficiency gains in our own, you know, model building
and model training, and, you know,
to Steven's point, it hasn't really changed our
outlook on our business planning, because training
is only part of, you know, the AI story and
the machine learning story. The serving and the inference is

(33:26):
also a big part of it. And, you know,
don't forget that Google does myriad other things
in data centers, right? And so ML
and AI are but a portion of our
forecast and our outlook, and so we
remain sort of on a stable path.

Speaker 7 (33:41):
Yeah, I think the same thing here. I haven't seen
any real change, perhaps with the small exception
of investors' overreaction to headlines, but the
fundamentals of the business remain very strong. Let's not forget
here that a lot of what we're trying to solve
for in the power markets is also replacement of aging

(34:03):
infrastructure in the US, especially old coal generation and gas generation.
So efficiency gains are definitely welcome, because what that's going
to do is make sure that we don't overbuild.
The worst thing that could happen here is if the
market starts overbuilding generation, transmission, distribution, and ratepayers

(34:26):
get left with massive bills and the demand doesn't show up, right?
That would be a cycle of boom and bust.

Speaker 2 (34:32):
Right.

Speaker 7 (34:32):
So we believe in competitive markets. We believe in market efficiency.
And I think it's welcome if the efficiency gains
on the computing power side
bring demand back to a more reasonable level. But no
matter what the projections are, the numbers are staggering. The
projections for demand for new generation and new

(34:55):
infrastructure for transmission are staggering. So we just hope
that we don't overbuild. I guess that's right.

Speaker 3 (35:02):
Okay.

Speaker 2 (35:02):
So I have one quick-fire question, which you're only
allowed to answer with one number: by twenty thirty,
what percentage of US power demand do you think will come
from data centers?

Speaker 3 (35:12):
No explanation, just a number.

Speaker 2 (35:15):
There's a networking session after. Seven and a half. Well, six.

Speaker 7 (35:21):
Ish. Yeah, I think I'm perhaps a little
more bullish than that. I think it would be
like somewhere between eight and ten percent.
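For scale, the quick-fire answers (seven and a half, six, and eight to ten percent) can be converted into rough annual terawatt-hours. The sketch below assumes total US electricity demand of about 4,000 TWh per year; that total is an illustrative round-number assumption by the editor, not a figure given by the panel.

```python
# Convert the panelists' quick-fire shares into rough absolute demand.
# US_DEMAND_TWH is an assumed round number for total annual US
# electricity demand, not a figure from the panel.
US_DEMAND_TWH = 4_000

def data_center_twh(share, total_twh=US_DEMAND_TWH):
    """Annual data-center demand implied by a given share of total load."""
    return share * total_twh

for label, share in [("7.5%", 0.075), ("6%", 0.06), ("8-10% midpoint", 0.09)]:
    print(f"{label}: ~{data_center_twh(share):,.0f} TWh/yr")
```

Under that assumption the three answers span roughly 240 to 360 TWh per year, which is why the moderator notes the estimates are clustered.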

Speaker 2 (35:32):
Okay, so we're all pretty clustered around a certain range. But thank
you very much. This was a really informative panel. I think
we learned a lot about the idea that the data
center industry is going to grow. We're all very
confident on that, but actually it maybe won't change quite as
much in terms of its geographic structure or capital investment cycle.
So thank you very much for joining me. Please join

(35:54):
me in giving my panelists a round of applause.

Speaker 1 (36:06):
Today's episode of Switched On was produced by Cam Gray
with production assistance from Kamala Shelling. BloombergNEF is a
service provided by Bloomberg Finance LP and its affiliates. This
recording does not constitute, nor should it be construed as,
investment advice, investment recommendations, or a recommendation as to
an investment or other strategy. BloombergNEF should not be
considered as information sufficient upon which to base an investment decision.

(36:30):
Neither Bloomberg Finance LP nor any of its affiliates makes
any representation or warranty as to the accuracy or completeness
of the information contained in this recording, and any liability
as a result of this recording is expressly disclaimed.