
November 19, 2025 48 mins

Bloomberg’s Caroline Hyde and Ed Ludlow discuss what investors are keeping their eyes peeled for when Nvidia’s earnings come out. Plus, Nvidia CEO Jensen Huang and Tesla CEO Elon Musk speak at the US-Saudi Investment Forum. And Brookfield Asset Management targets $10 billion of fund commitments for a global AI infrastructure program in partnership with Nvidia and Kuwait's wealth fund.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Bloomberg Audio Studios, podcasts, radio news. Bloomberg Tech is live
from coast to coast with Caroline Hyde in New York
and Ed Ludlow in San Francisco.

Speaker 2 (00:22):
This is Bloomberg Tech. Coming up.

Speaker 3 (00:24):
All eyes on Nvidia earnings after the closing bell. Investors
are expecting to learn more about where those billions of dollars
of AI spending are actually going.

Speaker 4 (00:32):
Plus, Nvidia CEO Jensen Huang and Tesla CEO Elon
Musk are speaking right now at the US-Saudi Investment Forum.

Speaker 5 (00:38):
We'll bring you the latest, and.

Speaker 3 (00:40):
Brookfield Asset Management targets ten billion dollars of fund commitments
for a global AI infrastructure program.

Speaker 2 (00:46):
In partnership with, you guessed it, Nvidia. And.

Speaker 5 (00:50):
Nvidia dictates trade.

Speaker 4 (00:51):
Right now, we are seeing signs of stability in the
Nasdaq one hundred. More broadly, I remind you that almost
two trillion dollars have been wiped off this benchmark since
the end of October, in large part because Nvidia
has been down, but all the key Mag Seven
names have been under pressure. But today, some reprieve, Nvidia
in points terms helping within the Nasdaq one
hundred. Crypto, though, still in the eye of the storm,
off by two point eight percent, that anxiety driving it

(01:13):
below ninety thousand dollars, so still some risk aversion, Ed.

Speaker 3 (01:16):
Okay, Nvidia is up more than two and
a half percent, but off its session high. It is
a stock that's up almost forty percent year to date,
double the performance of what we've seen of the
Nasdaq one hundred and S and P five hundred. The
expectation is revenue growth above fifty percent, net income growth
above fifty percent. But all that matters is what CEO

(01:39):
Jensen Huang tells us about the future. Let's get to Bloomberg's
Ian King, who leads our semiconductor coverage. I mean, that's
what it comes down to. Either they will beat consensus
or they won't. But expectations are really high for this quarter,
as is the skepticism about what's happening bigger picture with
data center infrastructure. Give us the things we need to

(01:59):
look out for, and what's in your preview of this
company's earnings?

Speaker 6 (02:03):
Yeah, I mean the numbers speak for themselves, right. We're
looking for a prediction in the sixty billion range for revenue,
and just to give that context, that's ten x where
we were three years ago, ten x. Okay, on profit
for this year, we're going to be looking at one
hundred billion dollars of net income. That's more than Intel

(02:23):
and AMD get in revenue combined. So the numbers are
off the charts. The key here is we all
know the numbers are going to be good. We know
the forecast is going to be good. But the key is, well,
do we really believe the basis for those numbers? And
that's going to be the key question he's going to face.

Speaker 4 (02:40):
So what can his response be that's more than
what he already signaled at GTC, that he has
line of sight on half a trillion dollars' worth of
orders, not just Blackwell but Rubin, into twenty twenty six?
How much more can he signal that they will remain
integral to inference as well as training?

Speaker 6 (03:00):
Absolutely right, he's essentially played all his cards in that respect.
What's going to happen will be that the investment community
are getting their first chance to sort of pull that apart,
to ask him about the details, to ask him about
the new products, to ask about the margins, to ask
about when exactly these sales will kick in.

(03:20):
They haven't really had that chance. So that's what they'll
dig into today. And his response, how precise he
is, will condition how they feel.

Speaker 3 (03:28):
There are some real concerns and there are some real
questions for Nvidia. Some of those we'll get to pose
this evening to Jensen himself. There is the idea of
depreciation on older chips and circular financing that's just not going.

Speaker 6 (03:40):
Away, absolutely not. I mean, we saw a deal announced
with Anthropic yesterday, a big commitment to use a lot more
of Nvidia's chips. But guess what, Nvidia is putting
ten billion dollars to work in that company over time.
So yes, that concern is absolutely not going away. And
if anything it's going to accelerate, until we get a

(04:00):
clear outcome.

Speaker 4 (04:02):
Bloomberg's Ian King, who will be across all those earnings after
the bell alongside Ed. We so appreciate it. Meanwhile, look,
we know Wall Street is eagerly awaiting Nvidia's results
for a clearer picture on AI spending, because it impacts
the whole rest of the market. Bloomberg's US equities reporter
Carmen Reinicke is here to just bring us the context.
We know the question is gonna be asked of Nvidia,
but what does it signal about the commitment of the

(04:24):
Magnificent Seven, of the key hyperscalers, and more broadly the
rest of the AI trade?

Speaker 7 (04:29):
Yeah, so this is a huge moment for the AI trade.
I think a lot of people are really looking to
Nvidia and the stock's reaction to sort of decide the
next direction of where the market, the entire market, is
going to go. You know, Nvidia is the largest
weighting in the S and P five hundred and the biggest.

Speaker 5 (04:46):
Name in AI.

Speaker 7 (04:47):
And what we also know about Nvidia is that
its biggest four clients are some of the other Mag
Seven members. So we're going to see where their spending
is flowing, if it's still flowing to.

Speaker 5 (04:58):
Nvidia, how much.

Speaker 3 (05:00):
When I was reading your piece, which you co-wrote
with Ryan Vlastelica, the data is really interesting.
There's gonna be a big part of the Bloomberg Tech
audience that don't know some of that. So you just
talked about weighting, right, Nvidia is about eight percent
of the S and P five hundred. That's a factor.
You also looked at Nvidia's multiples relative to the
index level, the Nasdaq one hundred, for example. What are
the other key data points that have us on such

(05:23):
edge ahead of the earnings report later this evening?

Speaker 7 (05:26):
Well, you know, as Ian said, I think a lot
of people are really looking at the forward guidance. You know,
Nvidia is expected to continue to grow revenue, even
though that growth is expected to slow in the coming years.
I think the other thing that people are really looking
for is what Jensen's going to say about what they're
seeing in the future. Right? Also, as Ian pointed out,
the guidance is very important here, and that's, I mean,

(05:48):
probably even more important than the numbers that Nvidia
actually reports. We want to see the line of sight
into revenue growth going forward. There are also still some
questions about China, how much revenue can be expected there
or not. And so I think the sentiment here, what
investors take away about their confidence from the report, is
going to be maybe even more important than the actual numbers.

Speaker 4 (06:10):
Bloomberg's Carmen Reinicke, it's a great story. Thank
you very much indeed for bringing us the data as well.
And now we bring you the investment perspective. Marta Norton is
here with us, chief investment strategist at Empower, which administers
more than one point six trillion dollars in assets, and
that's about the same amount that's been wiped off the
Nasdaq one hundred since the end of October.

Speaker 5 (06:28):
Yes, is there room to the.

Speaker 4 (06:30):
Downside here or are you thinking there's some sort of
relief rally from any data we get from Jensen later?

Speaker 8 (06:35):
Well, I don't think you can ever count out a
relief rally. But what I come back to when we
look at the price action that we've seen over the
past few weeks is that we have taken some froth
off the top, but we're not looking at really attractive
valuations at this point. We're still at elevated valuations for
a lot of these names. And what's interesting, and it

(06:56):
was alluded to earlier, you know, the questions that people are
raising in the current environment, I don't think they're necessarily going
to be conclusively answered in Nvidia's report, right, the depreciation question,
the demand question, you know, extending to the rest of
the economy. I think those doubts are with us,
which would mean potentially more volatility.

Speaker 4 (07:17):
Well, if Jensen can't tell us how long his GPUs
are going to last and what the depreciation of them is,
I'm not sure anyone can. But you're so right that
that has been again an argument about why perhaps.

Speaker 5 (07:26):
Valuations are flush.

Speaker 4 (07:27):
If you look at Nvidia trading at about thirty times
future earnings, that's not that elevated. So is it the
rest of the trade, the Palantirs, or perhaps some of
the energy stocks that have risen to extraordinary degrees?

Speaker 8 (07:40):
Well, I think you raise a good point. It's not
that elevated, especially if you're focused on that right side
of the probability distribution, if you're looking at a full
probability distribution, and you come to the conclusion that I
don't know if I share, but some folks are raising, hey,
this isn't as transformative as people have suggested it is.
We're not going to see every application take off the

(08:02):
way people suggest it would. Then I think there's room
for some of these stocks to come down. Now, I'm
not sure I share that view. I tend to believe
that the AI supercycle is real, that it is going
to have a pretty profound impact on the economy, but
we have to make room for the full range of
outcomes when we price these things.

Speaker 3 (08:18):
Marta, it's great to have you back on Bloomberg Tech.
As is quoted in that well-read story on the
Bloomberg Terminal and on Bloomberg dot com, from one investor,
this is a quote, so goes Nvidia, so goes the market.
From your desk and your perspective, is that the
situation here?

Speaker 8 (08:37):
Well, I think there's no question that this is a
capstone report. It's a macro indicator, and I think has
the potential and you see that in pricing around it,
right this idea that you could have swings up to
seven percent either way. I think this is the kind
of thing that people are really going to key off of.
My question is whether this is going to answer all
the AI doubts that are out there or whether it's

(09:00):
just going to arrest them for now. But we're still
going to be wrestling with some of these questions going forward.

Speaker 3 (09:05):
Depreciation is the most common concern or question that I've
received for Jensen so far. You know, I've gone out to
people and said, what would you ask? There are people
that look at the depreciation issue and say, how can
I model for that impact on the balance sheet of
those key customers of Nvidia. But others on social media
are talking about a different data set, which is utilization.

(09:27):
If those older generation GPUs are running at one hundred percent,
that's a really good problem to have, right. This is
kind of very specific, but are those the sort of data sets
that your team are looking at to work out what's going on?

Speaker 8 (09:40):
Well, you know, I think the thing that we're focused
most on as we turn our gaze to twenty twenty
six is this question in particular about the capacity build
that we expect in twenty twenty six and frankly in
twenty twenty seven. So our view is that, you know,
we're going to have these questions to your point about utilization,
about depreciation, but our view is that those questions are

(10:02):
going to linger as we build out the capacity for
AI in twenty twenty six, in twenty twenty seven, because
there is so much that you have to put in
place to be able to see the demand come through.
And so for us, as we go through twenty twenty six,
we're going to be watching, of course, like everybody else.
Are we seeing that capex? Is the conviction still there
to build out the massive infrastructure that you need for AI,

(10:24):
And I think that's the key question that we're wrestling with.

Speaker 4 (10:27):
It was interesting that you said you don't align yourself
with the negativity around actual applications of AI, right? You
do think the supercycle's real?

Speaker 5 (10:36):
What data set are you looking at for that? Because we do.

Speaker 4 (10:39):
Have the MIT study saying pilots aren't working; there's always a counter
example for every time there's a negative. But Bloomberg Intelligence
just has some great analysis out showing that actually only ten
percent of companies at the moment, or people surveyed, are
saying that they're using generative AI for revenue or
for new product.

Speaker 8 (10:55):
Right, I think there's still, I mean, first of all,
if you take a look at kind of adoption rates
for the AI cycle relative to the Internet and things
like that, people are pointing out the Federal Reserve and
others that you've seen much faster acceleration of adoption. But
to your point, we're still very much early days, and
so I think one of the things that we look
at is just at the earnings season themselves and looking

(11:18):
at what companies are saying. And a lot of the
commentary around AI at this point is still very generic.

Speaker 5 (11:23):
It's not very.

Speaker 8 (11:24):
Specific in terms of how AI is actually transforming those businesses.
And I think also watching to see how earnings in
the broader economy are responding, Are you starting to see
those cost savings come through, which I think is of
course the first leg, and then ultimately you'd also want
to see the revenue which we're seeing from the hyperscalers.
But I think those, you know, that real world application
is really what we're going to be keying off of it,

(11:45):
and I think it's going to take time. I think
you still need to build that capacity to see the
earnings impact.

Speaker 4 (11:52):
Broadly speaking. Marta, it's great to have you,
and thank you for coming to us, Marta Norton,
chief investment strategist at Empower. Meanwhile, Nvidia CEO Jensen Huang
is already on stage ahead of his earnings publication.

Speaker 5 (12:02):
Of course. Meanwhile, Tesla CEO Elon Musk is.

Speaker 4 (12:05):
Alongside in DC as part of the US-Saudi Investment
Forum in Washington, D.C.

Speaker 5 (12:09):
Take a listen to this.

Speaker 9 (12:10):
His Royal Highness announced the AI Strategic Framework and partnership. Today
we're going big with Elon and Jensen.

Speaker 10 (12:17):
So thank you for those opportunities.

Speaker 9 (12:23):
Now they told me I have time for two last questions.
So last night at the dinner, I got a number
of questions because it seems that the schedule leaked and
everybody was giving me hints about the last two questions
I'm going to do. So the first one was for
you Elon, and there's a big one for you Jensen,

(12:44):
So prepare for that one.

Speaker 10 (12:47):
AI in space. Is that possible?

Speaker 11 (12:51):
Yes, if civilization continues, which it probably will, then AI
in space is inevitable. You know, I always have to
like preface that, you know, we shouldn't take civilization for granted.
We need to make sure to take care to ensure
that civilization has an upward arc. I mean, any student

(13:15):
of history knows that civilization does not always have an
upward arc, and in fact, civilizations have life cycles.
So hopefully we are in a strong upward arc. I
think we are for now, but we don't want to
take that for granted or be complacent. But the way
to think of AI in space is that
in order to achieve any meaningful percentage of a Kardashev

(13:37):
two scale civilization, where you're using even a millionth of
a millionth of the Sun's energy, you must have solar
powered AI satellites in deep space. So once you realize,
like once you think in terms of a Kardashev two
scale civilization, which is what percentage of the Sun's energy

(13:59):
are you turning into useful work? Then it
becomes obvious that space is overwhelmingly what matters.

Speaker 12 (14:08):
Overwhelmingly.

Speaker 11 (14:09):
So the Earth only receives roughly one two billionth of
the Sun's energy. So if you want to have something
that is, say a million times more energy than Earth
could possibly produce, you must.

Speaker 12 (14:27):
Go into space. And so.

Speaker 11 (14:33):
This is where it's kind of handy to have a
space company, I guess.
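
(For context, an arithmetic check added in editing rather than made on stage, using standard reference figures: the Sun's total output is roughly 3.8 x 10^26 watts, while Earth intercepts the solar constant of about 1,361 watts per square meter across its cross-section of roughly 1.27 x 10^14 square meters, or about 1.7 x 10^17 watts. That ratio is about 4.5 x 10^-10, roughly one part in 2.2 billion, consistent with the "one two-billionth" figure Musk cites.)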

Speaker 13 (14:38):
And you can cool chips in space too? Yes, easier
to cool chips in space.

Speaker 11 (14:43):
Yes, there's definitely no water in space, so you're gonna
have to do something that doesn't involve water.

Speaker 12 (14:49):
Well, it's just got to radiate, that's right.

Speaker 11 (14:52):
So my estimate is that actually the
cost of electricity, like the cost effectiveness of AI in
space will be overwhelmingly better than AI on the ground,
so far, long before you exhaust potential energy sources on Earth,

(15:14):
long before.

Speaker 12 (15:15):
Meaning, like, I think even perhaps in.

Speaker 11 (15:16):
The four or five year timeframe, the lowest cost way
to do AI compute will be with solar powered AI satellites.

Speaker 12 (15:28):
So I'd say not more than five years from now.

Speaker 13 (15:32):
Wow. And just look at the supercomputers we're building together.
Let's say each one of the racks is two tons.
Out of that two tons, one point nine five
of it is probably for cooling.

Speaker 8 (15:43):
Right.

Speaker 13 (15:44):
Just imagine how tiny that little supercomputer is, right, each
one of these GB three hundred racks and.

Speaker 14 (15:49):
Would just be a little tiny thing.

Speaker 11 (15:50):
And just electricity generation is already becoming a challenge. So
if you start doing any kind of scaling for both
electricity generation and cooling, you realize, okay, space is incredibly compelling.
So like, let's say you wanted to do I don't know,
two or three hundred gigawatts per year of AI compute,

(16:17):
It's very difficult to do that on Earth. So the
US average electricity usage last time I checked it was
around four hundred and sixty gigawatts per year average usage.

Speaker 12 (16:29):
So, so something like, say, you know, three hundred. If
you're doing three.

Speaker 11 (16:35):
Hundred gigawatts a year, that would be like two
thirds of US electricity production per year. There's no way
you're building power plants at that level. And then if
you take it up to say a terawatt per
year, impossible. Like, you have to do that in space.
There just is, there just is.

Speaker 12 (16:52):
No way to do a terawatt per year on Earth.
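
(Again for context, arithmetic implied but not spelled out on stage: three hundred gigawatts against the roughly four hundred and sixty gigawatt average US load Musk cites is 300 / 460, about 0.65, or two thirds; a terawatt is one thousand gigawatts, more than double that entire average, which is the basis for the "impossible on Earth" claim.)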

Speaker 11 (16:57):
And in space you've got continuous solar. You
actually don't need batteries because it's always sunny
in space, right, exactly. And the solar panels actually
become cheaper because you don't need glass or framing.

Speaker 12 (17:14):
And the cooling is just radiative. So that's why
I think.

Speaker 14 (17:18):
That's the dream. Yes, that's the dream.

Speaker 9 (17:21):
So Jensen, everybody last night was asking me, and I'm
mindful it's earnings day for you today. So I'm gonna
say this delicately. Everybody has been asking me to ask you.
Are we going to have an AI bubble?

Speaker 14 (17:35):
That's the last question?

Speaker 10 (17:37):
All right, let's not.

Speaker 13 (17:39):
All right, let me see. Well, let me just tell
you what we see, okay. So, so I think it's
really important, when you look at what's happening around the
world, to go back to the first principles of what's
happening in computer science and computing. There are three things
that are happening. The first thing is that we all know
that Moore's Law has run its course, and the
amount of demand for computing, since the amount of computation

(18:01):
we can get out of general purpose computing is really challenging,
and so the world's been moving to accelerated computing for
some time. We've been pushing this now for some over
twenty years. Let me give you one statistic. I was
just at Supercomputing. Six years ago, CPUs were ninety percent
of the world's supercomputers, the top five hundred supercomputers. Six years ago.

(18:23):
This year, less than fifteen percent. Went from ninety percent
to ten percent. And meanwhile accelerated computing went the
other way, ten percent to now ninety percent. Okay, so
you're seeing that inflection point, the transition in high performance
computing from general purpose computing to accelerated computing. Well,
one of the most data intensive, one of

(18:45):
the most intensive computation things that the world does in
cloud is data processing. Several hundred billion dollars of computation
is done on just raw data processing. It has nothing to
do with AI, just SQL processing, data frames. You know,
everybody's names, address, their sex, their age, where do they live,
you know, how much money they make. All of that
sits in a data frame, and that data frame drives

(19:07):
the world today, whether it's in banking or, you know,
whether it's in credit cards or of course e-commerce,
and everything from ad recommendation, everything is driven off
of that data frame. That data frame costs hundreds of
billions of dollars a year to go compute. And so that's the
number one thing, the end of Moore's Law. The second thing
is generative AI. The most important application of the last

(19:31):
fifteen years is called recsys, recommender systems. How do we
know what information to recommend to us in a social feed?
How do you know what ad to recommend to somebody,
what book to recommend, what movie to recommend? The
Internet is so gigantic that without a recommender system,
this little tiny phone of ours would have no chance
of ever seeing the right information. That recsys is the

(19:54):
engine of the Internet today. That's going generative AI. It
used to be running on CPUs, now it runs on GPUs,
which then sets up the third thing. If you just
look at those two applications, many of the Internet companies
can build an enormous number of GPU supercomputers just doing that.
Of course, then it creates the third opportunity on.

Speaker 14 (20:16):
Top of it, which is agentic AI.

Speaker 13 (20:17):
This is Grok, and this is OpenAI, this is
Anthropic, you know, this is Gemini.

Speaker 14 (20:22):
Agentic AI sits on top of that.

Speaker 13 (20:24):
But, you know, don't forget to think about what
is happening underneath what everybody sees as AI today:
there's a whole movement of computing from general purpose computing to
accelerated computing. And if you just, if you take
that into consideration, you'll come to the conclusion that in fact,
what is left over to fuel that revolutionary agentic AI

(20:48):
is not only substantially less than you thought, but all
of it justified.

Speaker 9 (20:53):
Well, I was just informed by the team that my
boss and your bosses are going to talk next, the
Honorable President and His Royal Highness the Crown Prince,
and hence we ran out of time.

Speaker 10 (21:05):
But in essence, this is ah.

Speaker 9 (21:10):
Such, so much love for you, Elon and Jensen. But this,
in essence, is a ninety-two-year alliance that has shifted from
energy to digital to the intelligence age, powered by pioneers
such as Elon and Jensen to serve humanity and create,
on a net new basis, new economies, new jobs and

(21:32):
a better future for humanity, powered by the Kingdom of
Saudi Arabia and the United States. Thank you for our
lifetime partnership and friendship. Thank you, Elon, thank you, Jensen.

Speaker 12 (21:42):
Thank you.

Speaker 3 (21:48):
That was Elon Musk speaking alongside Nvidia CEO Jensen
Huang at the US-Saudi Investment Forum in Washington, D.C. Probably, Caroline,
the biggest piece of news to come out of it was
Elon Musk confirming that xAI is going to do a
five hundred megawatt data center in the Kingdom of Saudi
Arabia in partnership with Humain. That's a story that we
broke actually back in July, so we've had a sense

(22:09):
that it was coming, and there's its confirmation.

Speaker 4 (22:11):
Confirmation, of course. All eyes on really the access that
Saudi Arabia has to the latest, greatest chips and what
they're able to continue to export out of the United
States to Saudi Arabia, to Humain, to be able to
use on the ground when it comes to Blackwell. And
of course we're going to hear so much with your
interview later today, Ed, on details of an AI bubble,
the vindication there that we're starting to hear from Jensen too.

Speaker 5 (22:32):
Already the CPU, the GPU.

Speaker 4 (22:34):
Necessities when it comes to just our social media desires,
let alone what's happening with agentic AI.

Speaker 3 (22:39):
Yeah, we should probably point out the obvious to our
audience that Nvidia reports earnings after the closing bell.
Jensen Huang is an experienced executive, to say the least,
and he probably thinks to himself, ah, what can I
or can I not say during the course of this conversation.
But a lot of his final answer there was repetition
of the idea that as we move from the balance

(23:02):
of workloads being inference, having previously been training, that is
his evidence base for this big build-out in AI infrastructure
to continue.

Speaker 4 (23:12):
And I think more broadly as well, they were going
into the realms of imagination, a realm of imagination that
I've heard time and time again. We've heard it from
Sundar Pichai, we've heard it from Jeff Bezos. Now
we're hearing it from Elon Musk, about the idea that
actually the energy limitations are far less in space.

Speaker 5 (23:27):
And this is why.

Speaker 4 (23:28):
Suddenly you're hearing a lot of these executives talking about
how we might be building data centers not on this Earth,
but outside of the world at the moment. This is
an interesting diversion perhaps, a way of talking about the cost
of energy that seems to be going up and to the
right here in the United States.

Speaker 3 (23:44):
Yeah, in space, you can put solar panels on satellites
and you can use radiation to do cooling. That seemed
to be the point that Jensen Huang and Elon Musk
were making. As if by magic, in the time that
we've been speaking, Bloomberg Senior Tech Executive Editor Tom Giles
has appeared on set. I mean, it's a big moment,
right, if you have the world's richest man sat alongside,

(24:04):
let's be honest, probably the most important person in global technology,
you're kind of bracing for news. The news that I
saw was confirmation of something we reported. That's xAI doing
something in Saudi Arabia.

Speaker 2 (24:15):
That's right.

Speaker 15 (24:16):
It's yet again evidence of this huge need for data
centers and computing capacity, which is what they were talking
about from beginning to end. And this is how xAI
is going.

Speaker 14 (24:28):
To take part in it.

Speaker 15 (24:29):
This is how Elon Musk and his empire are going
to take part in ensuring that we have the capacity
that we need to fuel all of these services, especially
the ones that xAI is providing with Grok.

Speaker 4 (24:43):
All of this, Tom, hinges on access to compute and
to GPUs. How are we unfolding that story as a newsroom
at the moment, of Humain's access to the latest Nvidia
Blackwell architecture, and more broadly, how we see the relationship,
the demand for data centers to be built out in
Saudi Arabia rather than here in the United States?

Speaker 15 (25:03):
Yeah, Caroline, I was just in Saudi a couple of
weeks ago, and one of the things
that I took away from that was this urgency around
sovereign AI, first of all, to ensure that each region
of the world.

Speaker 14 (25:19):
Has the computing that it needs.

Speaker 15 (25:21):
I also saw the urgency of these relationships between US
based companies and sovereign wealth funds in partnership with the
Saudi government and other governments throughout that region. And the
idea is that there's going to be a lot more partnerships.
You're going to see a lot more collaboration. Humain is
this company that just came out of nowhere a few

(25:43):
months ago and really does seem to be wanting to
play a big role in this data center build-out.

Speaker 3 (25:52):
Before we let you go, there is also this
issue of reciprocity. So for example, there's all these projects
announced in Saudi and other Gulf states, but the United
States government, I think, is very hopeful that those titans
of Middle East finance will also put the equivalent number
of dollars into the United States itself.

Speaker 2 (26:09):
That's right.

Speaker 15 (26:10):
They want to see they want to jointly invest in
the region. They want to see countries from around the world,
particularly this region, the oil rich region, also making investments
and showing that the US is a place to invest.

Speaker 14 (26:25):
You know, this.

Speaker 15 (26:26):
Administration wants to send the message that we're open for
business and that we're here to build jobs, and we're
here to bring some sort of manufacturing and technology dominance
back to the United States.

Speaker 3 (26:38):
Bloomberg Senior Executive Editor Tom Giles, thank you very much. Don't
forget to tune in. This afternoon, we have an exclusive
interview with Nvidia CEO Jensen Huang following the company's earnings
print. That conversation is around six thirty pm Eastern time.

Speaker 2 (26:51):
Okay, coming up.

Speaker 3 (26:52):
Pooja Goyal from Carlyle joins us to talk about AI
infrastructure spending from a very different side of the table,
very much looking forward to this one. It's halftime. We'll
be right back. This is Bloomberg Tech. Welcome back to

(27:13):
Bloomberg Tech. Nvidia is the Super Bowl moment today, earnings
after the bell, the stock up more than two percent
off its session high. It is a stock that's up
almost forty percent year to date, and there are very
high expectations, but there is also very high skepticism about
what is happening in this AI infrastructure build out. I
will continue to track it throughout the hour. It is

(27:33):
the big one. But one big big move to the
upside is Alphabet, parent company of Google. Shares trading at
a record high, on track for their biggest jump since
about mid September. Yesterday, Gemini three was released, an executive
saying that this is a big jump in the model's
abilities for reasoning and coding. This seems to have been

(27:56):
some kind of delayed response in the stock. You know,
you did see others like am Outman of Open Ai,
even Elon Musk on social media congratulate Google what they've
achieved with Gemini free.

Speaker 2 (28:06):
Now that's playing out in the shares, Caroline, it is.

Speaker 5 (28:09):
And well cool.

Speaker 4 (28:10):
One hundred forty billion being added in terms of market capitalization,
we're up at more than three and a half trillion
for Google. Now let's break down, though, what this so-called
Gemini three means. Mandeep Singh is with us, the Bloomberg
Intelligence senior tech analyst, joining us.

Speaker 5 (28:22):
Is it a big leap?

Speaker 14 (28:24):
It is.

Speaker 16 (28:24):
And when you look at some of the benchmarks they
showed in the paper around visual reasoning, I mean everyone
has been focused on multimodality. This was the true kind
of model where you could see multimodality in action in
terms of okay, we can do code, the model can
also do image generation and visual reasoning, which is what

(28:45):
you see in Waymo. I mean when I think about,
you know, why they're so successful with Waymo. Yes, they've
been doing it for the longest, but also some of
it is AI that's coming from their models, and I
think that was reflected in the paper. And look at
how far they've come in the past two years from
that botched Bard launch to now the Gemini three model being
a frontier model, so really well executed. And I think

(29:09):
it was all on TPUs. That's the other thing, right,
no Nvidia GPUs used for training, which everyone
still relies on Nvidia for.

Speaker 14 (29:18):
So that's a big leap.

Speaker 3 (29:19):
Mandeep, this is interesting. We were reading your research
this morning. I think we're going to bring it up
on the screen. So you're basically saying that if this
is evidence of the success of the TPU, Google's custom chip,
that might free up Google Cloud or GCP to take
their Nvidia allocation and then put it to work
for customers, which is a good thing for their cloud

(29:41):
business when it comes to external facing customers.

Speaker 10 (29:43):
That's right.

Speaker 16 (29:44):
And so look, Google is still buying Nvidia chips.
In fact, they are one of the top three customers
for Nvidia. And so when I look at you know
how everyone is using their Nvidia allocation, some of the
workloads are in fact for Meta everything is being consumed
inside you know Meta with the family of apps for

(30:06):
training and for inferencing and recommendation systems in the case
of alphabet, I mean, given everything internal is running on TPUs,
Google Cloud is where they deploy a lot of the
Nvidia allocation, whether it's the latest Black veil or the
prior versions. And that's where you can rent it. You
can generate revenues same way as new clouds are doing it.
And so from that perspective, I do think that cloud

(30:29):
revenue could get a lift just because they are more
availability of in video GPUs over there.

Speaker 3 (30:35):
I really recommend you go read that research if you're
a terminal client. If you're not, maybe I'll post it
on the social medias later. You just heard Mandeep
Singh of Bloomberg Intelligence break it down. Thank you very much.
AI infrastructure spending news keeps rolling in. Brookfield Asset Management
is teaming up with Nvidia but also Kuwait's wealth
fund, targeting ten billion dollars of commitments for a program
to build global AI infrastructure. The big plan is to

(30:57):
acquire up to one hundred billion dollars of data center,
energy and other assets. I want to discuss with Pooja Goyal,
she's the partner and chief investment officer for Carlyle's Infrastructure Group.
That's just a piece of news, but the structure of
it is a really interesting case study for what's happening
right now in AI infrastructure. A financing arm getting commitments

(31:19):
for a fund, partnering with some of the players, Nvidia
in the technology case, and then saying over the course
of time, we're going to go out and buy these assets.

Speaker 2 (31:29):
What do you make of that? How do you react to that?
you react to that.

Speaker 17 (31:33):
Well, first of all, Ed, thank you for having me
on your show. And look, from our perspective as infrastructure investors,
we have a thesis-driven approach to investing in infrastructure
assets and a longer term time horizon, and we do
believe that AI infrastructure is a significant investment opportunity for us.

(31:53):
Now, here at Carlyle, we're developing over twenty gigawatts of
data center capacity, primarily hyperscale data center capacity. But we
aren't just developing these assets in isolation. We have taken
a much more comprehensive view where we are looking across
the value chain for AI infrastructure and we are making

(32:14):
sure that we're developing these assets with that comprehensive lens.
So that means absolutely developing data centers, but also addressing
one of the most significant bottlenecks when it comes to
data center development, which is access to power. Look, you
had a little bit of a snapshot where you were
watching Elon talk about AI and he was talking about

(32:35):
access to energy being the one most significant bottleneck. The
way we are developing AI infrastructure is that we are
building these large scale energy campuses where we have power
generation capacity. That's co located next to this data center capacity.

Speaker 5 (32:54):
That's Copia Power, am I right? This is actually something
that you formed.

Speaker 4 (32:57):
I was reading about the release back in twenty twenty one,
a new portfolio company that's just building out, in terms
of a platform nature, these campuses.

Speaker 5 (33:06):
The scale is extraordinary.

Speaker 4 (33:08):
What was interesting was back in twenty twenty one, it
was all about sustainable infrastructure.

Speaker 5 (33:12):
It was all about renewable power sources.

Speaker 4 (33:14):
Is that realistic now when we think about the energy necessity?

Speaker 17 (33:19):
Yeah, I mean, Caroline, You're absolutely right. From an energy perspective,
you need to take an all-of-the-above approach.
So absolutely you need solar and storage, but gas is
a very important part of.

Speaker 5 (33:31):
The solution as well.

Speaker 2 (33:32):
Look at Copia for example.

Speaker 17 (33:34):
We think of data center development as building large campuses,
and a campus is basically like a mini city that
has multiple gigawatts of power generation capacity. This includes gas,
solar as well as storage. That capacity is connected onto
the grid, and located right next to that power generation capacity

(33:56):
are hyperscale data centers that are also connected to the grid.

Speaker 2 (34:00):
So I'm not talking about building islands.

Speaker 17 (34:02):
I'm talking about building a fully integrated city or a campus,
which is a better term for it, and by doing that,
you're making sure that data centers are getting access to
power in a more timely manner. Remember, timing is very
important here, and that power is actually cost effective. Cost
and economics matter a lot here, but more importantly, it's

(34:25):
also more reliable power. Everyone's talking about five nines reliability,
which is ninety nine point nine nine nine percent. In
order to do that and build long lived infrastructure assets,
that reliability is just a very important point. So at Copia,
for example, we have a site that we're building in Arizona.
It is about three times the size of Manhattan. Once

(34:46):
that site is fully built out, you're talking about thirty
billion dollars in capital investment between the power generation assets
as well as the data center assets. And then Copia
has another five campuses in the works behind that. So
you want to make sure these campuses are located close
to where there will be demand for compute power, but

(35:08):
you also want to make sure it's going to be
cost effective, delivered on time, and also reliable.

Speaker 4 (35:13):
I wish we had more time, absolutely fascinating, the size
and the scale. Pooja Goyal, come back soon. Tell us
how these campuses are evolving. Chief investment officer for Carlyle's
Infrastructure Group, we thank you. Let's just turn attention to
Nokia now, which is also streamlining its business to focus
on the networking infrastructure that can connect all of these
data centers.

Speaker 5 (35:31):
Nokia CEO Justin Hotard spoke with us earlier.

Speaker 18 (35:35):
With AI, the market is going to change dramatically. It's
already changing in the data center, which is a part
of our business. You know, we're building AI factories. Obviously
Jensen talks a lot about this, I know Nvidia
has earnings later today. But we're connecting data centers to
each other and building massive AI factories. That's using our
optical technology, it's using our IP routing technology. And where's

(35:55):
the future headed? The future's headed to physical AI, robotics,
autonomous vehicles, delivery drones, AR, VR, you know, glasses, more and more devices.
And fundamentally, that means our networks need to change to
handle that, and that's what we're planning for and anticipating
to seize that opportunity.

Speaker 16 (36:11):
Can you briefly describe the growth opportunity here and how
the profile, the growth profile of your company will change
as you make the shift.

Speaker 14 (36:18):
Yeah.

Speaker 18 (36:18):
Look, I mean today you see the pockets of growth
that we have in the fixed infrastructure, and as we
see this build for AI native networks going into mobile,
we're going to see tremendous growth there. It's just not
coming in the next few years, or we believe it
will come over the longer term as the market invests
and builds, but it's not going to be in the
next couple of years. So really think of our business
in a couple of ways, capturing growth and fixed infrastructure today,

(36:41):
and then in mobile infrastructure, positioning the business for technology
innovation and longer term growth as that market picks up.

Speaker 8 (36:47):
What industries do you expect to be most dominant in.

Speaker 2 (36:50):
Are there particular industries that you think will.

Speaker 8 (36:52):
Be fastest to adopt the AI connectivity that you're hoping
to provide.

Speaker 18 (36:57):
Yeah, I think there's a few things. First of all,
you know, there's no question that the core tech industry
that we're in today is the engine of growth, right
AI and cloud customers, hyperscalers, cloud providers, and that's of
course serving the technologies we have today, LLMs,
you know, AI agents. But when we look ahead, I
see it being in areas like transport and logistics and manufacturing,

(37:19):
physical AI also an area that we already serve that
we call mission critical enterprises. Think of public safety, uh,
you know, rail transport. These are these are places where
AI can add better reliability, better security, better outcomes for people.
If you think about public safety, for example, emergency services,

(37:39):
and those are areas where I think we'll see fast
you know, we'll see faster AI adoption. But I don't
think we can. You know, we can predict the future perfectly.
We need to just anticipate what are the use cases
and the needs. It may be that, you know, delivery drones,
other retail applications also accelerate and those are early users.
So for us, it's about building the plumbing and the
core capability.

Speaker 14 (37:58):
That we need.

Speaker 3 (38:00):
That was Nokia CEO Justin Hotard. Okay, coming up: in
earnings reports, investor presentations, and company memos, executives have
been touting efficiency gains from AI and pointing to the
tech for shrinking or flat workforces. We have more on
that next, Caro.

Speaker 4 (38:16):
Meanwhile, we're watching a deal that has sent Semrush
Holdings shares skyrocketing seventy four percent, Adobe agreeing to buy
the marketing platform, its first takeover announcement of course since
its failed acquisition of Figma. It's an all-cash deal, twelve
dollars per share.

Speaker 5 (38:31):
This is Bloomberg Tech.

Speaker 13 (38:46):
Looking ahead, however, layoffs and reductions in hiring plans due
to AI use are expected to increase.

Speaker 19 (38:55):
You see a significant number of companies either announcing that
they are not going to be doing much hiring
or actually doing layoffs, and much of the time
they were talking about AI and what it can.

Speaker 20 (39:08):
Do, the supporting cast of the soul crushing work is
now being done by agents. They work hard twenty four
x seven. You don't have to pay them, and they
don't need any lunch, and they don't have any healthcare benefits,
so they're very affordable and that really complements our workforce.

Speaker 4 (39:25):
Just a few of the voices we have been hearing about AI's impact
on the workforce. Increasingly, executives have been laying the blame
for job cuts at technology's feet. Bloomberg's editor for
AI, Seth Fiegerman, joins us now. And Seth, I'm hearing
the term AI washing. Is it actually that AI is
to blame for these job cuts, or is it a nice excuse,
but actually they overhired in COVID and this is a

(39:46):
good way of announcing job cuts.

Speaker 14 (39:48):
Yeah.

Speaker 21 (39:48):
I think it's a little bit of both. I mean,
first off, to step back, we're seeing a real shift in
how we talk about this. Even six months ago, a
year ago, companies were pretty sheepish about saying AI had
anything whatsoever to do with cost cutting, headcount reduction, because
I think no one wanted to be the poster child
for massive job displacement, unemployment. Nobody wanted that bad headline.
Something has shifted in the last six months, And I

(40:09):
think it comes down to one, we are in a
bit more of a difficult macroeconomic environment where companies one
have more reason to be cutting costs, and potentially AI
improving against that backdrop is going to be a perfect
storm for those cost cuts. But two, it's a little
bit easier probably for investors to say it's AI than
to say we over hired, we're a bloated workforce, we're

(40:29):
dealing with outdated technologies.

Speaker 14 (40:31):
Let's just say AI.

Speaker 3 (40:33):
So in terms of a post a child or maybe
a better phrase of case study, you have Amazon. Right,
So in June Andy Jesse signaled or said, you know
that this would happen long term because of AI, but
then more recently when Amazon actually did job cuts, that
was not the messaging that's right.

Speaker 21 (40:51):
That's right. I think he came out there and said, well, you know,
not yet you know. And I think that the Amazon
cuts maybe speak to a different phenomenon that's very TEXTPASI,
which is that we're seeing a lot of tech companies
do significant content. Some of that is because of overhiring
during the pandemic, to be sure, but also these same
tech companies are reallocating substantial resources to compete in the
larger AI race, and as a result of that, they're

(41:13):
trying to trim and be more efficient in other parts
of their businesses. So AI is a part of it,
but it may not just be because chatbots are taking
people's jobs.

Speaker 5 (41:21):
There is data we can reflect on.

Speaker 4 (41:23):
I think New York State is the first state
that said when you make big layoffs, you've got to
say whether it's AI or automation related. We're seeing in the
Challenger, Gray and Christmas numbers, thirty one thousand jobs
were sort of AI-sacrificed in just October alone.

Speaker 21 (41:38):
That's right, but again the Challenger stuff is also based
on how people, how companies, are representing that publicly. To
your point, though, I think other states are trying to
emulate the New York legislation. We would love nothing more
than greater transparency on this front, because I think there's
a lot of fear, there's a lot of misinformation, and
that would help us really separate fact from fiction.

Speaker 3 (41:56):
Bloomberg's Seth Fiegerman, thank you very much. Let's get to
another top story. Meta has secured a key legal victory
against the Federal Trade Commission. The FTC alleged the company's
purchases of Instagram and WhatsApp violated antitrust law.

Speaker 2 (42:11):
A judge didn't agree.

Speaker 3 (42:12):
Bloomberg's Riley Griffin joins us with the details. I think
let's start with the basic legal reasoning that the judge
gave. What was the decision based on, and what happens now?

Speaker 22 (42:21):
Well, and you have to remember that when the FTC
first launched this lawsuit, that was five years ago, that
was Trump one point zero. The social media landscape has
changed drastically since then, and that was really the reasoning
behind his ruling. The FTC had argued that MeWe
and Snap were its only competitors, and you and I
have spoken plenty of times about what TikTok is doing

(42:43):
to Meta's market share. That's what Judge Boasberg said as.

Speaker 4 (42:46):
Well, and then Meta gets up and says, see, we
have got so much competition, it's fierce out there, and
continues to tackle it. It's an interesting sort of argument
to have to make to your investor base and employees, that
you are under threat in some way.

Speaker 22 (43:03):
It's such an important point, Caroline, because really this win
is also a warning. Looking forward, Meta is going to
have to grapple with Judge Boasberg's comments, which are that
it is not differentiated from its competitors and that TikTok
is eroding its market share.

Speaker 2 (43:18):
This is probably the most difficult question.

Speaker 3 (43:20):
But what happens next? Is that just it now, it's
all done, or are there some changes that Meta has
to make, or will there be more legal challenges down
the road?

Speaker 22 (43:28):
So we've been speaking with analysts. Nobody expects the appeal
process to proceed, but we're going to wait and see.
Really this means that Meta doesn't have to spin off
Instagram or WhatsApp. That was the overhanging threat, but a
big win, one that was really priced in. Analysts had
expected Meta to take the W here.

Speaker 4 (43:47):
Bloomberg's Riley Griffin has been all across the story,
thank you so much. Meanwhile, coming up, we're
going to get back to the Nvidia earnings. Of course
we are. Wall Street is awaiting AI signals. We await
an exclusive conversation with Jensen Huang.

Speaker 5 (43:58):
Which Ed will conduct. This is Bloomberg Tech.

Speaker 3 (44:15):
Turning back to Nvidia, as investors await its results after the
closing bell. Kunjan Sobhani, Bloomberg Intelligence analyst, wrote at the
end of October that Nvidia's partnerships could broaden its
revenue, and, quote, with China constrained, expansion into quantum, robotics
and networking reinforces a long-term growth trajectory. Kunjan Sobhani joins

(44:35):
us now here in San Francisco. Whatever happens in the
quarter happens. It's all about this kind of long-term story.
You wrote that note after GTC in DC, and you
seem to have seen enough to think that long term
story is intact.

Speaker 12 (44:52):
Definitely.

Speaker 23 (44:52):
I mean, it's not going to be about the numbers,
as you said, this quarter, but I think given the
macro and sentiment backdrop, there have been rising concerns or
questions regarding, A, sustainability of these deals, B, whether customers are
double ordering or double securing supply, and C, finally,
can supply keep up even if the demand is true

(45:12):
and can sustain, whether it's from the chip supply side,
from TSMC and packaging, or whether it's supply from the end
data center build-out that the customers are trying
to build.

Speaker 2 (45:21):
Can they execute that fast?

Speaker 5 (45:23):
Kunjan, when we heard at GTC from Jensen.

Speaker 4 (45:28):
About the five hundred billion dollar half a trillion line
of sight, how real and tangible are those orders?

Speaker 23 (45:35):
I mean when you peel into the onion, the
timing of that was sort of over six quarters, and
we did some analysis, and that just suggests basically five
to ten percent above consensus. So a very, very achievable
and very executable target that he laid out.
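
(For readers who want the shape of that arithmetic, a minimal sketch follows. Only the five-hundred-billion-dollar total and the rough six-quarter window come from the conversation; the per-quarter consensus figures below are hypothetical placeholders, not Bloomberg Intelligence estimates.)

# Sketch of the "five to ten percent above consensus" framing. Only the total
# and the six-quarter window are taken from the discussion above; the consensus
# numbers are invented placeholders for illustration.
TOTAL_ORDERS_B = 500.0   # stated line of sight, in billions of dollars
QUARTERS = 6             # the rough window described

avg_per_quarter_b = TOTAL_ORDERS_B / QUARTERS            # roughly 83.3 billion per quarter

hypothetical_consensus_b = [60, 68, 75, 82, 88, 92]      # placeholder estimates, sum = 465
implied_uplift = TOTAL_ORDERS_B / sum(hypothetical_consensus_b) - 1

print(f"Average implied per quarter: ${avg_per_quarter_b:.1f}B")
print(f"Uplift versus the placeholder consensus: {implied_uplift:.1%}")   # about 7.5%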

Speaker 3 (45:49):
That's one reason why expectations are so high going in,
since that day, that single slide behind him on stage, five
hundred billion dollars. If there are some questions to pose
to Jensen Huang and Nvidia, not just today but on
an ongoing basis, what are they in your mind, Kunjan?

Speaker 23 (46:07):
I think we need more clarity around these deals. Right,
there's deals when it comes to sort of Nvidia securing
revenue by these investments and loans. There are questions around
the depreciation schedule of their GPUs. And again, finally, there seems
to be, or at least it implies, some sort of
scarcity, GPU securing from multiple different providers, whether it's

(46:28):
other merchant silicon providers or ASIC providers.

Speaker 5 (46:32):
Can I ask.

Speaker 4 (46:33):
About the depreciation of GPUs? Is Jensen the key person
to ask about this? He's obviously understanding exactly how they're
being used and how they're depreciating within those data centers.

Speaker 5 (46:43):
Can he give a signal as to whether.

Speaker 4 (46:44):
Companies, on their own forward-looking basis, are estimating right, whether
it's three years, four years, five years?

Speaker 23 (46:52):
Well, definitely from a technology perspective, he can definitely give
that answer. But there are two input factors here. One
is realistically, how long can you use these chips, which,
we believe, a three to five year period seems fine.
But there's also business decisions that the customers are making
when they're evaluating the lifetime of these chips: whether they will
upgrade to newer chips, will these chips still be valid

(47:13):
to use for their models, which are increasing at an unprecedented rate.

Speaker 3 (47:17):
There were people out there that say the kind of
Michael Burrys of this world are wrong because those older
generation chips are at one hundred percent utilization. Is there a
Bloomberg Intelligence house view on that?

Speaker 23 (47:28):
Well, again, from a technology perspective, we think that useful life
we're seeing in most of the cases is correct, and the
chips can be used that long. Whether from a business perspective,
whether from a GPU rental pricing perspective, that's valid or
not depends on the customer use cases.
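
(To make the stakes of that useful-life judgment concrete, a minimal straight-line sketch under hypothetical numbers; the fleet cost is invented and no salvage value is assumed, so this illustrates the sensitivity rather than any company's actual accounting.)

# Illustrative only: how the useful-life assumption changes the annual
# depreciation expense a GPU owner books. The fleet cost is hypothetical.
FLEET_COST_B = 40.0   # made-up GPU fleet cost, in billions of dollars

for useful_life_years in (3, 4, 5, 6):
    annual_expense_b = FLEET_COST_B / useful_life_years   # straight-line, zero salvage
    print(f"{useful_life_years}-year life -> ${annual_expense_b:.1f}B of depreciation per year")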

Speaker 5 (47:44):
Kunjan, you've got a busy day. We so appreciate
that you've come on to front-run.

Speaker 4 (47:48):
What is our Super Bowl. Kunjan Sobhani of Bloomberg Intelligence,
we thank you. Do not forget to tune in later.
You've got to tune in for an exclusive interview with
Nvidia CEO Jensen Huang following his earnings.

Speaker 5 (47:59):
Six thirty pm Eastern is going to be three thirty pm Pacific.
Who's doing it?

Speaker 3 (48:02):
You're doing it, Ed. Yeah, it's going to be an
interesting conversation. There are difficult questions for him, the depreciation factor,
circular financing, which they've already pushed back on before.

Speaker 2 (48:14):
But let's see what's in the print.

Speaker 5 (48:16):
Yeah, that does it for this edition of Bloomberg Tech.

Speaker 3 (48:19):
Yeah, don't forget to check out the podcast. A new
way to find it online on all the Bloomberg platforms.
This is Bloomberg Tech.