Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Bloomberg Audio Studios, podcasts, radio news from the heart of
where innovation, money and power collide in Silicon Valley and beyond.
This is Bloomberg Technology with Caroline Hyde and Ed Ludlow.
Speaker 2 (00:35):
Live from San Francisco. This is Bloomberg Technology coming up.
US President Trump accuses China of violating an agreement with
the US to ease tariffs, escalating tensions between the two countries.
Speaker 3 (00:47):
Plus sales of Apple's.
Speaker 2 (00:48):
iPhone are set to take a big hit from Trump's
new tariff policy. We get into the numbers, and we
speak with the CEO of AWS as the leading hyperscaler
runs faster on AI. What a week it has been
in the world of technology.
Speaker 3 (01:02):
I'm looking at the Nasdaq 100.
Speaker 2 (01:04):
It is a megacap and technology heavy index, and over
the course of the week we're up a percentage point.
We're actually lower in Friday session, but this is a
rebound from last week where earnings and that tariff story
have been the driver of the markets and the headlines.
There are two names in particular that we have been
so focused on. Those names are Nvidia, of course.
(01:24):
We had earnings twenty four hours ago, thirty eight hours ago,
forty eight hours ago, and then we spoke to Jensen Huang.
It's giving back some of the gains that it had seen,
but there is a big question still about China and
cross-border trade of the leading-edge technology. And then Apple, again
softer, four tenths of one percent. Just twenty
four hours ago, Nvidia was the world's most valuable company.
(01:47):
Apple had regained some status as a member of the three trillion dollar club.
Speaker 3 (01:50):
We're under pressure now. Here's the top story.
Speaker 2 (01:52):
China's planning to allocate seventy billion dollars of capital in
a new effort to fast-track new infrastructure projects and
shore up its own economy against ongoing US tariffs.
Speaker 3 (02:02):
That's according to sources.
Speaker 2 (02:03):
Meanwhile, President Trump reigniting those tariff war concerns, posting on
Truth Social that China has, quote, totally violated its agreement
with us, the US, offering no details of what agreements
China could have violated. Let's get more from Bloomberg's Mike Shepard.
Do we know what the president is talking about here?
Speaker 4 (02:24):
Well, we haven't gotten a clear signal from him directly
about what had set him off this morning, but we
have seen, Ed, over the past few weeks since that
sit-down in Geneva between Treasury Secretary Scott Bessent and
his Chinese counterparts, signs that the agreement, that détente
they had forged there in Switzerland, was starting
to come apart. The Chinese government had objected to some
(02:47):
export controls that the US has increasingly been applying to China,
keeping advanced technology from getting in there. And of course
this has been a sore spot for Beijing for years,
with limits on exports of semiconductors, but they have been escalating,
and in more recent weeks we even saw the US
go after makers of chip design software, telling them essentially,
(03:09):
you can't sell to China without violating our export restrictions.
So that is something that already Beijing was seeing as
a violation of the spirit of those conversations. We had
heard that a week ago, but now today we hear
from the top US trade negotiator Jamison Greer talking about
rare earths and critical minerals. And during that discussion in Geneva,
(03:31):
the two sides had agreed that Beijing might lift some
of the barriers it had placed, in retaliation for
US export curbs, on exports of critical minerals to the US.
Speaker 3 (03:41):
So there is a bit of.
Speaker 4 (03:42):
Tit for tat that we are seeing here, and it
may be bubbling over in the form of Donald Trump's
Truth Social feed here today.
Speaker 2 (03:49):
Yeah, I'm reading the news on the Bloomberg terminal,
and I go back to what Michael Kratsios, who runs
the White House Office of Science and Technology Policy, told us.
The policy from this administration is promote and protect. Protect
is the tariffs bit, right. Promote is onshoring and
stimulus for the tech industry. China is doing the same thing, right?
(04:09):
It's trying to drum up some money, seventy billion dollars
we're reporting, for its own industries in China.
Speaker 3 (04:15):
What do we know about that?
Speaker 4 (04:17):
Well, And that's a good question, and I'm glad you
brought that up, because it's unclear whether the administration here
is responding to that report of additional Chinese stimulus toward
artificial intelligence and other high tech projects that they have
planned as a way to spark their economy and make
sure that they have some domestic demand to satisfy the
(04:40):
industry that it is trying to cultivate. But nonetheless it would
not sit well with the US administration here because they
are trying to bring some of that manufacturing to American
shores to try to build up the American tech manufacturing base.
We've seen so much of it, especially in semiconductors over decades,
move to Asia, and Donald Trump has been applying pressure
(05:01):
to none other than Tim Cook, the Apple CEO, to
try to get the iPhone production moved here. Of course,
in our conversations with Mark Gurman and so many others,
we know that that is a bit of a pipe dream,
but nonetheless it is a priority for the administration, and
so you do see, as you said from your interview
with Michael Kratsios, this desire to promote American tech here
(05:22):
at home, but also protect it against the challenge it
sees coming.
Speaker 2 (05:26):
From China, Bluembos, Mike Shephard, do not go far, how
a sense, more news is coming on those topics. Meanwhile,
sales of Apple's iPhone and those of its rivals are
set to take a significant blow from President Trump's tariffs.
IDC research showing smartphone growth of just zero point six
percent post-tariffs. For more, Nabila Popal, IDC senior research director,
(05:50):
joins us. I always track your data, not just the
backward-looking data, but extrapolating out and looking forward. You've
kind of now got the post-tariff scenario. Just explain what
you published overnight and how this impacts, in particular, Apple.
Speaker 5 (06:07):
Hi, thank you, and nice to be here as always. Yeah,
so exactly. Whenever we publish our forecast, we always
have to show what was the scenario before, right? So
in our latest forecast, which we just published, we're showing
flat growth, and it's really important to see how
that's changed, right, because twenty twenty-four was
Speaker 3 (06:25):
You know, the year of we had like twenty three.
Speaker 5 (06:29):
We've basically been in a few years of decline.
And then twenty twenty-four, exactly, thank you for bringing that
up, was a big year of growth. And then twenty
twenty-five was supposed to continue this growth. We were
supposed to have about two percent, two point three percent,
you know, continued recovery. But since February, when our
last forecast was published, we've been going through such tumultuous times. Right,
(06:52):
there's been a whirlwind of uncertainty. You've seen that
word thrown around many times. We've had to pull down
the forecast because we see high uncertainty, so
much tariff and supply chain turmoil, and there's a lot of,
you know, just soft demand across the board in large
global markets.
Speaker 2 (07:13):
I'm trying to understand policy and how you factor policy
into that model. Right, So, the situation was that smartphones
were given a tariff exemption.
Speaker 3 (07:23):
That was mid April.
Speaker 2 (07:24):
Then President Donald Trump just this month, May, said that
for smartphones made outside of America, well,
let's be honest, not just smartphones, the iPhone made outside
of America, there should be a levy of twenty five percent.
Speaker 3 (07:41):
How have you factored that in?
Speaker 5 (07:44):
So, you know, that came out, actually, right after we
had finalized our forecast. So what we
had taken into consideration, right, because we knew that even
when the exemptions were announced, or wherever the tariffs stood,
that it was all temporary, that things could always change.
So we had to draw a line in the sand at
some point.
Speaker 6 (08:03):
But there was you know.
Speaker 5 (08:05):
We did bake in that, I think,
things could always change. So what we're, you know,
saying to our clients, or anyone that, you know, essentially
asks us this question, is that there is a
huge downside risk to the US forecast, and the tariffs,
that twenty five percent essentially, you know, would impose
(08:26):
on the US market, right, increased prices. It
would change the demand, right, to
Speaker 3 (08:33):
The US market.
Speaker 5 (08:34):
Right now, we're expecting the US market to grow about
one point nine percent, and the US and China are actually
what are holding up even that flat zero point six
percent growth that we're seeing globally, but it was pulled down.
Both those large markets were previously expected to grow higher.
And if this twenty five percent were to go into
effect, you know, the US
(08:57):
market would potentially even go into a decline.
Speaker 2 (09:02):
We should say that this data from you, and
the news flow, doesn't just apply to the iPhone, right?
It also applies to other smartphone makers on Android
OS as well. But this has been our focus, which
is Apple and the pressure that Tim Cook has been
under from this administration. The supply chain is a question mark.
(09:24):
But right now, do you envisage any impact to pricing
of the iPhone, or any giant shift in where that company
manufactures that handset?
Speaker 5 (09:35):
So I'll address the second part first. Even, you know,
when President Trump announced this potential twenty
five percent, he even said that it's not
just Apple; it would be any brand. As
long as phones coming into the US were not made
in the US, they would face potentially twenty
five percent, or wherever it lands, right. We do
(09:57):
think that wherever the tariffs
happen to land eventually, if they do, regardless of that,
we think smartphone OEMs will continue to diversify outside of
China, and Vietnam and India, we said this in our
press release, will continue to be the two primary, you know,
(10:17):
global hubs for smartphone manufacturing, for reasons that everyone
who has come on your show has said, right,
because of the supply chain ecosystem that has already been
established there prior to all this tariff madness that began
earlier this year. Right, it's really hard. I mean,
you understand that, I
(10:40):
understand the ideology behind bringing manufacturing to the US,
and the question has also been asked. It's a dream,
but, you know, I don't see any reality where
that is possible. So that's the
reason why we do feel that those two
regions are going to continue to be the focus of
OEMs' diversification plans.
Speaker 2 (11:03):
IDC Senior Research Director Nabila Popal, great to have you
on the show.
Speaker 3 (11:06):
Thank you so much.
Speaker 2 (11:07):
Now coming up, UiPath raised its outlook, but competition in
agentic workflow solutions is intensifying. We'll talk to CEO Daniel
Dines about what that means for his company.
Speaker 3 (11:17):
That's next. This is Bloomberg Technology.
Speaker 2 (11:31):
Automation software company UiPath raised its full-year sales outlook
yesterday when it released its earnings. The company also stressed
that its public sector clients renewed contracts. Let's get more
on this with UiPath CEO Daniel Dines. Daniel, it's great
to have you back here on Bloomberg Technology. You know,
a point of discussion for most of the year so
(11:52):
far has been the impact of DOGE on companies
just like yours that have government, federal-level contracts. I'm
trying to read between the lines and the transcript of
the earnings call. Was there any impact to you, positive
or negative, from that DOGE activity?
Speaker 7 (12:11):
Thank you for having me. Look, we had a great
event in DC a month ago, and we talked to
a lot of agencies. There is renewed interest
in agentic automation as a space, and we basically renewed
(12:32):
our contracts. We have a great deal with the Air Force.
They are coming with a great initiative called Airmen, which
is aiming to be very transformative of their business processes.
Speaker 6 (12:45):
So, all
Speaker 7 (12:46):
in all, I think it's a more positive environment in DC,
but still there are a lot of moving pieces in
transition there. From our perspective and our guidance, it's not
a big difference.
Speaker 2 (13:06):
Many point to new ARR additions and them
being low.
Speaker 3 (13:12):
What then accounts for that.
Speaker 7 (13:17):
Well, we continue to execute well. We just finished a
big transition. We had an entire year of changes in
the company. We accommodated everything to go really big in
agentic automation.
Speaker 3 (13:33):
Basically, what is the growth picture?
Speaker 2 (13:38):
Our analysts write that growth indicators are mixed for UiPath.
Would you agree with that analysis, of it being mixed?
And if you do see growth, where's it coming from?
Speaker 7 (13:49):
Well, I think that agentic automation is a tremendous opportunity,
and I think everybody agrees that this is gonna
be very transformative on the business landscape.
Speaker 3 (14:02):
We believe this is.
Speaker 7 (14:03):
Going to be a bigger opportunity than RPA, and we
are really well positioned to capture a significant share of it.
So all in all, I think this is really our
biggest opportunity for growth.
Speaker 2 (14:18):
I'm looking at the cost of inference right now, you know,
trickling down, particularly in the context of companies just like yours.
A lot of companies are talking about the momentum that
agentic AI is giving them. What's your point of difference?
What is it that you're able to offer that others
are not?
Speaker 7 (14:39):
I think a lot of the agents that we are seeing
today are conversational agents, but customers would like agents that
are deployed in the context of their existing business processes.
And how can you trust the agents? That's the main
point of the conversations. And we offer a technology to
(14:59):
orchestrate between RPA bots and agents and humans that
really enhances the reliability of agents and the trust of customers.
Speaker 2 (15:12):
Right now, what is the biggest cost consideration for you?
I talked about inference costs coming down. Even so, investors
in particular love to see a commitment to access to
compute, and also just some long-term commitment that
the investment's worthwhile.
Speaker 7 (15:32):
Yeah, I would not say that in our business the
inference cost is significant. The real cost comes with the
implementation of the agents. Still, the capabilities required to
create really good agents that mimic the capabilities of humans
(15:55):
are kind of rare, and we are working a lot
to build technology that helps to democratize the ability to
build agents. To me, this is gonna be the real
cost of deploying agentic automation.
Speaker 8 (16:10):
That's key.
Speaker 2 (16:12):
Daniel Dines, UiPath CEO, thank you so much for joining
us here on Bloomberg Technology. Now coming up, Microsoft's push
to get corporate customers on board with Copilot seems
to be taking off. We've got the details on its
AI sales coming up next.
Speaker 3 (16:25):
This is Bloomberg Technology.
Speaker 2 (16:42):
A growing push for sovereign AI has led to a
wave of data center announcements, including one this week from
Bell Canada. It plans to invest hundreds of millions of
dollars to build AI data centers across the country, and
it has named chip startup Groq as the project's exclusive
inference partner. Groq CEO Jonathan Ross spoke with Caroline Hyde
about sovereign AI, but also US chip curbs on China.
Speaker 9 (17:05):
From Groq's beginning, we have done no business in China,
and this is not a geopolitical thing. This is just
a shrewd business thing. We saw all of these very
large tech companies go into China and lose over and
over again, and there's just a thumb on the
scale for any Western company. I hope that someday China
allows real competition because real competition makes people stronger, and
(17:27):
I want to compete against those Chinese companies. But until
they allow real competition, we can't do that. So we've
blocked all Chinese companies from having access to our API.
We will not sell chips in China because they're just
going to reverse engineer them and try and build them themselves,
and they're not going to compete fairly. As a result,
when we do these deals, we're not allowing Chinese companies
to run on them, and we do build-operate-transfer.
(17:49):
In these sovereign cases, we're doing build-operate, so we're
the ones operating that, and that actually gives Commerce and
others a lot of comfort that Chinese companies aren't going
to be getting
Speaker 10 (17:58):
Access. But do the restrictions on, say, Nvidia selling
H20s, as Jensen would say, mean that China just gets better at
gets better at.
Speaker 3 (18:04):
Doing it themselves.
Speaker 10 (18:05):
And a Huawei will be able to compete with Groq in
Speaker 9 (18:07):
the future? Potentially. However, the reality is if you enable
people now, then they're going to be able to use
that advantage to keep growing quicker. We're going to be
using AI to design our chips, We're going to be
using AI to design our software, So why would you
give that advantage? I think it's different. Going back to
information-age technology, it's about replicating and distributing. Compute is
(18:29):
about doing something specific. If you don't have the compute,
you can't do it, whereas information knowledge sort of diffuses
and you'll catch up. With generative age technology you can
pull ahead.
Speaker 10 (18:40):
You talk about wanting competition and to be competitive
in China, for example. What about the competition with Nvidia?
How do you stand compared to AMD and the other
offerings here in the United States?
Speaker 9 (18:51):
We actually think we're one of the best things that's
ever happened to Nvidia shareholders. So Nvidia sells
a product that is a premium product. It's a luxury good;
it's for training the models. When you train a model,
you're willing to spend more money because you get to
amortize that across an enormous number of users. If
Nvidia continues focusing on inference, which they have to do
(19:11):
right now because there isn't enough inference capacity, they're going
to have to bring their margins down. They're going to
have to sell their chips for a lot less. Remember,
our cost is much less than Nvidia's cost. On
top of that, higher volume, lower margin. And so the
more inference compute we deploy, the more demand there is
for training compute. The more demand for training compute, the
(19:31):
more GPUs Nvidia sells. And the one thing that's
really important here: Nvidia wants to do between three
and five million GPUs for AI this year. But that's
based on what's called HBM, high-bandwidth memory. It's this
very scarce component. They can make as many GPU chips
as they want; they can't make as much of the
memory as they want. Those chips are going to be built
and sold, and the only question is what are they
going to be sold to do. When we come in
(19:53):
and we sell inference chips, then what we're doing is
we're allowing those GPUs to be sold into training, which
is a higher margin business. Nvidia is going to
make more profit thanks to us.
Speaker 2 (20:05):
All right, we have some breaking news crossing the Bloomberg terminal.
Eutelsat, the satellite-based internet provider, is in talks to raise
one point five billion euros. This is interesting because, in Europe,
the continent's hoping that Eutelsat will be able
to displace and compete against the likes of Elon Musk's
(20:25):
Starlink in Europe.
Speaker 3 (20:27):
That's the red headline.
Speaker 2 (20:28):
Eutelsat in talks to raise one point five billion
euros, with a doubling of the stake from French entities.
Speaker 3 (20:35):
We'll get more on that story as we learn more
about it.
Speaker 2 (20:38):
All right, let's get back to what's happening in the
world of AI. Microsoft is focused on driving adoption
of its AI tool Copilot, and at a town
hall this week, it touted its progress.
Speaker 3 (20:49):
Who broke the story, Bloomberg's Brody Ford.
Speaker 2 (20:51):
I think this is really interesting, right, because it's quick
on the heels of this deep dive we did in
Businessweek about Copilot: how it happened, how it launched,
what Microsoft's hopes are for it. What you've reported are
names across various industries, big corporate names that are doing deals.
Speaker 5 (21:10):
Right.
Speaker 11 (21:11):
Yeah. The background here is that Microsoft, Salesforce, companies like
them, they have these AI tools, right. I mean, in
the consumer space, ChatGPT and tools like it have
ramped incredibly. In the enterprise, it hasn't been quite as
easy of an adoption path, and so with tools like Copilot,
Microsoft's AI assistant, we don't have great intel on how
(21:32):
many folks are really using it, how many folks are
really paying for it. What we learned from the meeting yesterday,
which sources were telling us about, is that, you know,
there are dozens of customers who have over one
hundred thousand paid users. That's a pretty significant amount. I mean,
of course, you know, there's some discounting in that, but
this is a sign that adoption is ramping, maybe more
(21:56):
than investors had been appreciating.
Speaker 3 (22:00):
In your reporting,
Speaker 2 (22:00):
we note that Microsoft CEO Satya Nadella is very closely
tracking all kinds of data, including who is using Copilot
and how they're using it.
Speaker 3 (22:11):
Why is that important, right?
Speaker 11 (22:12):
I mean, what's really important is that you don't have shelfware,
that you don't buy one hundred thousand licenses but, you know,
only fifty people in finance use it, right. I mean, this
is the classic problem with software, where maybe you can
sell a lot of things, but you have to really
make sure they're using it.
Speaker 2 (22:28):
Bloomberg's Brody Ford, another excellent piece of reporting.
Speaker 3 (22:31):
Thank you very much.
Speaker 2 (22:32):
Okay, coming up, Patrick McGoldrick from JP Morgan Private Capital
joins us to talk about its approach to AI investment opportunities.
That conversation's next. This is Bloomberg Technology. Welcome back to
(22:55):
Bloomberg Technology. I'm Ed Ludlow in San Francisco. It's kind of
like the last day of technology earnings, and we're seeing
movers in the market on some numbers that got posted
last night, those of Dell and Marvell. In Dell's case,
it was all about the profit outlook; it topped estimates.
There is clearly a backlog for AI server demand, but
shares are a little bit softer. Marvell, a slightly different story.
(23:17):
They kind of have, like, strong sales targets, they've won
some business, and they were pretty clear with the market,
but investors are still tepid on that chip name, stock
down five percent. That's having an impact at the index
level as well. Let's go from public markets now to
private markets. JP Morgan Private Capital's Growth Equity Partners is
(23:37):
a one billion dollar fund focusing on Series B right
through to pre-IPO startups. Managing Partner Patrick McGoldrick joins
us; the focus right now, the AI opportunity. I find
this so interesting, and it's part of today's VC Spotlight.
In my world, when I break a story about an
AI startup or a new round, it's so interesting to
(23:58):
look at the cap table: traditional venture capital, but
also people like you coming in more in the private
growth equity space. Would you just talk a bit about
that business model and what you try to do?
Speaker 12 (24:10):
Sure, and thanks for having me on today. I appreciate
the opportunity. So the way we position our strategy is
to operate like a traditional venture and growth equity investor.
So we start investing at the earlier stages of a
company's formation, Series B through pre-IPO, on the technology
and consumer side, all the way through that last inflection
Speaker 8 (24:29):
Point before going public.
Speaker 12 (24:31):
I think, for us, we seek to leverage the whole
firm to deliver value to companies, and in a world
where capital is just that, but value-add, changing the
inflection curve of a company's growth, de-risking it, we
think JP Morgan is well suited to be that partner
of choice.
Speaker 2 (24:49):
So interesting you're talking about leveraging the full scale of
JP Morgan.
Speaker 3 (24:54):
Right, as your pitch.
Speaker 2 (24:55):
The traditional VC would say, don't just take our check,
take our operators. Many of us are entrepreneurs or startup founders.
Your pitch is, we're JP Morgan.
Speaker 12 (25:05):
Yes, I think, in the simplest essence, of course,
I think we're privileged to be part of an organization
that spends eighteen billion dollars a year in technology spend.
Speaker 8 (25:13):
We have sixty three thousand technologists.
Speaker 12 (25:16):
And I think what is really important to recognize is
we will work with companies across our firm as early
as the seed stage, so we're on the hunt for
exceptional technologies. Now as investors, of course, first and foremost
we assess product market fit, the strength of the management team.
But then, where can we pair in those relationships, covering
ninety percent of the Fortune five hundred or forty five
(25:37):
thousand private companies? So we have the privilege of those
operators that other VC firms have, but I think also the
global footprint of a firm like JP
Speaker 3 (25:45):
Morgan. What are you seeing in private markets right now?
Speaker 12 (25:49):
Yeah, it's a complicated question. It depends on the topic.
But let's start with where we see the investment opportunity set.
I think for us, we're spending a lot of time
in areas like artificial intelligence and cybersecurity, both incredibly resilient.
Speaker 8 (26:02):
From a tech spend perspective.
Speaker 12 (26:04):
There is a recent review of chief information officers and
chief technology officers budgeting priorities, and I think for different reasons,
you're seeing increased commitments to those areas. In AI, I
think it's a world of now having gone from experimentation
to actual deployment. Over sixty five percent of companies are
now deploying AI in.
Speaker 8 (26:24):
Their full production suite.
Speaker 12 (26:26):
Those companies are now automating tasks, they're creating efficiency and
making a more delightful consumer and enterprise experience. In cyber, unfortunately,
you could have a version of me that's virtually produced,
my cadence, inflection, a little bit of my positioning, and
you'd have to be able to protect against that. So
that defensive posture is critical. It's why even in tougher
(26:47):
macro environments, the defensiveness of the spend there is very
clear and so you'll see that reflected in some of
our portfolio companies.
Speaker 2 (26:55):
Some of your portfolio companies, we just showed AlphaSense.
Of course, the maker of Cursor, such a hot property
right now; tell me what's going on with that. But
also Databricks. Databricks recently has been relying
on tenders, like employee liquidity, rather than a primary raise.
How have you navigated those two mechanisms in these two
(27:17):
hot names in the world of AI?
Speaker 12 (27:19):
Well, let's start with AlphaSense, because I think that
company, think of it really as a market research and
intelligence platform that in two thousand and eight had the
idea of bringing in publicly available data in the world of markets,
also accessing the research reports that are put out by
Wall Street firms, digesting that and giving research analysts, corporate
(27:40):
development arms, as well as portfolio managers, the right solutions
at their fingertips to make informed decisions.
Speaker 8 (27:46):
With the advent of AI.
Speaker 12 (27:47):
That's gotten even more acute and clear in how they
deliver value. And they recently acquired a company called Tegus,
which provides private market perspective, so customer interviews that get
transcribed; that marriage of data is critical to making better
decisions. On Databricks,
Speaker 8 (28:03):
You bring up an interesting point.
Speaker 12 (28:04):
I think the evolution of the private markets are such
that companies are staying private longer in order to provide liquidity.
They look to access this tender solutions and letting employees.
Speaker 8 (28:15):
Seek some liquidity.
Speaker 12 (28:16):
For us, we want to be active in names like
Databricks, which is critical in the world of infrastructure
for AI, growing well in excess of the public markets
with an exceptional management team, and so we're thrilled to
be involved in both of those and think they're critical
in this new advent of AI.
Speaker 2 (28:32):
Patrick, we just have thirty seconds. But do you see some
kind of exit coming in either of those names?
Speaker 12 (28:38):
Well, I wouldn't want to speculate on that too much.
I think the management teams are better suited to do that.
What I would say about the exit environment more broadly
is we've gone from a faucet that's completely shut off
to a trickle, where you have companies like Hinge Health,
ServiceTitan, of course CoreWeave, that have used the
public markets, which is important.
Speaker 8 (28:56):
Hopefully the back half of the year produces.
Speaker 12 (28:58):
More of that, but they are companies that could effectively
go public by virtue of their performance.
Speaker 8 (29:03):
Again, the management team and the market.
Speaker 2 (29:05):
Size. Patrick, open up that tap next time. Come back
on and tell us when you have an exit. Patrick McGoldrick,
Managing Partner, JP Morgan Private Capital, thank you very much.
Speaker 3 (29:15):
Coming up, we.
Speaker 2 (29:17):
Speak with AWS CEO Matt Garman in an exclusive conversation.
One year in the role, so much has happened, from
the sort of hyperscaler perspective right through to the models
that AWS is hosting, largely through Bedrock.
Speaker 3 (29:33):
We also got some breaking news. Let's take a look
at DJT hitting session highs, erasing its decline.
Speaker 2 (29:39):
Just confirmation the company's closed about two point four billion
dollars in a bitcoin treasury deal, something.
Speaker 3 (29:46):
We brought you a little bit of earlier this week.
We'll be right back. Don't go far. This is Bloomberg Technology.
Speaker 2 (30:05):
Welcome to our Bloomberg radio and television audiences worldwide. We
go right now to a conversation with Matt Garman, AWS CEO.
Speaker 3 (30:14):
Matt, it's good to catch up.
Speaker 2 (30:16):
It has been basically one year that you've been in
the role as AWS CEO.
Speaker 3 (30:21):
As a place to
Speaker 2 (30:22):
start, what has been the biggest achievement in that time
for AWS?
Speaker 13 (30:28):
Yeah, thanks for having me on. It's nice to be
here again. Yeah, It's been a fantastic year of innovation.
It's really been incredible, and as I look out there,
one of the things that I've been most excited about
is how fast our customers are innovating and adopting
many of the new technologies that we have. And as
you think about customers that are on this cloud migration journey,
(30:50):
many of them have been doing that for over the
last several years, but this year in particular, that we've
really seen an explosion of AI technologies, of agentic technologies,
and increasingly we're seeing more and more customers move their
entire estates into the cloud and AWS.
Speaker 6 (31:06):
So it's been really fun to see.
Speaker 13 (31:08):
It's been an incredible pace of technology and it's been
a really fun first year.
Speaker 2 (31:13):
The moment that investors kind of sat up and paid
attention was when Amazon said that its AI business was
at a multi billion dollar run rate in terms of sales.
What we don't understand as well is what proportion of
that is AWS infrastructure.
Speaker 13 (31:30):
Yeah, that is AWS, right, and so the key is
that's a mix of customers running their own models. Some
of that is on Amazon Bedrock, which is our own
hosted models where we have first party models like Amazon Nova,
as well as many of the third party models like
Anthropic's models, and some of those are applications, things like
Amazon Q, which helps people do automated software development, as
(31:54):
well as a host of other capabilities, and so there's
a mix of that. And I think part of the
most interesting thing about being at a multi billion dollar
run rate is we're at the very earliest stages of
how AI is going to completely transform every single customer
out there. And we talk to customers and we look
at where the technology landscape is, and we firmly believe
that every single business, every single industry, and really every
(32:16):
single job is going to be fundamentally transformed by AI.
And I think we're starting to see the early start
the stages of that. But again, we're just at the
very earliest stages that I think what's going to be possible,
and so that multi billion dollar business that we have
today is really just the start.
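[Editor's note: for readers curious what "hosted models on Bedrock" looks like in practice, here is a minimal sketch using the AWS SDK for Python (boto3) and the Bedrock Converse API. The model ID, region, and prompt are illustrative assumptions, not details from the interview; use whatever model is actually enabled in your own account and region.]

```python
# Minimal sketch: calling a model hosted on Amazon Bedrock via the Converse API.
# Assumes boto3 is installed, AWS credentials are configured, and the chosen
# model has been enabled for this account/region. The model ID below is an
# illustrative assumption, not a recommendation from the interview.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="amazon.nova-lite-v1:0",  # hypothetical choice; any enabled Bedrock model ID works
    messages=[
        {
            "role": "user",
            "content": [{"text": "Summarize this week's cloud earnings in two sentences."}],
        }
    ],
    inferenceConfig={"maxTokens": 200, "temperature": 0.2},
)

# The Converse API returns the assistant's reply under output -> message -> content.
print(response["output"]["message"]["content"][0]["text"])
```

The same call shape works whether the model behind the ID is a first-party model or a third-party one hosted on Bedrock, which is the point Garman is making about Bedrock being one building block among several.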
Speaker 3 (32:31):
Can you give me a generative AI revenue number.
Speaker 6 (32:36):
For the world or for AWS?
Speaker 2 (32:38):
For you guys, for AWS. Maybe Amazon as a whole.
Speaker 13 (32:41):
Yeah, Like I said, we are in multiple billions of dollars,
and that's for customers using AWS. We also use lots
of generative AI inside of Amazon for a wide range
of things. We use it to optimize our fulfillment centers.
We use it when you go to the retail site
to summarize reviews, or to help customers find products in
a faster and more interesting way. We use AI in
(33:05):
Alexa in our new Alexa Plus offering, where we conversationally
talk to customers through the Alexa interface and help them
accomplish things through voice that they were never able to
do before. So every single aspect of what Amazon does
leverages AI, and our customers are exactly the same. Customers
are looking to AWS to completely change, whether it's their
(33:28):
contact centers, through something like Amazon Connect, which has
AI capabilities so that you don't have to go program
it, all the way down to our custom chips or
Nvidia processors, or anywhere customers at the metal are
building their own models. We have the whole range of
people that are building AI on top of AWS, as
well as Amazon themselves.
Speaker 2 (33:48):
We always credit AWS as being the number one hyperscaler. But
just on what you said there about what the clients are using,
at the silicon level through to the capacity, it would really
help if you could proportionately tell me what percentage of
workloads are being run for training and what proportion of
(34:09):
workloads are being run for inference.
Speaker 6 (34:11):
Sure, yeah, and that changes over time. I think.
Speaker 13 (34:14):
Look, as we progress over time, more and more of
the AI workloads are inference. I'd say in the
early stages of AI and generative AI, a lot
of that usage was dominated by training, as people were
building these very large models with small amounts of usage.
Speaker 6 (34:29):
Now the models are.
Speaker 13 (34:30):
Getting bigger and bigger, but the usage is exploding at
a rapid rate, and so I expect that over the
fullness of time, eighty percent, ninety percent, the vast majority
of usage is going to be in inference out there,
and really, and just for all those out there, inference
it really is how AI is embedded in the applications
that everybody uses. And so as we think about our
(34:51):
customers building, you know, there's a small number of people
who are going to be building these models, but everyone
out there is going to use inference as a core
building block in every they do, and every application is
going to have inference, and already is starting to see
inference built in to every application, and we think about
it as just the new building block. It's just like compute,
it's just like storage, it's just like a database. Inference
(35:13):
is a core building block, and so as you talk
to people who are building new applications, they don't think
about it as AI is over here and my application
is over here. They really think about AI is embedded
in the experience. And so it's increasingly I think it's
going to be difficult for people to say what part
of your revenue is going to be driven by AI.
It's just part of the application that you're building, and
it's going to be a core part of that experience,
(35:35):
and it's going to deliver lots of benefits from efficiency,
from capabilities, and from user experience for all sorts of
applications and industries.
Speaker 2 (35:43):
At the present day, it's fair to say the majority is still training?
Speaker 13 (35:47):
No, I think that at this point there's definitely more
usage as inference than training.
Speaker 2 (35:52):
We want to welcome our radio and television audiences around
the world. We're speaking to AWS CEO Matt Garman, who officially
next week celebrates one year in that role leading AWS.
Speaker 3 (36:04):
A new metric that.
Speaker 2 (36:06):
Has been discussed particularly this earnings season, and we discussed it
with Nvidia CEO Jensen Huang this week, is token growth
and tokenization. Has AWS got a metric to share on
that front?
that front.
Speaker 13 (36:18):
I don't have any metrics to share on that front,
but I think it's one of the measures that we
can look at as the numbers of tokens that are
being served out there, but it's not the only one,
and I increasingly think that people are going to be
thinking about these things differently. Tokens are a particularly interesting
thing to look at when you're thinking about text generation,
but not all things are created equal. I think, particularly
(36:40):
as you think about AI reasoning models, the input and
output tokens don't necessarily talk about the work that's being done.
And increasingly you're seeing models that can do work for
a really long period of time before they output tokens.
And so you're having these models that can sometimes think
for hours at a time. Right, you might ask
these things to go and actually do research on your behalf.
(37:02):
They can go out to the internet, they can pull
information back, they can synthesize, they can redo things. If
you think about coding and Q Developer, we're seeing lots
of coding where it goes and actually reasons and does
iterations and iterations and improves on itself. Looks at what
it's done and then eventually outputs the end result, and
so at some point kind of the final output token
(37:24):
is not really the best measure of how much work
is being done. If you think about images, if you
think about videos, there's a lot of content that's being
created and a lot of thought that's being done, and
so tokens are one aspect of it, and it's an
interesting measure, but I don't think it's the only measure
to look at. Although they are rapidly increasing.
Speaker 2 (37:43):
Project Rainier, massive custom server design project. What is the
operational status, and the latest on Project Rainier?
Speaker 13 (37:51):
Yeah, so we're incredibly excited about it. So Project Rainier
is a collaboration that we have with our partners at
Anthropic to build the largest compute cluster that they'll use
to train the next generation of their Claude models. And
Anthropic has the very best models out there today. Claude
four just launched, I think it was last week, and
(38:13):
it's been getting incredible adoption out there from our customer base.
Anthropic is going to be training the next version of
their model on top of Trainium two, which is Amazon's
custom-built accelerator processor, purpose-built for AI workloads, and
we're building one of the largest clusters ever released. It's
(38:33):
an enormous cluster, more than five times the size of
the last one that they trained on,
which again is the world's leading model.
Speaker 6 (38:42):
So we're super excited about that.
Speaker 13 (38:44):
We're landing Trainium two servers now and they're already in operation,
and Anthropic is already using parts of that cluster,
and so we're super excited about that. And the performance that
we're seeing out of Trainium two continues to be very
impressive and really pushes the envelope I think on what's
possible both from an absolute performance basis as well as
a cost, performance and scale basis.
Speaker 6 (39:04):
I think some of those.
Speaker 13 (39:05):
Are equally going to be really important as we move
forward in this world, because today much of the feedback
you get is that AI is still too expensive. But
costs are coming down pretty aggressively, and it's still too expensive,
and so we think there's a number of things that
need to happen there. Innovation on the silicon level is
one of those things that needs to help bring the
cost down, as well as innovation on the software side
(39:26):
and algorithmic side, so that you have to use less
compute per unit of inference or training. So all of
those are important to bring that cost down, to make
it more and more possible for AI to be used
in all of the places that we think that it
will be over time.
Speaker 2 (39:41):
Matt, on Wednesday, Nvidia CEO Jensen Huang summarized inference demand
for me. I just wanted to play you that soundbite.
Speaker 4 (39:47):
Sure, well, we got a whole bunch of engines firing
right now. The biggest one, of course, is the reasoning
AI inference.
Speaker 6 (39:56):
The demand is just off the charts.
Speaker 12 (40:00):
You see the popularity of all these AI services.
Speaker 2 (40:03):
Now, your pitch for Trainium two, and as you know,
I've kind of taken apart the server design
and looked at it, is the efficiency and cost efficiency
relative to Nvidia tech. Are you seeing that same demand
Jensen outlined for Trainium two outside of the relationship with Anthropic?
Speaker 13 (40:22):
Yeah, Look, we're seeing it across a number of different places,
but it's not really Trainium two versus Nvidia, and
I think that's not really the right way to think
about it. I think there's plenty of room. The opportunity
in this space is massive. It's not one versus the other.
We think that there's plenty of room for both these
and Jensen and I speak about this all the time
that Nvidia is an incredibly fantastic platform. They've built
(40:43):
a really strong platform that's useful and is the leading
platform for many many applications out there, and so we
are incredible design partners with them. We make sure that
we have the latest Nvidia technology for everyone, and
we continue to push the envelope on what's possible with
all of the latest Nvidia capabilities. And we think
there's room for Trainium and other technologies as well, and
(41:04):
we're really excited about that. And so many
of the leading AI labs are incredibly excited about using
Trainium two and really leaning into the benefits that you
get there. But for a long time,
these things are going to be living in concert together,
and I think there's plenty of room. And customers want
choice at the end of the day. Customers don't want
to be forced into using.
Speaker 6 (41:25):
One platform or the other.
Speaker 13 (41:26):
They'd love to have choice, and our job at AWS
is to give customers as much choice as possible.
Speaker 2 (41:31):
What is general availability of Nvidia GB two hundred for AWS?
And have you, I guess, launched Grace Blackwell-backed instances yet?
Speaker 13 (41:42):
Yes, so we've launched what they would call P
six instances. And so those are available in AWS today
and customers are using them and liking them, and the
performance is fantastic. So those are available today. We're continuing
to ramp capacity. We work very closely with the Nvidia
team to aggressively ramp capacity, and demand is strong for
(42:02):
those P six instances, but customers are able to go
and test those out today. And like I said, we're
ramping capacity incredibly fast all around the world and in
our various different regions.
Speaker 2 (42:15):
Now, what is your attitude to Claude, Anthropic's model, being
available elsewhere, on Azure Foundry for example?
Speaker 6 (42:24):
Great, I mean that's okay too.
Speaker 13 (42:26):
I think many of our customers make their applications available
in different places, and we understand that various different customers
want to use capabilities in different areas and different clouds.
Our job, and this is what we do, is to
make AWS the best place to
run every type of workload, and that includes Anthropic Claude models,
(42:48):
but it includes a wide range of things and frankly,
that's why we see big customers migrating over to AWS.
Take somebody like a Mondelez, who's really gone all in
with AWS and moved some of their workloads there.
One of the reasons is that they see that we
have capabilities, sometimes using AI by the way, in order
to really help them optimize their costs and have the
(43:10):
most available, most secure platform. In Mondelez's case, they're
taking many of their legacy Windows platforms and transforming them
into Linux applications and saving all of that licensing cost.
But we have many customers who are doing that, and
so our job is to make AWS by far the
most technically capable platform that has the most and widest
(43:32):
set of services, and that's.
Speaker 6 (43:34):
What we do.
Speaker 13 (43:35):
But I'm perfectly happy for other people to use it. Like,
it's great that Claude's making their services available elsewhere, and
we see the vast majority of that usage happening on AWS,
Speaker 2 (43:44):
though. Will we see OpenAI models on AWS this year?
Speaker 6 (43:49):
Well, just like you.
Speaker 13 (43:49):
know, we encourage all of our partners to be able
to be available elsewhere; I'd love for others to take
that same tack.
Speaker 2 (43:58):
Let's end it with this question from the audience, actually,
which is, where are you going to grow data center capacity
Speaker 3 (44:03):
around the world?
Speaker 2 (44:04):
I got a lot of questions from Latin America and
Europe in particular, where Jensen flies to next week.
Speaker 6 (44:10):
Yeah? Great.
Speaker 13 (44:12):
So in Latin America we're continuing to expand our
capacity pretty aggressively.
Speaker 6 (44:17):
Actually.
Speaker 13 (44:17):
Earlier this year we launched our Mexico region, which has
been really well received by customers, and we've announced a
new region in Chile, and we already have for many
years have had a region in Brazil which is quite
popular and has many of the largest financial institutions in
South America running there. So across Central and South America,
we are continuing to rapidly expand. In Europe, we're expanding
(44:40):
as well. We have many regions already in Europe. One
of the things I'm most excited about actually is at
the end of this year we're going to be launching
the European Sovereign Cloud, which is a unique capability that
no one has, which is completely designed for critical EU
focused sovereign workloads, and we think given some of the
concerns that folks have around data sovereignty, particularly for government
(45:03):
workloads as well as regulator workloads, we think that's going
to be an incredibly popular opportunity for everybody.
Speaker 3 (45:10):
Matt Garman, AWS CEO, thank you.
Speaker 6 (45:13):
Very much, thank you for having me.
Speaker 2 (45:16):
Let's get other headlines in Talking Tech. First up, TikTok
Shop is cutting several hundred jobs in Indonesia in its
latest round of cuts, this after taking over the operations
of local rival Tokopedia last year. Sources say the cuts
are mainly in e-commerce teams, and more cuts are
set to happen as soon as July. Plus, Reform UK
leader Nigel Farage announced plans to introduce a Crypto Assets
(45:39):
and Digital Finance Bill if his party wins the next
general election, aiming to launch what he calls a crypto
revolution in the UK. The legislation would include a cut
in capital gains tax on crypto investments to ten percent,
the creation of a bitcoin digital reserve at the BOE,
and provisions that would make it illegal to restrict services
for people who want to pay with crypto. And Stripe
(46:01):
has held early discussions with banks about the potential use
of stablecoins. This comes as the payment firm debuted
a number of stablecoin-related products
Speaker 3 (46:09):
In recent months.
Speaker 2 (46:11):
That does it for this edition of Bloomberg Technology. What
a week it's been. Don't forget to check out the podcast.
You can find it on the terminal as well as
online on Apple, on Spotify, and on iHeart. From San Francisco,
this is Bloomberg Technology.