All Episodes

October 7, 2025 • 45 mins

AMD CEO Lisa Su and OpenAI President Greg Brockman join Bloomberg Tech to weigh in on the two companies' deal to roll out AI infrastructure, a pact the chipmaker said could generate tens of billions of dollars in new revenue. Plus, White House AI and Crypto Czar David Sacks defends the Trump administration’s approach to China in the context of the global AI race, and Meta CMO Alex Schultz has a new book on how AI is changing the advertising landscape.

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Bloomberg Audio Studios, podcasts, radio news. Bloomberg Tech is live
from coast to coast with Caroline Hyde in New York
and Ed Ludlow in San Francisco.

Speaker 2 (00:19):
This is Bloomberg Tech. Coming up: AMD signs a
historic deal with OpenAI to roll out AI infrastructure,
a pact the chipmaker said could generate tens of billions
of dollars in new revenue.

Speaker 3 (00:33):
Plus.

Speaker 4 (00:33):
We speak with the White House AI and Crypto Czar
David Sacks about relations between the US and China in
the context of the AI race.

Speaker 2 (00:40):
And Meta's CMO joins us to discuss how AI is
changing the advertising landscape and what that means for small businesses.

Speaker 4 (00:48):
Here and now we turn our attention to the broader markets.

Speaker 3 (00:50):
Bitcoin at a record high. The Nasdaq—

Speaker 4 (00:52):
One hundred at a record high. We're up six tenths of
a percent, the mood music around AI infrastructure, and the
frenzy therein. Ed, it continues. I'm looking at the SOX
up three point five percent, a new record high for
the index that tracks the semiconductor industry.

Speaker 3 (01:07):
But what is leading the gains? Of course, you know,
it's one key name.

Speaker 4 (01:11):
We look at AMD at one point hitting a record
high. We are up the most on this stock in
nine years, a twenty eight percent surge. At one point it
was thirty seven percent higher. All as we hear that
tens of billions are going to be added in terms
of revenue because of the OpenAI infrastructure pact, a sign
of real confidence in Lisa Su's AI accelerator and build

(01:31):
out. Ed, we've got such an interesting conversation coming up
about this new deal.

Speaker 2 (01:39):
Welcome to Bloomberg radio and TV audiences around the world.
AMD has signed a definitive agreement with OpenAI to
deploy six gigawatts of AMD GPUs. AMD says it will equate
to tens of billions of dollars in revenue. OpenAI
will get up to one hundred and sixty million AMD
shares in tranches, set against both operational and financial milestones.

(02:00):
The focus is inference. Let's bring in AMD
CEO Lisa Su and OpenAI President Greg Brockman, both
of whom join us on set here at Bloomberg Tech
in San Francisco.

Speaker 5 (02:09):
Good morning, Good morning, It's great to see you here.

Speaker 2 (02:12):
Let's frame the opportunity, Lisa. You know, the market
reaction is very clear, but for AMD and
the AI industry at large, what do you think this represents?

Speaker 6 (02:22):
Well, look, this is a huge milestone for AMD.
You know, we are so thrilled with the partnership with
the OpenAI team, and it's also, you know, a
huge moment for the AI industry, because, you know, when
you get right down to it, you need
more AI compute. I mean, that's where we are today.
Compute is a foundation for all of the intelligence we
can get from AI. And, you know, we are a
compute provider. We have spent years on our roadmap. We've

(02:45):
spent years working with OpenAI and the team, and,
you know, together now we're embarking on, you know, a
massive build out of six gigawatts of AI compute, and
it's a big deal for us, for our shareholders,
for our teams, and for, you know, the partnership and
the overall AI ecosystem.

Speaker 2 (03:03):
Greg, I said at the top the focus is inference.
I think it's really important to be specific about what
you will do with this capacity. So literally explain
that part. And I'm conscious that, you know, in the
first instance, the first target is one gigawatt and then
eventually six gigawatts, but what will you use it for?

Speaker 7 (03:20):
Well, I think that the world continues to underestimate the
amount of demand for AI compute. Right, we've seen
this explosion of demand with things like ChatGPT. You know,
we're at eight hundred million weekly active users now. This
product didn't even exist three years ago, and we're in
a position where we cannot launch features, we cannot launch
new products, simply because of lack of computational power. And

(03:42):
we see these models continuing to get exponentially better, and
I think we're just heading to a world where so
much of the economy is going to be lifted up
and driven by progress in AI. And so we're very
much heading to a world, by default, that I think
looks like a compute desert, right, that there's just not
enough compute to go around, and so we're trying to
build as much as possible, as quickly as possible. So
we're starting with one gigawatt simply because you've got to
start somewhere. But honestly, we're building as fast as we

(04:04):
possibly can and trying to bring as much computational power
to bear for the economy and for the world.

Speaker 4 (04:10):
Lisa, this is such a big commitment to Instinct in
particular. As a customer, does it make OpenAI the
largest for that particular product?

Speaker 6 (04:20):
Well, this is certainly the largest deployment that we have
announced by far. I mean, you know, six gigawatts of compute.
As Greg said, we're going to start with the first
gigawatt in the second half of twenty twenty six on
our new next generation MI450 chip.
I think the thing to understand is, you know, these
types of partnerships actually take, you know, years to really

(04:40):
get comfortable with the idea that we're going to, you know,
go all in together. And this is an all-in partnership
in terms of building out, you know, the AI compute
that OpenAI needs for everything that they're offering to
the world. So yes, it's a huge deal, and it
also says a lot about, you know, how much needs
to come together for, you know, this entire ecosystem to operate.

(05:01):
So, you know, we are setting up, you know, certainly
there's a lot of engineering work, but our teams are
working together on hardware, software. We're ensuring the supply chain,
all of those elements are set up and ready to
deliver on this massive commitment.

Speaker 4 (05:16):
Greg, talk us through a little bit the players
that you need to also lean on. This has been
years in the making, as you say, with AMD, but
what other cloud providers were involved? How are you thinking
about this working with an Oracle, all others out there?

Speaker 7 (05:29):
Yeah, we really think of this as an industry wide effort,
and in general, we think that compute is something that
does require the entire supply chain to really wake up
and to really start building much more than people
were planning on. I think this starts from energy, to
try to get far more power to be built.
Things like nuclear I think are going to be very
important to come online. The cloud providers are an important

(05:50):
part of this as well. So we're going to be
deploying AMD in our own data centers. We'll be deploying
them together with cloud providers. You know, we have a
deal with Oracle, lots of other cloud providers out there.
You can really see that we just want compute,
as much compute as possible. We
think this is important for the economy, we think this
is important for the nation, we think this is important
for humanity, and so really we're working with everyone in

(06:11):
this whole industry in order to get as much compute
power online as quickly as we can.

Speaker 2 (06:16):
Lisa, I'm sorry, specifics: where is this data center going
to be? Is it one single site? Is it Oracle
that will partner with you on this?

Speaker 6 (06:28):
Well, actually, what this really is is an announcement of
what, you know, AMD and OpenAI are
going to do together. You know, OpenAI has a
lot of partners in terms of, you know, where they deploy.
I imagine a lot of it will be in cloud
service providers. It's really up to, you know, OpenAI
and Greg and Sam and the team. But the way
to think about it is, for this, you know, amount

(06:49):
of compute, it's going to have to be in a
lot of different places.

Speaker 5 (06:52):
It's a massive amount, multiple locations.

Speaker 6 (06:54):
Multiple locations, I would imagine, you know, multiple providers, to
really get this online as fast as possible.

Speaker 2 (07:01):
Greg, there is a lot of focus on where OpenAI
is going to get the money from to fund
all of this. Sam Altman's big picture commitment is well documented,
right, and the numbers to his mind are in the trillions.
But have you specifically thought about debt financing for this
relationship with AMD? Have you thought about doing a specific

(07:24):
equity raise. You are very committed across multiple projects.

Speaker 8 (07:28):
Yeah.

Speaker 7 (07:28):
Look, the way that I would look at this is
that AI revenue is growing faster than, I think, almost
any product in history, and that ultimately, at the end
of the day, the reason this compute power is so
important, so worthwhile for everyone to build, is because
the revenue ultimately will be there.

Speaker 5 (07:45):
Now.

Speaker 7 (07:46):
As a company that is trying to move as fast
as we can, we look at everything, right? We look
at equity, debt, we look at trying to find creative
ways of financing all of this. That's been actually a
huge focus of ours for the past couple of years,
thinking about how can we possibly build the amount of
compute that is required in order to really transform this
whole economy into an AI-powered economy. And so I think

(08:08):
you'll see lots of creative ideas, but fundamentally, I think
at the end of the day, it is because we believe.

Speaker 2 (08:14):
Sorry to jump in and interrupt, Caroline, just forgive
me on this one. The condition of AMD issuing the
stock to OpenAI requires you to spend money, basically,
because you have to deliver that gigawatt of capacity first. Lisa,
I have to ask you if you have assurances that OpenAI

Speaker 5 (08:33):
Is good for it.

Speaker 6 (08:34):
Well, let me be clear. I mean, this deal is
a win for AMD, it's a win for OpenAI,
and it's a win for our shareholders. And that's kind
of the way we put this together. I have full
confidence in OpenAI, Sam, Greg, Sarah. I mean, this
is a massive opportunity for us right now, right here.
It's about who has the most compute and how fast

(08:55):
can we get it online? And we're committing to doing
this together. And the fact is, as OpenAI
buys chips, that's great for AMD. Our revenue goes up,
our earnings go up. You know, we expect that it
will also be very, very accretive to our shareholders from
day one. And as we do that, you know, we're
very happy to have OpenAI as a deep partner

(09:16):
and we win together. So it's like a virtuous positive
cycle in how we build out, you know, this big
vision for having all this compute out there, right?

Speaker 4 (09:24):
And yet we still question, as you were just talking about, Greg,
some of the other supply chain elements.

Speaker 3 (09:29):
You're talking about the need for nuclear for power.

Speaker 4 (09:31):
What's really interesting is, are you feeling confident enough
about the rest of the compute supply chain? Is
this going to be US manufactured, from your perspective?
Or are you looking at also building out internationally with AMD?

Speaker 7 (09:44):
Yeah, we've been looking at really all options. Our preference,
and really the core thing that we try to do,
is build as much as possible in the US. And
you can see the commitments that we've made over the
past year, you know, five hundred billion dollars of investment
in the US, and that's not stopping. We're continuing to build.
I think the international, there, it is also going to
be important for the world to have compute. I think

(10:05):
that compute is going to become this, like, national security
strategic resource, and every country is going to need computational
power, and so we are really not limiting our
sort of sights in terms of where to build. But
we do think it is important that the US leads
in this technology, leads in computational power, and we're expanding
the supply chain. But you can see that we've really
been working with partners across the globe in order to

(10:26):
actually meet the demand that we expect to be coming in
upcoming years.

Speaker 4 (10:30):
Lisa, the manufacturing of these chips: will you look
to Intel at all for it, do you think, in the future?

Speaker 6 (10:34):
Well, as you know,
the supply chain is something that we work on, you know,
very, very meticulously. I think we have a very strong
supply chain. We're certainly deeply partnered with, you know, TSMC
across the supply chain. You know, just to that earlier question,
we're absolutely prioritizing building in the United States because I
we're absolutely prioritizing building in the United States because I

(10:55):
think that's super important. This is the US AI stack.
We want to have as much of it in the
US as possible. And you know, we continue to really
look at, you know, how do we ensure that there
will be a strong supply chain, you know.

Speaker 9 (11:07):
Going forward.

Speaker 2 (11:08):
Greg, Sam posted on X that this deal with AMD
is incremental to what's already being done with Nvidia.
I've spent quite a lot of
time looking at the MI family and the newer generations
of products to come. Is there a very clear, specific
benefit to using AMD technology for inference relative to
the capabilities of Nvidia, or do you just see it

(11:31):
broadly as some sort of diversifying factor.

Speaker 7 (11:34):
Well, I would look at it this way: there's
a huge fixed cost to getting AI models running on
any platform, and so when we look at what's
out there, actually getting AI training to work is
a huge, huge amount of lift. That's something we've really
only done the work for on Nvidia, but for inference,
that's something where there's an easier

(11:55):
barrier to entry there. And one thing we found is,
I think that the work that Lisa's team have
been doing on the MI450 series, it's
looking like it's going to be a really incredible chip.
I think that the way that these things
work is there are niches for different balances of memory and
computational power, and so as we have a diversity of workloads,
we're finding that having a diversity of chips also really

(12:17):
accelerates what we're able to do.

Speaker 2 (12:18):
Lisa, at the beginning of this conversation, I said there
were both operational and financial milestones to be met, and
Greg explained you've got to start somewhere. So in the
first instance, one gigawatt. But would you just sort
of draw out the pathway to that first gigawatt?
You know, it seems like you're prepared to move quickly here.

Speaker 6 (12:35):
Yeah, absolutely, and maybe if I can just build on
some things that Greg said. I think he's absolutely right.
You know, we're a believer in there being a diversity of
workloads, and there will be a diversity of workloads across,
you know, customers, models, use cases, and from that standpoint,
you know, we feel really good about how we're positioned.
You know, we love the work here because, frankly,

(12:57):
you know, OpenAI is the ultimate power user of
our chips and tests us in very good ways.
So I think that's what gives us confidence that,
you know, the technology is there. And then to your
point about milestones, yes, I mean this is clearly a
case where we are tied to each other. The first
gigawatt of deployment is super important. We're going to start that,

(13:20):
you know, second half of next year, and we're going
to build on from there. And it really is not
just the technology, but commercial milestones, adoption milestones, and
just how we proliferate the capability going forward. But I'm
looking forward to building this as fast as possible. You know,
we're already working with a number of cloud service providers
who are also very active on our technology, and I

(13:41):
think this is a great catalyst to get the industry
to build faster.

Speaker 4 (13:46):
Tied to each other is such an interesting turn of phrase.
And Greg, we are seeing more AI users and chip
makers and designers becoming more financially tied to each other.

Speaker 3 (13:57):
Is this going to continue?

Speaker 4 (13:59):
Is this the path forward for how you see this
financing going?

Speaker 7 (14:03):
Well, I really see the world transitioning to this AI-powered
economy. And the interesting thing is, within OpenAI,
we've really seen what it's like when your progress
is limited and accelerated, as two sides of the coin,
by computational power. Teams within OpenAI, their
ability to deliver really is tied to the amount of

(14:23):
compute that they get. And I think we're heading to
a world where that is how the whole economy will function.
And we're starting to see it right that people having
access to better AI tools. If you're a coder, you're
able to do far more if you have access to
better AI models. And we're heading to a world where
if you can have ten times as much AI power
behind you, you will probably be ten times more productive.

(14:44):
And so I think that we're moving to a world
where the whole industry is waking up to the fact
that we have just not planned. We have not planned
for this moment where this explosion in AI demand is happening.
So it's happening all the way from the power to
the silicon, and I think this whole industry has
to find a way to actually rise to meet the occasion.

Speaker 2 (15:03):
Lisa, you have given us a look into the future
before about how you see the total addressable market of the
industry. Now that the ink is dry with OpenAI
and Greg, are you rethinking either your bigger picture analysis
of the market for AI accelerators and GPUs, or do
you see AMD now having an improved position in that

(15:24):
market, relative of course to your friends at Nvidia?

Speaker 6 (15:27):
Well, again, I think, and I've told you before, I
believe that this is a huge market. You know, we
have sized just the AI accelerator TAM at, you know,
over five hundred billion dollars over the next
few years. I think some might say, you know, maybe
I was a little conservative in that TAM analysis, but
the way to think about it is there's so much

(15:49):
need for compute. I mean, you just heard it from Greg.
So, you know, this is a huge pie and you're
going to see the need for, you know, more players
coming into it. And, you know, from my standpoint, this
is a big validation of our technology and our capability.
You know, as much as we love the work with OpenAI,
we're working with a lot of other customers as well.
There's a lot of excitement in the industry around the

(16:11):
MI450, so we're ready for it.

Speaker 4 (16:14):
AMD CEO Lisa Su, OpenAI President Greg Brockman, it's been
a joy having you on the show.

Speaker 3 (16:19):
Thank you both very much.

Speaker 4 (16:21):
Thank you. And coming up, more on AMD and its
impact on the broader tech markets. We've got a key
investor for you: Tony Wang of T. Rowe Price.

Speaker 3 (16:28):
This is Bloomberg Tech.

Speaker 4 (16:36):
Let's stay on that AMD OpenAI story and its impact
on the broader tech markets. Tony Wang, T. Rowe Price
Science and Technology Fund portfolio manager.

Speaker 3 (16:44):
You have exposure to AMD.

Speaker 4 (16:46):
It's added more than seventeen billion dollars in market cap
on one day alone.

Speaker 3 (16:49):
What does this deal signal?

Speaker 10 (16:52):
Yeah, well, I think it's a really exciting deal here
for AMD, and it validates their roadmap and their silicon and software.
I think it gives them a really big lead customer
to really scale, and I think that this will probably
attract other customers to their roadmap and continue to build
out ROCm.

Speaker 8 (17:11):
And I've always had this like thesis.

Speaker 10 (17:13):
That the TAM is really big, and if companies can
deliver on that compute, there's a lot of
companies that can win in the overall compute TAM.
So overall, very exciting, and, you know, I think it's got
a good setup for the next few years for the
company's runway.

Speaker 4 (17:31):
I mean, Lisa Su just telling us that maybe she's
being conservative when she puts the AI accelerator TAM
at five hundred billion dollars.

Speaker 3 (17:37):
Tony, I'm interested.

Speaker 4 (17:38):
Though, at OpenAI telling us, Greg saying that they're
looking at all kinds of ways of financing this.

Speaker 3 (17:43):
How are you feeling.

Speaker 4 (17:45):
About this knitting together of purchasers and creators, designers of chips?

Speaker 10 (17:52):
Yeah, well, I think that, taking a step back over
the last kind of ten years, everybody's underestimated this TAM.

Speaker 8 (17:58):
And I think what's.

Speaker 10 (17:59):
Exciting about AI is that there are a lot of
productivity use cases here, and we're starting to see AI
agents really form, and then I think physical AI. So
I think as long as, like, the end use cases
are delivering a lot of value, and there's promise there,
I think that's what matters in terms of the ROI.
And I think a lot of these kind of contracts,
I mean, they are backstopped by really blue chip companies

(18:21):
like Microsoft or Oracle.

Speaker 8 (18:23):
So in terms of you know, how I'm feeling.

Speaker 10 (18:25):
About the financing, I think it continues to be
probably a good, attractive way to do it, as long as
the use cases are intact.

Speaker 2 (18:32):
Tony, the mechanics of the deal are that OpenAI
has the right to buy up to one hundred and
sixty million shares of AMD at a penny apiece, but
that right is contingent on the operational and financial milestones that
we discussed. It is not circular financing; it's something different.
But how comfortable are you with that mechanism?

Speaker 8 (18:55):
Well, I think it could make a lot of sense, right.

Speaker 10 (18:57):
I mean, it puts the two companies in co-development
and partnership, and I think there are shared economics, and
it's a win-win for both companies. I think some
of the prices, the warrants, are struck at a pretty,
you know, high price on the AMD stock. So
I think it's good for both sides, and if they win,
they win together.

Speaker 8 (19:18):
So to me, I think it can make sense.

Speaker 2 (19:21):
Does this alter AMD's position in the AI accelerator market
in other words, they're closing the gap a little within video?

Speaker 8 (19:31):
Yeah, well, I.

Speaker 10 (19:32):
Think that both companies are doing really well and they're
doing things a little bit differently.

Speaker 8 (19:38):
But I think that you're.

Speaker 10 (19:39):
alluding to kind of the market narrative of AMD was
kind of in between Nvidia on the
GPU side and then Broadcom on the custom side, and
I think this can help lift the narrative to go
from, like, a second place GPU player to a co-development

Speaker 8 (19:55):
partner of OpenAI on the AI compute side.

Speaker 10 (19:59):
So I think it positions them well, gives them credibility,
scale, and additional kind of co-development R&D
to improve their roadmaps. So to me, it definitely
is a big positive. And I don't think it's a
zero sum game. I mean, I think it just further
signifies the demand for AI computing. It's good for the
country to build this much capacity because I do think

(20:22):
the end payoffs are immense.

Speaker 4 (20:24):
I mean, Tony, we're at a new record high for the
Nasdaq 100. At one point, AMD was at
a record high. The valuations climb ever higher. When
I'm looking at your top holdings, Apple, Alphabet, Nvidia,
Meta, do you feel confident that these are going to
continue to gain in value with this AI story?

Speaker 10 (20:42):
I think there's a good long term story behind all
our holdings, and I think a lot of them are
beneficiaries of AI. They're strong companies that are great
platforms, and as a result, I think that, you know,
we are entering this inflection in AI driving a ton
of productivity. You think about, you know, output being productivity

(21:03):
and labor. I think productivity is going up and
labor is going to be relatively uncapped, I think, as
a limiting factor. So to me, I think it's exciting.
It's never been a better time to be a tech investor.
I think there's just so much change and dynamism, so
I'm excited for the multi-year here.

Speaker 2 (21:20):
Tony, if you'll humor me please: my column
today was about all of the debt deals that are
underpinning the AI infrastructure build out that we're seeing. You know,
everyone seems to have a different comfort level with that
as well. How sanguine are you about the role of
debt in what we're seeing?

Speaker 10 (21:39):
Yeah, I think that you know, the key signal here
is, you know, what is this AI capacity, this
AI infrastructure capacity

Speaker 8 (21:47):
Worth, and what's the useful life?

Speaker 10 (21:49):
I think a few years ago we were debating: is the
useful life for a GPU data center three or four years?
And I think that was just too short. You look
at a lot of the GPUs, you know, from
seven, eight years ago. They're still being fully utilized because
the AI workloads continue to evolve. And then when you
fully depreciate these assets, they're at a price

Speaker 8 (22:09):
point that could make sense for other LLMs to be running
on them.

Speaker 10 (22:13):
And so I think that, you know, the useful life
of these GPUs is definitely more than four or five years.
And, you know, there's just more and more use cases.
You think about robotics, like, you know, diagnostics, simulation. There's
just a ton more than we thought there were three
years ago. And I think that's what will keep the

(22:34):
utilization high of these data centers. As long as that,
you know, continues, I think it makes a lot of sense.

Speaker 3 (22:39):
Tony, a good pivot: another part of your portfolio.

Speaker 4 (22:42):
Verizon, change at the top: Hans Vestberg handing over
the reins to Dan Schulman. How do you feel about it?

Speaker 8 (22:48):
Can you come again on that question?

Speaker 4 (22:51):
With Verizon, we are seeing executive changes at the top
of certain companies.

Speaker 3 (22:55):
How do you feel, for example of.

Speaker 4 (22:57):
Verizon seeing Hans Vestberg handing over the reins today.

Speaker 10 (23:01):
Yeah, actually we don't own Verizon, so that's probably
a little bit outside of my wheelhouse

Speaker 8 (23:05):
to comment.

Speaker 2 (23:09):
Tony Wang with T. Rowe Price. We're trying to get you
on all the news of the day, but we really
appreciate the insight into the AMD OpenAI deal.

Speaker 5 (23:16):
Thank you very much, Caro.

Speaker 3 (23:17):
Time now for talking tech.

Speaker 4 (23:19):
First up: Apple, Ed, is facing an investigation in France
over the storing of voice recordings made by its Siri assistant.
The probe is related to claims that subcontractors had access
to sensitive recordings. The iPhone maker says it uses Siri interactions
to improve services and only stores them if users opt in.

Speaker 3 (23:37):
Apple declined to comment on the investigation.

Speaker 4 (23:39):
Plus, Foxconn reported sales growth of eleven percent in
the third quarter and projected further sales growth this quarter.

Speaker 3 (23:45):
The gains by the Nvidia partner, officially named Hon Hai,

Speaker 4 (23:47):
signal that demand for AI chips and servers remains strong.
And Tesla shares jumped after the company posted cryptic videos
teasing a possible product unveil. Tesla's last addition to its
product lineup was the Cybertruck, two years ago.

Speaker 3 (24:01):
Executives have said an affordable Model Y is in the works.
Welcome back to Bloomberg Tech. We check in on
these markets.

Speaker 4 (24:13):
New record highs. "Uptober," apparently, it has been coined, certainly
looking at the crypto world. Bitcoin, one twenty-five
two thirty, up two percent. We had a
risk-on rally throughout the weekend. That risk-on rally
continues into Monday with the Nasdaq 100 up
seven tenths of a percent.

Speaker 3 (24:27):
And you know why.

Speaker 4 (24:28):
There is a key player at the moment that is
helping us push up and to the right. It is
AMD, up twenty-six percent at one point, hitting a
record high, up the most in nine years. An extraordinary new
deal: soaring, of course, after its deal with
OpenAI for AI infrastructure that can generate tens of
billions of dollars in new revenue.

Speaker 5 (24:47):
Ed.

Speaker 4 (24:47):
We had a great conversation with Lisa Su and Greg
Brockman a moment ago. Another great conversation coming up.

Speaker 5 (24:52):
Yeah, really looking forward to this one.

Speaker 2 (24:54):
Joining us now is the White House AI and Crypto
Czar David Sacks. He's also the co-founder and partner
of Craft Ventures, of course, as well. David, there
is something that I've wanted to talk to you about
for a long time, and we will get to it:
the debate around should the United States or should the
United States not export some form of AI chip to China.
And we will get to it. But I'm sure you

(25:15):
can appreciate, you know, the AMD and OpenAI agreement
is very interesting, and so as a place to start,
could I just ask your sort of interpretation of what
it signals when you have a US chip maker like
AMD, after Nvidia, and one of the frontier model shops
like OpenAI working so closely together on infrastructure?

Speaker 11 (25:37):
Well, I think the main thing here is this:
there's an AI boom going on and the market is
highly competitive. We have a number of leading model companies,
we have a number of leading chip companies, and they're
all competing with each other and they're all booming. And
what we see here is that this AI boom just
keeps going and going and keeps driving businesses to new highs.
And this is all a result of President Trump's pro innovation,

(26:00):
pro-export, pro-AI policy. We saw this help drive
US GDP to a three point eight percent growth rate
in the second quarter. So this boom
just seems like it's going to keep going and going.

Speaker 4 (26:12):
Just justson One was the latest on a podcast shouting
you out, shouting out those that are in the team
at the White House at the moment guiding when it
comes to technology, David. But I'm interested as to whether you,
as an investor, have great comfort in these relationships being built,
financial relationships between chip designers and the chip users.

Speaker 11 (26:32):
Well, I think it's up to them. You know, I
don't really take sides in these deals. We want all
of our American AI companies to be successful, and they're
all competing with each other and cooperating with each other.
I guess it's called coopetition, and that's a great thing
to see. We just want, you know, putting my AI
hat on, we just want these markets to be competitive,
and we want American companies to be successful, and that's

(26:54):
what's happening right now.

Speaker 4 (26:56):
I'm looking, therefore, at the future rollout, the build out.
Anything that gives you pause in terms of the actual
infrastructure needs here at the moment, David?

Speaker 11 (27:04):
Well, anytime you're working in the world of atoms,
it's going to be more complicated than when you're working
in the world of bits, and so it's very important
for us to scale up the amount of power that's
available to AI companies. And you're seeing that thanks to
President Trump's policies that are pro energy. The President backed

(27:24):
the idea of drill, baby, drill going back many years.
I think he was very far sighted in this regard.
He understood that energy is a basis for everything. It's
certainly the basis for this AI boom. And we're in
the process of allowing a lot more power generation in
the US. President Trump is allowing so called behind the
meter power generation. In other words, these AI companies can
stand up their own power generation. We need to squeeze

(27:46):
more out of the grid, and we're enabling new oil
and gas and nuclear so all the above. So I
think that energy is the main thing, and President Trump
is supporting that.

Speaker 5 (27:58):
David.

Speaker 2 (27:59):
Final question on AMD and OpenAI. Did the
parties consult the White House about moving forward with this
arrangement and their plan to work together?

Speaker 8 (28:09):
No?

Speaker 11 (28:09):
No, I mean not with me. And I wouldn't expect
them to. We just don't get involved in the deal
making between private companies. Again, We're here to create a
policy environment that is supportive for all of our AI companies.

Speaker 5 (28:22):
I appreciate that.

Speaker 2 (28:22):
Okay, So, David, I asked you to come on the
program because we spoke at the end of last week
for a story that was done out of DC on
the idea of well, what is a China Hawk? But
also within that broad description, should the United States or
should the United States not export deprecated AI chips to China.

(28:45):
You've been very generous with your time, but I just
invite you again for our Bloomberg Tech audience. Explain how
you view yourself in this debate and what you think
is strategically most important for this country.

Speaker 11 (28:58):
Well, first of all, I would say that I consider
myself to be a China Hawk. I want the US
to win this AI race. We understand that China is
our main competition globally in this AI race, and we
want to do everything we can to win. And President
Trump and his AI speech on July twenty third laid
out some of the tentpoles of that strategy. We need

(29:18):
to be pro innovation, We need to be pro infrastructure
and pro energy, and we need to be pro exports
so that the American technology stack dominates the world. So
this is all in the service of the United States
winning this AI race. We understand that it's going to
have major economic and national security ramifications, and we're in
it to win it.

Speaker 2 (29:38):
There is a lot of focus on the direct to
China part. But the other way that some look at
it is there is a marketplace outside of America and
outside of China, the Middle East being an example. What's
your position on that and whether the United States wants
to kind of leave the world open to China to
sell its own technology.

Speaker 11 (29:58):
Well, there was a view in the previous administration that
we shouldn't sell chips to many countries, including the resource
rich Gulf States, and I think that was a major
mistake because every time you tell a country that they
can't buy the American tech stack, what's their reaction going
to be? They're going to turn to China and adopt
the Chinese tech stack. I have a very simple metric

(30:20):
for measuring whether we're winning the AI race, which is global market share.
If we look around the world in, say, five years
and we see that the American technology stack has, say,
eighty percent market share, that means that we won. But
if we look around the world in five years and
we see that the Chinese technology stack and I'm talking
about Huawei chips and DeepSeek models for example, has an

(30:40):
eighty percent market share, then obviously we lost. By the way,
that's what happened in 5G. We don't want a repeat of that.
repeat of that. So again, the strategy here should be
for the US to dominate the world and have the
greatest market share. And I think this is pretty obvious
to everyone in Silicon Valley because we understand that the
way to win technology races is to have the biggest
eco system. If you're a technology platform, you want to

(31:02):
have the most developers using your API. If you're an
app store, you want to have the most apps in
your app store. In a similar way, we want as
many users on the American technology stack as possible. And
I find it hard to understand why the previous administration
would exclude these rich countries from participating on our tech stack.

(31:24):
It certainly didn't help us in the race with China.
If anything, President Trump's policy boxes out China from the
Middle East, whereas the previous administration's policy forced these countries
into China's arms.

Speaker 4 (31:37):
You've reworked, therefore, the diffusion rules set by the previous administration.

Speaker 3 (31:42):
But when you're allowing only less

Speaker 4 (31:45):
powerful chips into China, and I think the President even called
them obsolete versions of Nvidia's chips, does
that mean the ship of innovation in China has already sailed,
because they need to have the most sophisticated and they're
going to have to build it themselves.

Speaker 11 (31:59):
Well, but when you're talking about what we export to China,
that's obviously going to be a very complicated question, and
there's arguments on both sides. I think there's a pretty
strong argument for not selling China our latest and greatest
chips because that would be too beneficial for them. However,
if you don't sell them anything, then, like you're saying,
that will accelerate their desire to be independent of the

(32:23):
American stack. And so I do think there is a
compelling argument for selling them a let's call it deprecated
American chip or a less great American chip. And by
the way, this is why the Biden administration approved the
H twenty in unlimited quantities for export to China. Now,
when President Trump came in, he said that those H
twenty sales have to be licensed and they're subject to

(32:46):
a fifteen percent surcharge, And nonetheless, all the people who
approved the Biden policy started attacking President Trump. I think
this is a classic case of no one had a
problem with it until President Trump agreed to do it.

Speaker 2 (32:59):
David, Jensen Huang went on Brad Gerstner's podcast, and there
was a reaction to what he said. You would point
out you are an official of the government. Jensen is
a private citizen and CEO. But at the root of
what he was saying was also the idea of talent.
Many talented computer scientists and engineers come from China. I
think that was the point he was making. But the

(33:19):
personnel part of this story, where do you stand on that?

Speaker 11 (33:25):
Well, it's true that something like half the world's AI
researchers are from China, and so I do think that
we have to somehow be open to working with this
talent or allowing this talent to use the American chip
stack to some degree. So it's a complicated question always
of what you allow China to do. But I think
Jensen was making a point there about again the talent pipeline,

(33:46):
and we have to be open to it to some degree. Again,
I consider myself to be a China Hawk. But
I wasn't triggered by what Jensen said because I watched
the entire two hour podcast, not just the thirty second clip,
and I understood what he was trying to say in context,
and it is a nuanced question. And look, let me
just say to all these people who are criticizing Jensen,

(34:07):
what have you done to help us win the AI race?
I can't think of a more strategic asset in the
AI race than Nvidia and Jensen himself, who for
thirty years have been working on these GPUs and Jensen
is a source of huge American advantage in this AI race.
So let's show them a little bit of grace here.

Speaker 4 (34:24):
Context is everything. We appreciate you talking about the complexities
as well. White House AI and Crypto Czar David Sacks, great

Speaker 3 (34:30):
To have you with us.

Speaker 4 (34:32):
Now coming up, Meta's chief marketing officer, Alex Schultz,
discusses how AI is changing the advertising landscape. He joins us
next. This is Bloomberg Tech. A new book is out, Click Here,
The Art and Science of Digital Marketing and Advertising. It's
by Alex Schultz, and he says that AI is going

(34:53):
to be a game changer for advertisers if used correctly.
Schultz is the chief marketing officer, of course, of Meta. He knows
a thing or two about direct advertising and targeting it.
You also understand entirely the analytics and the raw data
behind all of this.

Speaker 3 (35:06):
Alex, welcome to the show.

Speaker 4 (35:07):
I'm so interested in how you have seen the unfolding
over the last decade or two. Incremental growth is something
that you talk about so much, having direct impact with
direct targeted ads in many ways, but then you've got
to think of how a company has got long term vision,
a north star, as you put it. How are companies
small and large managing to balance this right now?

Speaker 12 (35:26):
Yeah, thank you for having me. This is awesome to
be here. I mean, look, incremental growth is about being
incremental to what would have happened without you doing your actions.
You need to be incremental to what would have happened.
You need to see what the lift is of your work.
That fits totally with thinking long term and what is
your north star? What is the goal you want to achieve?
That is absolutely a question of like I have a goal.
How do I incrementally lift that goal?

Speaker 4 (35:48):
Your north star was writing this book, in many ways, as
well as the day to day in which you're understanding
the analytics underneath what's happening in terms
of marketing for Meta and

Speaker 3 (35:58):
The growth story. Why and how did you put this out there?

Speaker 8 (36:01):
Yeah?

Speaker 12 (36:01):
Well, I mean my north star of the book is
to be useful. Like, I want small businesses and large
businesses to be able to actually be better at using the
tools. Like, you go back to Ogilvy on Advertising, nineteen eighty
five, the Bible, absolutely incredible book, but there are new channels
that have evolved since then. Digital marketing is seventy five
percent of the global advertising industry. There's no book for that.
I kept being asked for a book for that, so
I wanted to produce a useful book to help people

(36:23):
be good at that.

Speaker 5 (36:26):
Alex.

Speaker 2 (36:26):
In the back part of the book on AI and
its impact on marketing, you talk about an audience of one,
the idea that the AI can make the ad so
targeted to the individual.

Speaker 5 (36:37):
What are the risks with that?

Speaker 2 (36:39):
You're clearly needing a lot of data about that person
to make that audience of one relevant ad.

Speaker 12 (36:45):
Yeah, I mean, from my perspective, we've now
reached the point where there are clear laws out there,
there are clear platform policies out there about purpose-use
limitation of data, and so it's very clear what you
can use the data for. And people like personalized ads,
so people have control. Young people really
understand how to manage their algorithm, and they love the
personalized ads that they get where there's something relevant that's

(37:07):
actually useful for them. So I think the balance is
actually in a way better place than it was, say
ten years ago, in terms of people's understanding of these things.
So I think the risks are much much lower, and
I think the biggest risk is not taking advantage of it.
You look at where Europe is, you look at how they
regret the current regulations, you look at the Draghi report.
They are regretting the lack of growth they have versus
the growth the US has, and I think one big

(37:28):
part of that is they've hurt advertising.

Speaker 2 (37:32):
The big picture with meta has been that all of
the work in AI has paid off for like the
bread and butter core business. Can you give us any
sort of insight into how much more valuable an AI
powered ad is what the conversion on it is relative
to traditional advertising, how it drives growth on the top line.

Speaker 12 (37:51):
Yeah, I mean you can step back completely. Our business
has been completely transformed by AI. When TikTok came along,
like five years ago, everything that you were looking at
on Facebook and Instagram was connected content you'd joined, you'd liked,
you'd friended, you'd followed. Today, the majority of Instagram and
Facebook are unconnected content. And you can see in our
earnings results what the impact of AI allowing you to

(38:14):
rank content based on semantic understanding of content and semantic
understanding of the individual does for engagement. And then you
see it too on the ads in terms of the
revenue that's coming through and the results we're providing for advertisers.
So it's very high double digit percentage uplifts if you
adopt things like advantage plus shopping campaigns, make sure you
feed the data through with CAPI and do the basics right,

(38:36):
which I describe in this book, both with meta but
also using anyone else's tools.

Speaker 4 (38:41):
What about human creativity at this moment, Yeah, one of
my struggles is creativity is always about pixels and look,
I love my creative team, they're incredible.

Speaker 12 (38:50):
We closed Fifth Avenue. We had Lewis Hamilton do donuts
on Fifth Avenue. It was amazing. Things were great and
it really really worked. But there's a ton of creativity
with data. There's a ton of creativity with targeting. There's
a ton of creativity with conversion rate optimization and flow optimization.
And what I get at in this book is creativity
in the factors where it isn't David Ogilvy saying stick
a car to a billboard with super glue to sell superglue.

(39:11):
It's creativity and how you use data to give people
amazing personalized experiences and high conversion rate flows.

Speaker 2 (39:19):
Alex Schultz, chief marketing officer for Meta, thank you very
much and tune in to Bloomberg's screen Time on Thursday
for a deep dive on the creator economy with the
head of Meta's Instagram, Adam Mosseri.

Speaker 5 (39:31):
That's what we're really looking forward to.

Speaker 2 (39:33):
Okay, coming up, we're going to get back to the
deal between AMD and open Ai and what that means
for concerns around circular financing, the build out of AI infrastructure,
and everything else.

Speaker 5 (39:42):
This is Bloomberg Tech.

Speaker 4 (39:54):
We're going to get back to the OpenAI and AMD
deal pushing broader

Speaker 3 (39:57):
markets higher, certainly pushing AMD higher.

Speaker 4 (40:00):
Joining us now is Swissquote senior market analyst Ipek Ozkardeskaya.
These ongoing deals between chip designers and ultimately those that
are purchasing the GPUs.

Speaker 3 (40:11):
What do you make of it?

Speaker 9 (40:13):
Well, actually, it's very amazing, especially for those who are
questioning the circular nature of the business and the deals
that are being announced right now. If
we look at the AMD and OpenAI deal today,
OpenAI is out striking deals with data centers and
chip makers from inside the US and outside of the
US in order to stay ahead of the game and

(40:34):
make sure that they are not constrained by capacity.
So that basically means that this company today is more
worried about not having enough supply for the huge demand
than the contrary. And that's outright positive for the AI bulls and

Speaker 13 (40:47):
The sentiment here and for AMD.

Speaker 9 (40:49):
There is nothing to say other than that their jackpot
moment looks like it has come.

Speaker 4 (40:53):
Yeah, certainly the biggest move in nine years, Ipek. What's interesting
is Nvidia did sink, pulled back a little bit,
after it had risen higher on the hopes
that the server demand was clearly there.

Speaker 3 (41:02):
But then we question market share.

Speaker 4 (41:04):
I mean, is there anything that gives you any anxiety
about Nvidia's dominance?

Speaker 9 (41:08):
No, absolutely not, and that's due to the context. As
you always say, context is everything. The sequence on which
we are receiving the information has been very insightful in
what's coming. Actually, Nvidia announced last week that they would
be investing up to one hundred billion US dollars in
OpenAI. And today it's announced that OpenAI will be
taking a ten percent stake in AMD. So there is

(41:29):
a circularity there that suggests that Nvidia chips are not
going to be replaced in the context of this OpenAI deal,
but rather complemented. Now, beyond the deal, this could give
some leverage to AMD, but it looks like the companies
are considering today that the AI pie is big enough to
feed everyone grandly.

Speaker 5 (41:48):
Ipek, for public market investors,

Speaker 2 (41:52):
Is open Ai the private company becoming some kind of
macro level factor that they have to model in.

Speaker 9 (42:01):
Well, absolutely, I mean they are so huge now, they
are the biggest startup in the world.

Speaker 13 (42:06):
They are worth five hundred.

Speaker 9 (42:07):
billion US dollars, and they do have these huge
deals with publicly traded companies, and those are the market
moving deals with market moving stocks. We are talking about Nvidia,
we are talking about AMD So definitely, open ai is
pretty much the center of this AI revolution. It has
been, well, the starting point, and it is gaining

(42:30):
importance every single day, and with each deal that they announced,
they are actually securing their position at the center of
this game.

Speaker 2 (42:39):
Should we be asking more questions about how open ai
is going to pay for all of this, then.

Speaker 9 (42:44):
Well yes, I mean Sam Altman recently said that they
will be looking at some funding possibilities that he didn't
give details about.

Speaker 13 (42:52):
But one of the questions here is.

Speaker 9 (42:55):
That ai is actually a very capital intensive place, so
this company needs funding now. Nvidia coming to the rescue
could open the way for other companies also looking to
help OpenAI and take a stake in this company
that has not yet gone public. So I think that
OpenAI, if anything, is not going to really have

(43:16):
any funding problems at this stage of the game, because
they are dominating the AI business right now. And even
though it's a private company, it does
have all the funding that it needs. Private
and public investors are just eager to take
part in this company, actually, because

Speaker 3 (43:35):
The growth story is so clear in terms of revenue.

Speaker 4 (43:38):
But when you compare thirteen billion dollars in revenue per
year and then you're thinking of the trillions of dollars it
has to spend, what gives you comfort that that revenue
will match the amount that ultimately needs to be financed?

Speaker 9 (43:50):
Well, it's definitely the reach that they have.
We have seen over the past three years other companies,
like Meta, or Chinese companies aside, many companies
right now, Perplexity, Meta, we have X, for example,
with Grok, and other models that are trying to compete with

(44:10):
OpenAI, and so far they have not been successful
in meeting the level of enthusiasm.

Speaker 13 (44:16):
That open Ai had so far. So open Ai is
actually surfing.

Speaker 9 (44:20):
It was the first to come in and it
still has this popularity and this leverage of being the
first comer. So I think that they do have a
very good leverage in expanding their business and making more
revenue in the future.

Speaker 13 (44:33):
They just need to play the game right.

Speaker 9 (44:35):
They just need to find the right partners and that's
exactly what they are doing right now.

Speaker 2 (44:41):
Ipek Ozkardeskaya of Swissquote, great to have you back
on the program. Appreciate it. That
does it for this edition of Bloomberg Tech. Caro,
tune in later today, we have more news due to
come from OpenAI. We're going to speak with the
COO Brad Lightcap on the sidelines of OpenAI's

Speaker 5 (44:58):
Dev Day.

Speaker 4 (44:59):
Gosh, a lot going on. Can't stop, won't stop, Ed. That's
it for us, we so appreciate it. Meanwhile, check
out our podcast, why don't you? There are so many
conversations you've got to dial back into. Get into Lisa Su,
get into Greg Brockman, get into David Sacks and of
course Alex Schultz.

Speaker 3 (45:13):
Of Meta as well all online. This is a Bloomberg
Tech