
May 9, 2024 42 mins

Bloomberg's Caroline Hyde and Ed Ludlow take the pulse of the world of technology at the Bloomberg Tech live event in San Francisco as they sit down with CEOs and visionaries driving change. They speak with the CEOs of Arm, Hugging Face, Writer AI and more as part of this live special. 



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
From the heart

Speaker 2 (00:02):
of where innovation, money and power collide, in Silicon Valley and beyond,
this is Bloomberg Technology with Caroline Hyde and Ed Ludlow.

Speaker 3 (00:27):
Live from sunny San Francisco, I'm Caroline Hyde.

Speaker 4 (00:29):
And I'm Ed Ludlow.

Speaker 5 (00:30):
This is a special edition of Bloomberg Technology. Coming up,
we'll bring you live coverage from our technology event as
we speak with the CEOs and visionaries that are driving
change in Silicon Valley and beyond.

Speaker 3 (00:43):
Now at this hour, we're speaking with the CEOs of Arm,
of Hugging Face, of Writer AI and more as part
of our live special.

Speaker 4 (00:51):
And a lot more to come.

Speaker 5 (00:52):
We'll push ahead to our keynote speakers later today.
That includes Adam Neumann, Whitney Wolfe Herd, Evan Spiegel and more.

Speaker 4 (01:00):
And while it.

Speaker 5 (01:00):
May not be a surprise to anyone too, did it's
certainly not a surprised to us. Artificial intelligence is probably
the overriding theme of the event. It's AI everything, even
in some cases if you're not an AI company.

Speaker 3 (01:12):
Yeah, I mean I think it has to be. Even
if you're not an AI company at heart, you're thinking
about how you adapt to it, push it forward and
bring out the productivity. But ultimately, how do we live
up to the hype? The valuations are so extraordinary in
the private markets; they've been pretty heavy in the public
markets as well. Are we really seeing the level of
productivity and growth and real use cases?

Speaker 5 (01:29):
The thing is that these events are always very interesting, very engaging.
Everyone's very positive. But I would say in the background,
there's a talent war, frankly, and people are running out
of cash.

Speaker 4 (01:40):
So we've got to ask those difficult questions too.

Speaker 3 (01:42):
And geopolitics: how are you navigating the issue of China
as well? So, so much to talk about in here
and now. We're talking about one key infrastructure play
when it comes to artificial intelligence. We're of course going
to be talking about the chip design firm ARM, which
has really bounced off lows in terms of its share price
throughout the trading of today. We're currently, over the last

(02:03):
two days, down by one-point-something percent, coming out
after the bell with its earnings. A tepid forecast ahead was
what the market seemed to be focusing in on, even
though they absolutely smashed it in terms of their fiscal
fourth quarter numbers and their first quarter look-ahead.
Let's dig into some of that caution with Rene Haas,
the ARM CEO. Rene, wonderful to have time
with you. And look, there does seem to be a

(02:24):
worry about your full-year forecasts. Are you being cautious?

Speaker 2 (02:30):
Well, thanks both for having me, Ed and Caroline. We
just came off a record year in terms of revenue.
We were up twenty percent, a little bit over twenty
percent, our fiscal twenty-three from twenty twenty-two, and
we're actually forecasting even higher growth this year, north of
twenty percent. And we also signaled to the markets yesterday
that in twenty-five, twenty-six, twenty-seven, we see

(02:53):
that growth continuing. So we have incredible visibility into our
business, and we're very, very confident of the growth rate going forward.

Speaker 5 (03:04):
We're just seeing your shares actually ticking into positive territory,
Rene, up now six-tenths of one percent. The underlying
story is the build-out in AI infrastructure, right? We're
talking about data centers powered by GPUs. Your numbers
were good. Tell me about the underlying demand, then, about
the long term, and the addressable market you think is

(03:26):
either intact or is not.

Speaker 3 (03:30):
Well.

Speaker 2 (03:30):
I think this AI build-out, as you describe it, or, maybe
said another way, just expanding capacity to run these foundation models,
to do more and more training, to do more and
more inference.

Speaker 3 (03:41):
We really are only.

Speaker 2 (03:42):
At the very beginning, because when you start to think
about the capabilities that this could unleash, whether it's around healthcare,
pharma research, productivity gains, call centers, we're still in the
very, very early days. That all starts with having to
do this level of training and inference in the cloud,
but it ultimately will find itself in every single edge device,

(04:04):
whether that's a PC, your smartphone, your car. And
all those devices I've mentioned, from the data center
to the edge devices, they all run on ARM. So
we have incredible visibility into where this is all going,
which is why we're very confident in the growth rates.
Then also, one of the big problems you've got with
all of these AI data centers is around energy and power.

(04:26):
So power efficiency being so key, it's what ARM is
really good at. Increasingly we're seeing the most complex applications
moving to ARM, and the most sophisticated training chip on the
planet that was just announced, Grace Blackwell, well, that's based
on ARM.

Speaker 3 (04:44):
Okay, so you're managing to really think that you're going
to be the server play as well as the PC play,
the cell phone play. And I want to focus in
on the cell phone play, Rene, because that's where
your bread and butter has been historically. How are
we looking from a smartphone perspective? Is the market looking
strong to you? We've had many a mixed message coming
from China, for example.

Speaker 2 (05:05):
Overall, what we've seen in the smartphone market broadly for ARM
has been quite a good growth rate in terms of royalties.
Our version nine, which is now being used in many
of the premium mobile phones, drives a higher royalty
rate for ARM. There are also more complex CPUs that go
into that; that's also better for ARM. And going forward, Caroline,
one of the things that we're seeing, and it's not

(05:26):
just in smartphones, is that as these AI models are
moving so fast, the hardware can't keep up with the software.
The software innovation is happening so quickly that by the
time the hardware is ready to run those models, everyone
wishes they had more performance, they had more efficiency.

Speaker 4 (05:45):
So what does that mean for ARM?

Speaker 2 (05:46):
It's driving growth in our licensing activity. People are looking
to do more and more design chips faster and faster,
and that's all good for us going forward. So I
think going forward you're going to see more and more
innovation happening, not only in smartphones, but across all these
edge devices.

Speaker 3 (06:05):
What's interesting, in a way, is it's hard to keep up
with the pace of geopolitical change as well, the latest
news coming that Huawei, of course, is not going to
have access to Qualcomm or to Intel chips. You are,
of course, a UK-based company, but are affected by
US policies. Has this impacted your business, the limitations on
Huawei's access to chip design, to chip technology, to licenses?

Speaker 2 (06:29):
Yeah. So that issue you referred to specifically: when
Huawei was placed on the entity list, I think twenty nineteen,
twenty twenty, companies had to apply for a license to
exempt them to ship to Huawei. So a number of
companies asked for those licenses; they got those licenses. Now
those licenses are being revoked. We don't fall into that

(06:50):
category in any way, shape or form. We didn't apply
for any licenses at the time to ship. We complied
with the export controls as they were laid out. So
it's really a non-event for us in terms of
what you're seeing with Qualcomm or Intel.

Speaker 5 (07:06):
We are speaking live to the ARM CEO, Rene Haas.
We're on the ground here at Bloomberg Tech in San Francisco.
Last week, Cristiano Amon was on the show telling
Caroline and me, this is the year of the AI PC.
You were asked about that on your earnings call last
night and you gave a slightly different answer. And maybe
it's not the year of the AI PC, more the twelve-

(07:28):
to thirty-six-month window. And you don't want to
see just one PC supplier; you said you'd like to
see two or three. What's your beef with Qualcomm?

Speaker 1 (07:38):
Now?

Speaker 2 (07:38):
When I look at the PC ecosystem, one large ecosystem
has already moved to ARM in a very big way.
Apple is now all based on ARM; all the Apple
silicon is based on ARM. And you see amazingly good
products relative to what they've delivered: fantastic battery life, performance,
thin and light, no fans. When you think about the

(08:00):
Windows market, it's a very different market. It's highly fragmented.
You have lots of different players. The ecosystem matters, the
channel matters, price points matter, high-end gaming machines versus
low-end devices that are, like, cloud-enabled. So what
does all that mean? It generally has meant that breadth,

(08:20):
vendor choice, multiple options to provide the full scope is
what matters. And what I'm hearing is, over the next
couple of years, the Windows ecosystem is going to be
able to afford that. And I think over the next
two, three years, I do believe Windows on ARM will be real.
I think you'll see multiple players, multiple price points, multiple units,

(08:41):
and I think you'll see meaningful market share that we
start to gain. The kind of performance you see in
the other ecosystem, I think, will find its way into
the Windows ecosystem.

Speaker 4 (08:52):
Rene, I wanted to talk about geography really quick.

Speaker 5 (08:55):
We're here in San Francisco, right? There's a lot
of America's R&D focus on AI-related chips.

Speaker 4 (09:01):
Are you seeing the sort of equivalent activity in

Speaker 5 (09:05):
Europe, for example? Any of your customers outside of those markets?

Speaker 2 (09:09):
Yeah, well I'm in San Francisco today too, so I
will see you a little bit later. But in general,
I think the geopolitics are something that all tech CEOs
are now having to figure out and work with AI models,
foundation models, sovereign clouds, thinking about what level of training
takes place in a country, versus outside the country where

(09:32):
the weights sit, et cetera. These are all the
kind of things that politicians have never really had to
think about in the past. So we're involved in a
lot of those conversations, whether that's in the United States,
whether that's in Europe, and really just trying to understand
it, because lawmakers in all these jurisdictions are just
trying to figure it all out. And as I mentioned before,

(09:52):
as the software and models are moving so fast, it's
difficult for everyone to keep up. But we are central
to all those discussions.

Speaker 3 (10:03):
Rene, what has been keeping up is your valuation.

Speaker 4 (10:06):
Boy?

Speaker 3 (10:07):
I mean, do you think there's too much exuberance around
AI valuations out there? Are you going to make the
most of it? We talked at one point of a
listing in the UK too.

Speaker 2 (10:19):
Yeah, you know, I don't think about the valuations as
much as I just think about the AI opportunity, which
I frankly believe is undercalled in terms of just what
it's going to mean relative to society and what it
can do for our planet. I think, again, we are
in very, very early days in terms of the capabilities
of what this can unleash for our society. Incredibly excited

(10:41):
to be part of it. But I don't think we're
part of a hype cycle at all. I think there's
a lot of innovation taking place, and, you know, frankly,
the innovation that's taking place, the inventions that we're seeing,
it's just breathtaking. So no, I don't personally view it
as a hype cycle at all.

Speaker 5 (11:00):
Rene Haas, ARM CEO, really grateful you'll actually be here
with us later today on site at Bloomberg Tech. Your
stock opened pretty low, and I think it's just a
little bit higher now during the conversation we've had.

Speaker 4 (11:11):
Thank you so much.

Speaker 5 (11:11):
All right, coming up on the program, we're going to
be joined by Clem Delangue, CEO of Hugging Face. That's
coming up next. Stay tuned, we'll be right back. This is
Bloomberg Technology. Welcome back to this special edition of Bloomberg

(11:35):
Technology, live in San Francisco at the Bloomberg Tech event,
and artificial intelligence, surprise,

Speaker 4 (11:40):
surprise, is sort of the overarching theme.

Speaker 5 (11:43):
We've got a pretty good guest to talk about that
with and discuss all things large language model: Clem Delangue,
CEO of Hugging Face. You made this prediction, which we're
going to hold you to account on, that by the
end of this calendar year, and I appreciate we're not
even halfway, open source models would be equivalent to the best

(12:04):
closed source models. Give us a status check on that
prediction, please.

Speaker 4 (12:08):
I think it's already happened.

Speaker 6 (12:10):
Open source now is better than closed source for most
use cases. We see specialized, customized models trained on companies'
data sets. I have my Meta Ray-Ban glasses here that
are powered by Llama 3.

Speaker 4 (12:26):
Right.

Speaker 6 (12:26):
We've seen so many use cases now being powered by
open source models, and most of the big tech companies
are now publishing open models. Just last month, we've seen
Apple releasing open models on Hugging Face. We've seen Nvidia,
we've seen Snowflake, we've seen Databricks, we've seen Microsoft.

(12:46):
All of them now are publishing open models.

Speaker 5 (12:48):
There may be some people in attendance who don't agree with that,
which is why I asked the question.

Speaker 3 (12:52):
Well, also, Microsoft published one and then quickly withdrew it,
some of the reporting says, because they hadn't stress-tested the
large language model enough, hadn't worked through some of the
toxicity checks in particular. How are you feeling about
the way in which large language models are growing and
the way in which governance is developing around it?

Speaker 6 (13:12):
Well, we're starting to see that the most important question
is concentration of power, right? For such an important technology,
you don't want a world where just a few companies
are controlling it.

Speaker 3 (13:22):
But that is the world we live in.

Speaker 4 (13:24):
I don't think so. I think, more and more,

Speaker 6 (13:25):
What we're seeing is that with open source you can
actually distribute power more and you want to reduce the
gap between the most powerful companies and the rest of
the world, not only other companies but policymakers, nonprofits,
academia and all of that.

Speaker 4 (13:42):
And that's the purpose of open source.

Speaker 6 (13:44):
It reduces the gap between the most powerful companies and
the rest of the world, and that's what creates kind
of like a sustainable, balanced future for AI and technology.

Speaker 5 (13:55):
It's a conversation we're going to have all day
long, and you point out that basically open source allows
more groups to go

Speaker 4 (14:02):
To work on it.

Speaker 5 (14:03):
The problem is, as we're learning, the tens of billions
of dollars it takes to train models with, yes, tens
or hundreds of billions of parameters. And then you go
lower down, and now what we're hearing is that actually
the folks doing this are running out of cash.
Are you seeing that as well?

Speaker 6 (14:23):
So it's more or less true, because, for example, now you
can use Llama 3, which has been really costly to train
but that Meta has released, and fine-tune it for a very small
amount of money. That's why, on Hugging Face, there's over
one million models that have been trained by companies, and
a lot of these companies are very small and don't

(14:43):
have, like, really, really big budgets. I feel like
today every single company has to build their own AI,
otherwise they run the risk of being left behind. And
that's what we're seeing, and it doesn't require really,
really big budgets anymore. An interesting point, though, that we're
going to see this year is that we'll need

(15:05):
to find better business models for AI companies. That's what
you kind of hinted at, something we really focus
on at Hugging Face. We are lucky to be
close to profitable, which is very unusual for AI companies.
But we're starting to see that there are some ways

(15:25):
to generate revenue and not burn insane amounts of compute
for AI startups today.

Speaker 3 (15:34):
I mean, on that profitability point of yours, how
many paying customers do you now have? Can you give
us an update? You've got a million models. What about
paying customers?

Speaker 6 (15:43):
We have more than ten thousand paying customers out of
the over one hundred thousand organizations, more than four million
AI builders, that are using our platform, and I think
we found the right balance between monetizing, especially with, like,
big companies that are using the platform

Speaker 4 (16:03):
Private enterprise companies, exactly.

Speaker 6 (16:05):
In order to fund all the free community, open source work
that we're doing, and that is always going to stay
open source and free.

Speaker 3 (16:13):
Of course, I want to go back to what I
was saying though about people running out of cash. You
actually put out a really interesting call on X, basically saying, look,
I'm here if you need me. Hugging Face is here.
If there are good people out there building interesting businesses
but you're running out of money, we could be a
home for you. Are you making acquisitions? Is it acqui-hiring
that goes on then?

Speaker 6 (16:33):
We make some acquisitions. We're going to have interesting announcements
in the next few weeks.

Speaker 3 (16:37):
Oh, don't tease us. But that's interesting.

Speaker 6 (16:39):
But I think in generative AI you're going to see
more and more M&A, because, as you said, I think
a lot of companies took very risky bets. A lot
of them are running out of money. And at the
same time you have other companies, like Hugging Face and
others, that are successful enough to be homes. Some of
this M&A is going to be weird, right? We've

(17:01):
seen that happening a little bit with some.

Speaker 5 (17:05):
A marriage of necessity rather than choice.

Speaker 4 (17:10):
Maybe.

Speaker 6 (17:10):
Yes.

Speaker 5 (17:11):
One thing that's good about summits like these, Bloomberg Tech,
by the way: we can go around the room and
ask who you're going to be shopping for. That's going
to be interesting. But you get all these people in
one place. You've also used the time that you've been
in San Francisco, because you're up in Seattle, right? Or Miami.
Miami, apologies. You've been hiring, you've been interviewing candidates.

(17:33):
Is that just a function of the best candidates being
here in this city? How wide are you casting your net?

Speaker 4 (17:39):
Yeah?

Speaker 6 (17:40):
I think, I think San Francisco is still the heart
of technology and AI, right? There's so much talent,
so many interesting companies, so many interesting big
technology companies being here, that it's important for us to
kind of, like, have a foot on the ground here.
We have a team already here, but we're also hiring
for the Hugging Face community here.

Speaker 3 (18:02):
Applications being taken.

Speaker 6 (18:04):
Yes, there's a really massive fight and struggle for AI
talent right now, with inflation of packages everywhere. But what
we're seeing is that when you have a mission
that's interesting to candidates, like ours with open source, then
you can attract really good talent. That's one of the

(18:25):
reasons why, also, we're seeing big tech doing more and
more open source. Right? If you look at Meta, with
all the great work.

Speaker 3 (18:31):
They're your favorites, you keep on. I mean, you've only really
mentioned Llama 3. You're trying to cajole Google into going
even more open source.

Speaker 6 (18:39):
I think as long as companies contribute to the world
and to the field, with open source, with open research,
I think it benefits everyone. I think we've lost
a little bit of this way in the US for the
past few years. If you look at AI five years ago,
most of it was open source and open science. It
changed a little bit when some companies started to make

(18:59):
money and changed their approach to things. But I
think it would be positive for the world to get
back to an AI domain that is more open, more transparent,
more inclusive.

Speaker 5 (19:09):
Can we ask for a calendar date now, by the way?
Because you told us you're going to announce.

Speaker 4 (19:13):
The news when you're ready. Four weeks? Yeah, we're holding you to it.

Speaker 3 (19:16):
Thanks, Clem. A joy to have you with us,
and we'll let you get to your breakfast, where he's holding
court here, the CEO of Hugging Face. What a great conversation. Welcome
back to this very special edition of Bloomberg Technology,

(19:38):
live in the heart of San Francisco, all the great
and the good of the industry, movers and shakers when it comes
to artificial intelligence, in particular the academics, but the companies
behind it, the CEOs, and notably also the investors. And
this is an interesting one for the investor base, right?
We potentially have a new large language model getting
a decent valuation.

Speaker 4 (19:57):
Yeah.

Speaker 5 (19:57):
So, I think what we reported last night is that
xAI, the AI company started by Elon Musk and
which he built out pretty quickly, is closing this kind
of mega funding round, eighteen-billion-dollar valuation. The thing
is that we've learned, right, over the last year or
more, that that is not actually that eye-watering a number. The

(20:18):
numbers involved are not that eye-watering.

Speaker 3 (20:21):
He was actually raising an awful lot, considering an eighteen-billion valuation.

Speaker 5 (20:24):
Yes, I think we've reported sort of up to six
billion dollars. The thing is that the compute costs are mega,
and I took a phone call this morning saying,
look past the cash and start asking whether xAI
has got access to the GPUs. Now, Elon Musk has
an existing relationship with Nvidia's Jensen Huang in the Tesla context,
but it's good gossip for the Bloomberg Tech event.

Speaker 4 (20:45):
It's a good thing to discuss as well.

Speaker 3 (20:47):
And ultimately, who are the investors? What we've seen, I
think, really, is the rise in twenty twenty-three and twenty
twenty-four of corporate VC.

Speaker 4 (20:54):
Yes, of course, a strategic investor.

Speaker 3 (20:56):
Yeah, you've got Sequoia Capital being incredibly active; we're
going to have Sequoia Capital on a little bit later.
But then more at the seed of the funding, the
Series A stages: the amount of money necessary for these
large language models means an Nvidia has to be a
player, or a Google has.

Speaker 5 (21:10):
And what I heard is that Jared Birchall, who's head
of Musk's family office, has been in the Middle East
courting the sovereigns. A lot of that

Speaker 4 (21:16):
Stuff going on.

Speaker 3 (21:27):
Welcome back to a special edition of Bloomberg Technology, right
here in San Francisco. An event is upon us
that has all to do with artificial intelligence, and we want
to continue that conversation right here, right now, at the
Bloomberg Tech Summit with Writer CEO May Habib, who joins us now,
who has been doing generative AI for the enterprise before
everyone else got with the program. Founded in

(21:48):
twenty twenty, you've got an enormous chunk of change from
Iconiq Capital and other key investors. How does it
feel with everyone trying to surge in on the enterprise
opportunity here? How are you standing out and making sure
that you keep the key clients, like Uber, that
you already have?

Speaker 7 (22:05):
Yeah, it's actually really exciting to see all of the investment,
right? We've been working on this, my cofounder and I,
for ten years, previously in a machine translation startup, and
so to see all of this attention is actually amazing.
But the way we stand out, I think, is with
a really differentiated platform that helps enterprises with the last mile,
which is ninety percent of the work in AI.

Speaker 5 (22:27):
May, you've been on the show a number of times
over the last two years or so, and each time
I always reflect on sort of the rate of change
for the industry, but also growth for your company. Clem
Delangue of Hugging Face just gave us some numbers about
the sort of size and scope of how they're doing;
he said close to profitable, or near profitable, or
something like that. But just tell us about your company

(22:49):
and how it's doing.

Speaker 7 (22:50):
Yeah, I mean, it's been an incredible rate of change.
When we started the company, we knew AI was going
to be better than people at reading and writing, and
that has certainly happened. We now say, if you can
write it, you can build it, because AI is not
just the technology, it's the way to build new technology.
But building AI apps is actually still quite difficult, and

(23:12):
so the rate of change of just what we've been
able to do, I mean, it's hundreds of enterprise customers,
hundreds of thousands of users, thousands of applications that are
in production. So a lot of this kind of question around,
like how you get applications from POC to scale. You know,
we've been doing that for years now and it's just

(23:32):
had a tremendous impact on the growth.

Speaker 3 (23:34):
Of the business.

Speaker 5 (23:35):
You have some relatively new work on models, right, so
tell us about the kind of the latest and greatest
on the tech side of your offerings.

Speaker 8 (23:43):
Yeah.

Speaker 7 (23:44):
So, over the past few months, we've introduced vision as
a capability into the platform. We've launched Palmyra in thirty
two languages, and it's really, really high quality, beating human
benchmarks, our customers tell us. And next up for us are
large reasoning models, so software that writes software, which we're
really excited about: being able to go from, you know,

(24:06):
work substitution to real work reinvention and orchestration using AI.

Speaker 3 (24:12):
At the very start, you said basically ninety percent of
the work isn't just getting the right large language
model in the door; it's implementing it. It's all
the other bells and whistles that go to ensure that
you get operational efficiencies, that you put it to your
own workflows. What are some of the best ways you're
seeing it being harnessed? What are some of the worst ways?
Because everyone's still waiting for this eureka moment where all

(24:33):
of our exuberance around AI actually makes a real difference.

Speaker 7 (24:36):
One hundred percent. There are fifteen hundred LLMs, right,
fifteen hundred large language models. Yeah, I mean...

Speaker 3 (24:41):
They can pass the MCAT and the LSAT.

Speaker 7 (24:43):
So if LLMs were the answer, everyone would have the
generative AI program of their dreams, right? But that's not
the case. There's so much work to get the data
and the context and the workflow from the business user
into the application, right? And that's what our platform does.
It's this collaborative engine that combines the LLM with all
of those building blocks, and that's where the magic is,

(25:05):
because the LLMs themselves need so much more context about
the business to be able to do what customers need
them to do.

Speaker 3 (25:12):
You said before that basically large language models are going
to be commoditized. The foundational models are going to be commoditized,
particularly from a consumer perspective. Where then does the value
ultimately end up lying? Because there are so many people
trying to fix problems using generative AI. A lot of
them are coming to you to try and be bought
or helped at the moment, I assume, because they're running
out of money themselves.

Speaker 7 (25:35):
Yeah, there's certainly a lot of air being sucked out of the room by big tech, right,
but there's still a ton of opportunity for startups. Microsoft
has to build for the lowest common denominator, right, so
individual productivity is very different than team productivity and team workflows.
So even though it feels like we're going to go
through sort of a big consolidation phase, I do think

(25:57):
there's still a ton of opportunity for startups. We
have made a small acquisition that we'll announce soon, and
I think we'll make others. So there certainly is a
real high barrier to entry to come in and serve
the enterprise. But still, there's so much blank, wide-open
space for startups to help enterprises compete.

Speaker 5 (26:17):
It's interesting, May, that you used the C-word, consolidation.
I don't think Clem Delangue went as far as using
the word consolidation, but I think, you know, you said
something a moment ago about big tech sucking the oxygen
out of the room. It goes to the open source,
closed source debate. I assume you sit on the open source side,

(26:37):
but just weigh in.

Speaker 3 (26:38):
So we're kind of in the middle.

Speaker 7 (26:40):
So our models are proprietary. A bunch are on Hugging
Face, earlier generations of models, but our latest models
are closed source.

Speaker 3 (26:48):
But by being in the middle?

Speaker 7 (26:50):
What enterprises really need is the ability to audit, right,
and have the transparency around training data and all sorts
of things related to the models. They don't really want,
like, the last-mile cumbersomeness of necessarily fine-tuning
or running the models themselves, is what we're finding. And
so, like, in the kind of sucking the

(27:13):
air out of the room: the confusion around which vendors
to turn to and how to actually get great applications shipped.
That's where I think there's still a lot
of confusion in the enterprise, and I think there's still
all that work to be done to minimize hallucinations.

Speaker 3 (27:28):
To ensure that we're seeing clarity on where the
underlying data is coming from and you're not having copyright issues.
Give us clarity on your business now. Have you been
approached to be bought? Are you remaining independent? Are you
raising more money?

Speaker 7 (27:44):
So there's a really long, I think, product journey
for us to really realize our vision, so I'm really
excited about remaining independent. It used to be, a year
ago, that I would say, you know, LLMs are for
the drudgery, the work you don't want to do. Today,
the capabilities are so incredible, they're as good as us.
But the future is work where you get to do
the work you want to do and LLMs do the rest,
right, because one person's drudgery is another person's creative passion,
and that's kind of a compelling vision for the future
of work, which we're not seeing enterprises come up with.
Yes, we talk
to hundreds of companies a week, and that really feels
missing right now: kind of executives painting a vision for
what AI looks like inside their companies in a way
that brings people along. So there's a lot to do
both in you know, kind of bringing our vision into
the world and helping companies achieve theirs.

Speaker 8 (28:37):
Right.

Speaker 5 (28:38):
Writer CEO May Habib, great to catch up here
at Bloomberg Tech in San Francisco.

Speaker 3 (28:43):
She flies every week from San Francisco and back.

Speaker 5 (28:47):
That's what we're hearing: the world of the CEO in
the world of AI, on a plane coming out here.

Speaker 4 (28:52):
We're going to be joined by Stephanie Zhan.

Speaker 5 (28:54):
Partner at Sequoia, for her take on investing in AI startups.
Stay with us, we'll be right back. This is Bloomberg Technology.

(29:16):
Welcome back to this special edition of Bloomberg Technology. We're
back together live in San Francisco a Bloomberg Tech, our
annual conference, and here at the Tech Summit, we've got
to talk about investing the first checks into those new
and early AI startups. We have a fantastic guest for
today's VC Spotlight, Stephanie Zhan, partner at Sequoia. You guys

(29:39):
are so busy, you are writing lots of checks, but
the new companies being founded in AI are not the
same as they were one year ago, and certainly not
eighteen months or.

Speaker 4 (29:52):
Two years ago.

Speaker 5 (29:53):
Just give us the sort of timeline of where we
are now in this industry.

Speaker 1 (29:59):
First of all, thank you so much for having me,
Ed and Caroline. It's an absolute joy to be here. We're
at a really interesting time in AI today, seven years
from the advent of the transformer, four years.

Speaker 3 (30:10):
Since the GPT three moment.

Speaker 1 (30:13):
I think twenty twenty four is going to be a
monumental year for AI.

Speaker 3 (30:17):
And here's why. I think this year is.

Speaker 1 (30:19):
Going to be a step function leap in digital intelligence,
everything from video to AI agents to robotics. I also
think that this year is going to be the year
we see a shift in the ecosystem to a thriving
ecosystem with many winners in the models area across closed source,
open source, large models, small models, and third, I also

(30:40):
think this is the year we start to see AI
commercialization at scale.

Speaker 3 (30:44):
And at Sequoia, we've been really busy.

Speaker 1 (30:46):
As you noted, we're highly selective about the companies that
we partner with, but this year, in just the first
four months alone, we've invested in ten new AI companies,
everything from new foundation models to new AI native applications.

Speaker 3 (31:00):
I love that you went through the history, like seven years
since the transformer model. I mean, it was just over
twenty years ago that Sequoia wrote the first check
into Nvidia, and now we think there's still that
company really owning the oxygen in the room.

Speaker 5 (31:13):
And the value's vast, right? They're writing checks of their
own as strategic investments.

Speaker 3 (31:18):
And I'm interested, therefore, exactly to Ed's point, how competitive
is it out there to get those first checks in?
Who are you seeing coming? Is it the corporates
that are wanting to write checks? Is it VCs wanting
to write checks?

Speaker 1 (31:30):
It's an incredible ecosystem right now, with everyone pouring money
into the AI ecosystem. I think it's very much reflective
of the opportunity that we see in AI, the large
market opportunity that is to come. I actually think that
we're still in the very early innings.

Speaker 3 (31:46):
Of all of this.

Speaker 1 (31:47):
Well, you know, it's the classic saying of we overestimate
in the short run, but we really underestimate in.

Speaker 3 (31:53):
The long run. Nvidia has done a wonderful.

Speaker 1 (31:56):
Job of being such a critical hold in the ecosystem
with hard we're driving compute, but also now with so
many software tools and the entire developer ecosystem they've built
around them. So I think that we're just in the
early innings and there's a lot more to come.

Speaker 3 (32:10):
We were just speaking with Clem from Hugging Face and May Habib,
who highlighted the fact that it's really expensive to do
this and Nvidia chips are a pretty penny. How are
you seeing the companies that you back able to sustain
the investment they need to make. How do you make
sure the checks you write you're going in the right
direction and not just sort of going into the pool

(32:30):
of training money.

Speaker 1 (32:32):
Yeah, well, I think that the classic conventional wisdom is
that incumbents with scale, data, capital and distribution have a
natural advantage, and that's absolutely correct. It also costs a
lot to build these models because of compute and for
AI talent, but I also think that there are so
many nimble ways for a startup to compete. Specifically, I

(32:53):
think the next leap is really around one high quality data,
specifically high quality labels of data and targeted domain specific data.

Speaker 3 (33:03):
Second, it's really about what you do with that data.

Speaker 1 (33:06):
Reinforcement learning with human feedback I think will really shine
in this next era.

Speaker 3 (33:11):
It's an idea.

Speaker 1 (33:12):
Derived from reinforcement learning, but here an agent actually also
learns on the fly with human feedback, and that's what's
so brilliant about ChatGPT, for example. And finally, I
think that you really differentiate not just on model performance,
which is where all the capital goes into, but it's
also around product distribution and the entire product experience that

(33:32):
you offer.

Speaker 3 (33:33):
To the end customer.

Speaker 5 (33:35):
You use the word incumbent. I think we should probably
talk about who those incumbents are, because the point that
May Habib made, and to a certain extent Clem
from Hugging Face, is that big tech, and here I think
we're talking about Alphabet and Microsoft in the first instance, are
sucking the oxygen out of the room from a capital perspective
and a talent perspective. You invest at the pre-seed and seed stage.

Speaker 4 (33:59):
Do you find that to be true?

Speaker 1 (34:01):
Well, I think that incumbents absolutely have an advantage, as
we just outlined, but I also think that new startups
have a shot. Scale AI actually recently released a
survey last week where they interviewed thousands of developers on
their most popular models, and the ones that actually came
to light were GPT four, GPT three point five, and
Gemini as the most popular models used. But we're also

(34:25):
starting to see new players come into play with models
that are just as competitive in performance. I'm really excited
about the open source model ecosystem enabling many more new
players to come into play. Llama three, for instance, is
so powerful. The new eight billion parameter model is a
longer trained, small model that I think will become a

(34:46):
really powerful building block for new developers to build new
applications on top of and to build new models around it.
It's going to drastically reduce the cost of what it
takes to build new experiences.

Speaker 5 (34:58):
We are increasingly talking about Meta and its competence in
building large language models. You speak highly of them. Where
do you see them? I think Zuckerberg said on
the call last week, we want to be the world's
leading AI company.

Speaker 4 (35:16):
Where are they in that journey?

Speaker 1 (35:19):
I think that they have an incredible advantage, and not
just because of the capital that they're willing to pour
into play, but also because of the entire treasure trove
of data that they hold, all this proprietary UGC content
that they can really use to train their models. One
of the things I'm really excited to see them enter
the scene with this year is a new generative video

(35:41):
foundation model, similar.

Speaker 3 (35:42):
To what we saw with Sora and OpenAI.

Speaker 1 (35:45):
To me, the most powerful thing that unlocked was that
the methodology we take for building large language models and
digital intelligence works for video as well. You take a
diffusion transformer model and you just scale it with enough
video data and compute, and Meta has a wonderful advantage
given the entire treasure trove of content they have.

Speaker 3 (36:06):
To compete in this space.

Speaker 1 (36:08):
And then what they're doing with Llama three I think
is game changing entirely. It opens the playing field for
everyone themselves new startups, lowering the cost for a thriving
ecosystem with many winners.

Speaker 3 (36:21):
Come back when you've got more checks you can announce
in that thriving ecosystem. Such a joy to be here
with you. Thank you so much for having me, Caroline.
That was Sequoia partner Stephanie Zhan. Welcome back to

(36:41):
this special edition of Bloomberg Technology, in the heart of San Francisco,
a big event upon our hands. And every year, in fact,
Bloomberg Businessweek releases in tandem its list of
tech Ones to Watch. These are the startup founders,
the big tech managers, the money investors as well,
who are playing a big role in shaping tech's future.

(37:02):
And joining us now is one of those Ones to Watch.
Pleased to welcome Udit Madan, Amazon vice president
for, of course, the worldwide operations side of the business,
in your first interview since taking on an enormous role,
more than a million people that you manage, the focus
on getting my package to me in the swiftest,

(37:23):
most cost efficient manner possible. Can I just ask
what your day looks like? What is a day in the life like?

Speaker 8 (37:29):
Well, first, Caroline, thank you for having me. It's
a pleasure to be here.

Speaker 5 (37:33):
Yeah.

Speaker 8 (37:34):
For me, really, my day starts, you know, fairly early
in the morning. But you know, it starts with thinking about,
you know, the team I've got.

Speaker 4 (37:41):
You know, we've got a very very broad team all
around the world.

Speaker 8 (37:45):
We've got four thousand different locations that we operate around
the world, and really it's focused on how do we
continue to innovate on behalf of customers that do it
in a way that puts safety.

Speaker 4 (37:56):
And people at the forefront.

Speaker 8 (37:58):
And so my day is really focused on innovation across
four different spectrums.

Speaker 4 (38:03):
Safety, really the.

Speaker 8 (38:05):
Customer experience with delivery speeds innovating, especially with what's happening
with technology finding new ways, you know, whether that's through
robotics our operations to make things more efficient and driving
you when you're.

Speaker 5 (38:17):
Talking about the technology, we're talking about everything from the fleets,
right so there's a transition to sustainable energy in the
fleet context, talking about robotics in the fulfillment centers, and
dare I say AI in tracking the data? What's the
biggest investment focus for you right now in technology rollout?

Speaker 8 (38:35):
You know, we've got technology all across our operations, and
there's really two things that I would maybe thematically talk about.
One is, we do have a lot of investments in
automation and robotics that are going on, especially with how
quickly things are accelerating with generative AI. We have investments
in really novel foundational models that look at and use the
high quality data that we've gathered and sourced as we

(38:57):
ship tens of millions of products every day, and those are
going to help make some of those robotic solutions more
generalizable as well as make them more efficient.

Speaker 4 (39:04):
And the second is we've.

Speaker 8 (39:05):
Been working on a set of really inventive robotic solutions
over the last few years that are finally reaching maturity
and scale and we'll start to roll out starting this year.
Both of those are really exciting, and they will be transformative
for operations.

Speaker 3 (39:17):
I mean, you've got to be inventive because Andy Jassy
is asking you to focus on costs, and I'm sure
the innovation, in a way, does longer term, once you've
made the investment, strip out some of the costs. But
ultimately, does that come at a sacrifice of labor? How
do you talk to those people, who are so key
to what you're focused on, to ensure that they feel
they are being augmented, not replaced?

Speaker 8 (39:36):
You know, the best thing I can talk about is
our history. We've deployed seven hundred and fifty thousand robots
over the last decade across our operations. We've done that
while creating hundreds of thousands of jobs. And you know
what's really interesting, and not a lot of people know,
is we've created dozens of new classes of jobs, new skilled jobs,
technical jobs. And what we've learned in that process is

(39:57):
that one of the most important things that you can
do as a company in this world of generative AI
and robotics is to really focus on investing in employees.
So we've launched two different programs. One is a twenty
twenty five upskilling pledge that really helps train people
for this new workplace of the future, and an AI

(40:17):
Ready program that's generally available to everybody that's really focused
on investing in and helping provide AI skills training.

Speaker 4 (40:24):
Over two million people.

Speaker 8 (40:25):
So really focusing on people alongside the investments we are
making in generative AI and robotics.

Speaker 4 (40:29):
Udit, we just have thirty seconds.

Speaker 5 (40:31):
What's your one personal goal for the year, something you
want to achieve.

Speaker 8 (40:34):
You know, for me, there's more than one, but I'll
try to answer quickly. The first and
the highest priority for us is safety, and we want
to be the safest workplace across the industries we operate
in, making measurable and really remarkable progress in that area.
I want the company to invest in that. And the
second is to continue to improve the convenience for customers,
and delivery speeds is an area of focus.

Speaker 3 (40:55):
Congratulations on being one of the key Ones to Watch.
Phenomenal, the amount of people you manage at the young
age that you are. We thank you, Udit Madan, Amazon
Vice President of Worldwide Operations. Meanwhile, from Ones to Watch
individuals to everything you've got to watch coming up, because
this is going to be an amazing set of conversations.
I'm going to be speaking with a key chip leader.

(41:17):
Of course, you're going to be speaking about the future
of technology. Who have you got lined up?

Speaker 5 (41:20):
Yeah, I'm going to talk to Tom Oxley of Synchron.
I'm going to talk about brain implants and what the
right method of putting an electrode into one's brain is.

Speaker 3 (41:29):
I always love her perspective. Renee James of Ampere is
joining me. Look, this is the question, having just
spoken with the other Rene, Rene Haas of Arm:
where is the market share being taken by these newer players,
taken from AMD, from Intel, even potentially Nvidia?

Speaker 5 (41:46):
Thank you so much for joining us on this special
edition of Bloomberg Technology. It's great to be back together
in the field, and we actually have a full day ahead,
so many great guests. Stay with us. Thank you for
tuning in from San Francisco at Bloomberg Tech. This
is Bloomberg.