
September 30, 2025 38 mins

Bloomberg’s Caroline Hyde and Ed Ludlow discuss CoreWeave’s deal to supply Meta with up to $14.2 billion worth of computing power. Plus, Spotify shares sink on news that founder Daniel Ek will transition from CEO to chairman. And Anthropic Chief Product Officer Mike Krieger explains why the company is focusing on enterprise clients with its new model that can code for 30 hours straight.

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Bloomberg Audio Studios, podcasts, radio, news. Bloomberg Tech is live from coast to coast with Caroline Hyde in New York and Ed Ludlow in San Francisco.

Speaker 2 (00:22):
This is Bloomberg Tech coming up.

Speaker 3 (00:24):
CoreWeave

Speaker 4 (00:24):
has signed a deal to supply Meta with as much as fourteen point two billion dollars worth of computing power.

Speaker 5 (00:31):
Plus, a change of leadership at Spotify as CEO Daniel Ek steps aside after almost two decades.

Speaker 4 (00:36):
And Anthropic releases a new AI model designed to code longer and more effectively than prior versions. We speak with Anthropic's chief product officer. But

Speaker 5 (00:45):
first, Ed, we are both here in New York this week and checking out the markets that are just taking a pause amid some macro focus. Of course, the potential shutdown of the US government, that's on every investor's mind. What does that mean in terms of data? What does it mean in terms of the future of rate policy at the Fed? We're currently completely flat, just pinching into the green on the Nasdaq one hundred, but it's way more interesting underneath the hood, and you're looking at that.

Speaker 4 (01:07):
Yeah, our top story is CoreWeave and Meta. It's the latest CoreWeave capacity deal, fourteen point two billion dollars. The stock has been on a tear since its listing in March, right now trading at its highest level in around six weeks, clearly that big gain following the Bloomberg report. The market's kind of focused on this idea that it is evidence of CoreWeave's move beyond one single customer, which is Microsoft.

(01:28):
But actually, as Brody Ford often says, Caro, the devil's in the detail on this one.

Speaker 5 (01:33):
Let's get that detail, Dy forward with us. You help
break the story. Core weave has tripled since it's IPO.
We're up another fifteen percent. What does it mean to
be adding Meta to the fold as well as open
Ai more recently of course in Microsoft O.

Speaker 6 (01:48):
Well, it means that CoreWeave isn't just a colonial state of Microsoft.

Speaker 7 (01:52):
Right.

Speaker 6 (01:52):
That was the concern for so long with these neoclouds, that, gosh, if their biggest customers are companies that also effectively compete with them, that seems like a pretty indefensible business. But you start getting companies like Meta and OpenAI, who are likely to be longer-term buyers of this technology, and so I think folks are getting more comfort that CoreWeave and companies like it may have

(02:16):
a more sustained place in this AI infrastructure build out.

Speaker 4 (02:19):
Fourteen point two billion dollars is a very large number, Brodie,
But the terms of the contract extent of twenty thirty two.
You know what I'm going to say, because we discussed
it this morning. There is skepticism here because there are
many unknowns. For example, does call We've actually have a
data center somewhere that's built and operating that they can
assign Meta's workloads too.

Speaker 2 (02:40):
Well.

Speaker 6 (02:40):
What the CEO told me is that they never sign a deal if they don't have the right power and data center allocation ready to go. I don't think there's a data center that they're just going to flip on and say, all right, Meta, here you go. But you know, this contract goes till twenty thirty-one. They're probably going to need to buy a bunch of chips, fill it up, and hand it over to Meta. And what that

(03:01):
also means is they're going to take out a lot more debt. I mean, CoreWeave is kind of famous at this point for levering itself up to a pretty incredible degree, and we'll probably see a little more of that.

Speaker 4 (03:11):
The role of debt in financing AI infrastructure. That's becoming
a story of the year twenty twenty five, Blue Most
Brady Ford with the core we've metastory.

Speaker 2 (03:18):
Thank you.

Speaker 4 (03:19):
Meanwhile, AI's boom is sending investors searching for trades in an overlooked group: suppliers of the gear used to make chips. Let's bring in Bloomberg tech equities reporter Ryan Vlastelica. Lots of readership on this story and some names that have not been at the forefront of what's happened within the semi space.

Speaker 2 (03:38):
Give us your reporting.

Speaker 8 (03:40):
Hey, good morning. Thank you for having me. So, yeah, this is a trend that we have been seeing throughout twenty twenty-five. Now that some of the more well-known AI trades, it seems like maybe those are getting a little bit mature, investors are looking elsewhere. We've seen it in storage companies, we've seen it in memory companies. So now we're seeing it in the semiconductor capital equipment companies, names like Lam Research, Applied Materials, KLA Corp, Teradyne. These

(04:04):
kinds of companies, which make the equipment that is used for building the semiconductors, for building all this manufacturing out, are getting sort of a second-derivative move. Given the sheer build-out we're seeing, all the chips that are required for AI, obviously it's going to mean higher demand for the machines that make those chips.

Speaker 5 (04:22):
We constantly question valuations, Ryan. Should we be worried about them for these names?

Speaker 8 (04:28):
Well, I'd say if we were talking about this maybe a month or so ago, it would be a lot less of a pertinent issue. But some of these companies have really seen very steep rises. They've really become momentum favorites over the past couple of weeks and months, and I think right now they are getting to a little bit more elevated levels. But certainly they are not at sort of a nosebleed, real sort of bubble kind of valuation, but they certainly have come up a lot from where they

(04:48):
were earlier this year.

Speaker 4 (04:49):
Ryan, if you walk into a fab, the clean room where semiconductors are manufactured, along the line are machines from lots of different companies. At the beginning of that line, you're probably likely to have one from Lam Research.

Speaker 2 (05:01):
The chart that we took from your

Speaker 4 (05:02):
story shows Lam being a real outperformer here.

Speaker 2 (05:05):
Is there anything more to know about that name?

Speaker 8 (05:08):
I think they are one that especially used with Micron,
so the memory chips that has been another area of
focus this year, so that high bandwidth memory. I believe
that LAMB works on machines that are used in those
I think that's probably certainly a component to their business there.

Speaker 5 (05:21):
Ryan, it's so great to catch up with you. Ryan Vlastelica,

Speaker 9 (05:24):
We appreciate it.

Speaker 5 (05:25):
Meanwhile, sticking with chips, Taiwan's premier says that trade talks with the United States have entered, quote, the crucial closing stages, indicating the global chip hub is finally nearing a deal with the Trump administration.

Speaker 10 (05:37):
Now.

Speaker 9 (05:37):
Investment in the US was

Speaker 5 (05:38):
among the issues discussed in Washington in recent days, according to a source, as well as lowering the twenty percent tariff imposed on the island, Ed.

Speaker 2 (05:46):
Yeah. Some other news we're tracking.

Speaker 4 (05:48):
Google's agreed to pay twenty four point five million dollars to resolve Donald Trump's claims that his being blocked from posting on his YouTube channel after the January sixth, twenty twenty-one riot at the US Capitol amounted to illegal censorship. That's according to a court filing, which also shows twenty-two million dollars will go toward construction of a new ballroom at the White House, a project near and dear

(06:10):
to Trump, while the remainder will go to a handful of other plaintiffs who joined him in legal action. Okay, coming up, Anthropic has a new AI model that codes on its own for up to thirty hours straight. No need to feed it Soylent, no Red Bull. Anthropic's chief product officer talks to us about Claude Sonnet four point five. That conversation is

Speaker 2 (06:31):
next. This is Bloomberg Tech.

Speaker 4 (06:46):
DeepSeek has updated its experimental AI model, calling it a step toward next-gen AI. The latest version introduces a new technique it calls DeepSeek Sparse Attention, or DSA, a mechanism designed to explore and optimize AI training and operation and improve efficiency when processing long text sequences.
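For a concrete sense of what a sparse-attention mechanism does, here is a minimal, hypothetical Python sketch of top-k sparse attention. It is not DeepSeek's DSA, whose internals the company describes only at a high level; it simply illustrates the general idea of each query attending to a small subset of keys instead of the whole sequence, which is what makes long text sequences cheaper to process.

```python
# Minimal sketch of top-k sparse attention (illustrative only; not DeepSeek's DSA).
# Each query keeps only its top_k highest-scoring keys and ignores the rest,
# which is one common way to cut the cost of attention over long sequences.
import numpy as np

def topk_sparse_attention(q, k, v, top_k=8):
    """q, k, v: arrays of shape (seq_len, d). Returns an array of shape (seq_len, d)."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                      # (seq_len, seq_len) attention logits
    # Threshold at each query's top_k-th largest score; mask everything below it.
    kth = np.partition(scores, -top_k, axis=-1)[:, -top_k][:, None]
    masked = np.where(scores >= kth, scores, -np.inf)
    weights = np.exp(masked - masked.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over the surviving scores
    return weights @ v                                  # weighted sum of values

# Toy usage: a 512-token sequence with 64-dimensional heads.
rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((512, 64)) for _ in range(3))
print(topk_sparse_attention(q, k, v, top_k=8).shape)   # (512, 64)
```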

Speaker 5 (07:07):
Right, let's stick on the models. AI startup Anthropic is out with a new one, Claude Sonnet four point five. The company says it can code longer and more effectively than prior versions. Let's get more with Anthropic's chief product officer, Mike Krieger.

Speaker 9 (07:20):
Mike, it's wonderful.

Speaker 2 (07:22):
to have you on.

Speaker 5 (07:23):
Thirty hours straight is how long it can code on its own. What are the technical feats needed to be able to go that long, where humans, well, definitely could not survive that unless there's a whole load of caffeine involved?

Speaker 11 (07:36):
Good morning.

Speaker 12 (07:36):
I think one of the main advancements we made was
around memory and what we call context management.

Speaker 11 (07:41):
So if you think.

Speaker 12 (07:41):
About how a human works for longer periods of time: you're writing things down, you're making sure you can always pick up where you left off if you're coming back the next day. So with Claude Sonnet four point five, we did a lot of work on that memory management. So the model, if you think about it, sort of writes down what it's doing, keeps track of its state, and then, if it needs to sort of backtrack, it's able to then keep going. And that's how it's able to stay coherent for a much longer period of time

(08:03):
than any other of our models.
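Krieger describes this only at a conceptual level. As a rough illustration of the general pattern he is pointing at, an agent that writes its progress to durable storage so a long-running session can be resumed, here is a minimal, hypothetical Python sketch; the file name, task list, and the run_step stand-in are invented for illustration and are not Anthropic's implementation.

```python
# Illustrative sketch of "context management": an agent loop that persists its own
# notes and task state to disk so it can pick up where it left off after any restart.
# Generic pattern only, not Anthropic's code; run_step is a stand-in for the real work
# (for example, a model call that edits code).
import json
from pathlib import Path

STATE_FILE = Path("agent_state.json")  # hypothetical scratchpad location

def load_state():
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text())
    return {"completed": [], "notes": [],
            "remaining": ["write tests", "implement feature", "refactor"]}

def save_state(state):
    STATE_FILE.write_text(json.dumps(state, indent=2))

def run_step(task, notes):
    """Stand-in for the real work; returns a short note about what was done."""
    return f"finished '{task}' using {len(notes)} prior notes"

def agent_loop(max_steps=100):
    state = load_state()                       # resume from wherever we left off
    for _ in range(max_steps):
        if not state["remaining"]:
            break
        task = state["remaining"].pop(0)
        note = run_step(task, state["notes"])  # do one unit of work
        state["notes"].append(note)            # "write down what it's doing"
        state["completed"].append(task)
        save_state(state)                      # checkpoint after every step

if __name__ == "__main__":
    agent_loop()
    print(json.loads(STATE_FILE.read_text())["completed"])
```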

Speaker 5 (08:05):
How much have you managed to lower, therefore, those inaccuracies or, more broadly, the sense that they're making things up? Hallucination has always been the key issue. Hasn't it been something that's limited agentic AI adoption?

Speaker 12 (08:19):
Yeah, this is the model that we have that, besides being our most powerful, is also our safest and most coherent, so it has the lowest hallucination rate and is the least susceptible to things like jailbreaks.

Speaker 11 (08:29):
And I think that matters a lot. I think I
tell my product.

Speaker 12 (08:31):
Team all the time: it's no use going for twenty, thirty hours if you're making mistakes along the way. And so having it be both accurate, producing good code, that's the prerequisite, and then you can focus on scaling up on the time horizon.

Speaker 4 (08:42):
Mike, can we talk a bit about the audience for Claude Sonnet four point five? The real emphasis from Anthropic from the early days was enterprise customers, as opposed to a sort of direct consumer. But this field of tools for the developer is expanding. It's probably more competitive. Who are you hoping uses this?

Speaker 12 (09:01):
It's really, we've taken a business focus, but that also manifests kind of in the prosumer space, and so we have a lot of what we call power users, who might be developers or might just be early adopters who want to bring AI to their work. So one of the things that Sonnet four point five can do, along with writing code, is also creating really professional-looking Word and PowerPoint and Excel documents. It actually uses the same

(09:23):
coding capabilities under the hood, but not to write code,
but instead to produce documents. And that sort of capability
means that we're starting to see adoption in the enterprise
as well.

Speaker 4 (09:32):
Mike, you and I have discussed this in the past. We know Mike Krieger as Mike Krieger, CTO of Instagram. And what I'm seeing right now in the field of your peers are the reports on OpenAI and what Meta is doing in social media and video. Is that a direction you want the product team to take Claude in?

Speaker 12 (09:51):
We're focused much more on the productivity use case, and so when I think about our roadmap, it's very much: how do we take work off people's hands, or how do we accelerate folks and make, you know, the work the best that it can be? How do we automate your work in the browser? So much more on that productivity side of things, and I don't think you'll see us play very much in that entertainment space.

Speaker 5 (10:10):
Mike, the productivity perspective, it's come under some concern recently. Think about the MIT report everyone's suddenly talking about: well, ninety-five percent of the tests were basically failing out there in the wild. How are you making sure that enterprises adopt your products and actually see the productivity gains from them?

Speaker 12 (10:30):
I think this is really important, where, if, you know, AI gets brought into the workplace without either the right tools around it or enablement, what you end up with is this disillusionment a couple of months later around why folks aren't adopting it, or, yeah, it helped me a little bit, but not enough. And so we have a lot of emphasis on, let's

Speaker 11 (10:45):
Make sure the work is actually good.

Speaker 12 (10:46):
You know, you might hear this word online, like slop, where AI is creating work that actually just is not very good. And I think of us, we're trying to produce the anti-slop work that actually, you know, maybe gets you eighty percent of the way there, but it's eighty percent that then lets you complete the work in a way that you're proud of, rather than, you know, oops, it did something, but now I feel like I have to start over because it didn't really help. So I think that's the really key piece for enterprise adoption.

Speaker 5 (11:07):
And therefore, does it remove the need for so many people? Or ultimately, there's still this argument it can augment, but will it start to replace, Mike?

Speaker 12 (11:16):
We think a lot about what, you know, the comparative advantages are of people, you know, as it relates to AI. There's still a lot of relationship building and trust, critical analysis and strategy that really comes on the human side of things, and so we really try to design tools that, as much as possible, play up those parts of that human-AI interaction, knowing that, you know, there will

(11:38):
be, you know, labor shifts that are almost inevitable, but if we can design our products along the way to maximize both people's understanding of AI but also their use in a complementary way.

Speaker 4 (11:47):
Mike, OpenAI is holding DevDay next Monday. It's probably on your calendar for peripheral awareness. But I'm very conscious that you're kind of speaking to us twenty-four hours after the news of Claude Sonnet four point five came out. Have you any data on the sort of reaction to it, early demand, and where that's coming from?

Speaker 12 (12:08):
It's been really interesting how quick people are to adopt a new model. So by, I think, about one p.m. yesterday, we already had more usage of Claude Sonnet four point five than all of our other models combined, which really speaks to the eagerness of a lot of these startups that are building on top of our models, as well as early adopters. On day one you saw GitHub and Cursor and Windsurf and many of these products

(12:29):
that build on top of our models want to incorporate Claude Sonnet four point five, and so we had this really early crossover moment as well.

Speaker 2 (12:35):
That is interesting data.

Speaker 4 (12:38):
Correct me if I'm wrong, but this model is running on Project Rainier, right? Is that, sort of operationally and infrastructure-wise, where the training and now inference of it is being done?

Speaker 12 (12:50):
So we do both our training and inference across, you know, we have partnerships with Google and Amazon, but we have a significant part of this model being served now from Amazon as well, and we're seeing a lot of growth on AWS Bedrock as well.

Speaker 5 (13:05):
Just going to that infrastructure layer: you're obviously the product visionary here, but you need to have the energy, the compute, to bring your products to life. Many worry that we're in some sort of bubble cycle around AI. How do you think about that as you drive this business forward, Mike?

Speaker 12 (13:19):
I think there's this combined need to both scale up for the training side, but also on inference. And as we've scaled, especially with our business arrangements and the companies building on top of Sonnet, I think we now have a sort of forward-looking perspective on what our inference needs will be, and I think that'll let us go out and also secure the kinds of compute deals that we need to both fuel the training but

(13:39):
also have that sort of revenue generating inference side as well.

Speaker 5 (13:43):
And we get so focused on the compute needs of the United States, but we've been talking a lot about that in Europe, how it's scaling in the UK. From your adoption, how are you seeing things differ globally, Mike? Because of those that are actually deploying, what's happening with Sonnet four point five and the latest models?

Speaker 11 (13:59):
We see this a lot in terms of our global footprint.

Speaker 12 (14:01):
There's something we started expanding earlier this year, and so for our rollout of Trainium 2, which is the chip that Amazon has built for AWS that we use pretty extensively for our Claude models, a lot of that deployment is actually international. And when I go to Europe, for example, I hear a lot of questions about data locality and making sure that inference is happening in local data centers.

Speaker 11 (14:20):
And the only way we're going to be.

Speaker 12 (14:21):
Able to do that is to have that international footprint
of these chips.

Speaker 11 (14:24):
And so you've seen the same in APAC.

Speaker 2 (14:26):
Mike.

Speaker 4 (14:28):
We are going to ask you a question about talent wars, but I'm just going to make an appeal to you to just be honest with me on this: how big a factor is it, or isn't it, for you right now, in the product team at Anthropic in particular? I'm looking at the pace at which OpenAI is putting stuff out, Meta is putting stuff out. Just through your experience, what's

(14:49):
the talent situation right now?

Speaker 12 (14:51):
I'm seeing much more of that talent, sort of, you know, back and forth happen within the research side in general, a little bit less on the product side. I think there are some key hires where that's been the case. One thing that's been a positive sort of maybe surprise, or just an outcome of how mission-oriented a lot of Anthropic, a lot of the Anthropic team, really is, has been that it's affected us very minimally in terms of that back and

(15:13):
forth that you're seeing maybe among some of the other
frontier labs, which

Speaker 11 (15:17):
Is very encouraging.

Speaker 12 (15:17):
Of course, we have to continue to make sure we
build a great culture and maintain that mission alignment.

Speaker 11 (15:21):
But so far, it's been minimally affecting us.

Speaker 4 (15:24):
If we take Sonnet four point five as the case study, what were the types of roles that you needed to bring in to roll out the release?

Speaker 12 (15:32):
I think people think of, you know, research and model science as being fairly cut and dry. I actually think that there's a lot of art and taste to it as well. You are making a lot of decisions from a research engineering perspective around: what are the tasks the model needs to improve on, how will it improve on that, how will we know that it's improving on that? And so a lot of that reinforcement learning post-training piece

(15:53):
is the key shape of what we really thought about

Speaker 11 (15:55):
in Sonnet four point five.

Speaker 4 (15:57):
Mike Krieger, Anthropic chief product officer. It was a real deep dive into Sonnet four point five. Thank you so much for joining us here on Bloomberg Tech. Coming up on the show, DoorDash has a new autonomous robot.

Speaker 2 (16:08):
Meet Dot, Hi Dot.

Speaker 4 (16:11):
We'll talk to the VP of DoorDash Labs about what this robot delivers

Speaker 2 (16:16):
in terms of its capabilities. That's next. This is Bloomberg Tech.

Speaker 4 (16:46):
Delivery company DoorDash is unveiling a new autonomous delivery robot called Dot. The company says Dot is the first commercial delivery bot to seamlessly navigate bike lanes, roads, and sidewalks. With us now is Ashu Rege, vice president of DoorDash Labs. His robotics experience includes work with Zoox and Nvidia, but

(17:08):
the real-terms parameters for deployment: which cities, how many deliveries, when?

Speaker 3 (17:17):
So, we have been focusing on the greater Phoenix area and that's our focus for this year. We are hoping to address about one point five million customers by the end of this year, and then, you know, as we progress, we'll see what happens. Dot likes to travel though, so hopefully we can expand to more cities.

Speaker 4 (17:36):
Team, let's bring up some pictures of Dot again. I mean, to me, Ashu, this looks, kind of the design, a bit like a stroller, or a pram as we might have said in the United Kingdom. Could you just talk us through the design of it and how this is what you arrived at?

Speaker 11 (17:55):
Sure.

Speaker 3 (17:55):
So there were three sort of key pillars for us when we looked into the design of Dot. The first one was product-market fit, so we wanted to make sure that Dot would have the right cargo capacity, and the payload that it can carry, up to thirty pounds, fits a vast amount of the DoorDash deliveries that

(18:15):
we do today. The other big part was it had
to be going at speeds that allowed us to do
a big chunk of deliveries in the three to mile range,
and that's why the design point was it needs to
be able to go on bike lanes, on roads, on sidewalks.
But the other key part is the pickup and drop

(18:36):
off for deliveries, which makes it very different from ride-hail.
And one of the key feedbacks we had gotten from
merchants was it is absolutely imperative to have the robot
come as close as possible to the merchant, right next
to the doorstep, which is what Dot does, and so
that was a key design feature, was that it could
go on sidewalks and be sort of narrow enough to

(18:59):
navigate sidewalks.

Speaker 5 (19:00):
And pray tell, Ashu, what currently isn't on the market? Because you're already using Coco Robotics, backed by Sam Altman. There are others on the market. Is it just the speed, the pace, that they can't meet?

Speaker 3 (19:11):
Yeah. So we actually also announced a key product called the Autonomous Delivery Platform, in which all our partners, including Coco and our drone partners and other robots including Dot, can participate. And the idea is that, as the demand for delivery is growing, we have

(19:31):
an all-of-the-above approach, which is there'll be Dashers, there'll be drones, there'll be Dot, and so we are looking to work with all of our partners to enable these different modalities. So we believe the future will be multi-modal for our delivery. And this is the first step towards that.

Speaker 5 (19:48):
I cycle every day, I navigate Dashers. How many Dashers will I be navigating in the future? Is this going to replace couriers in the long term? How much of a percentage do you want delivered by Dot?

Speaker 3 (20:00):
Well, we don't look at this as a percentage. We look at it as just a growing pie. So if you think about just the last few years at DoorDash, the demand for delivery has been growing significantly year over year. So we think of this, as I said before, it's just going to be multi-modal. There are going to be many, many different forms of delivery. And you look at the range of deliveries we do. If

(20:23):
you just look back to five years ago, we were doing mostly restaurants. Now DoorDash does groceries, it does household items like, you know, toothpaste or diapers, even home electronics.

Speaker 11 (20:33):
You can buy a.

Speaker 3 (20:34):
complete laptop on DoorDash and have it delivered to your doorstep. So we expect all kinds of deliveries to be happening on our platform. And this ADP that we announced yesterday is the first step towards that.

Speaker 4 (20:46):
Ashu, who is going to manufacture Dot, and where will they manufacture Dot?

Speaker 3 (20:52):
So at this point we are, you know, going through that process of figuring out all of the pieces of where the manufacturing actually is going to be and where, you know, all the components, et cetera.

Speaker 2 (21:02):
So it's not decided, it's still.

Speaker 3 (21:05):
Being discussed internally as to how we will end up doing.

Speaker 5 (21:08):
Is the aim to have it largely US-made?

Speaker 3 (21:11):
So we, you know, a big part of the technology that we have developed here is all built here in the US, in DoorDash Labs, so it's pretty much a purpose-built, homegrown robot, you know, and it uses state-of-the-art technology, again developed by the engineers at DoorDash. It is a fully L4 autonomous system, so it, you know,

(21:33):
as you can see, navigates through all of the various situations it can encounter, whether it is pedestrians or cars, and again going up on sidewalks, going up to your doorsteps, lots of pedestrians and what are called VRUs, vulnerable road users, which means kids, pets, and so on. So all of that technology was developed in DoorDash Labs.

Speaker 5 (21:54):
So, motoring along at twenty miles an hour next to me in the cycle lane. Ashu, thanks so much for joining us. Ashu Rege, vice president of DoorDash Labs at DoorDash. And coming up, AI chip startup Cerebras Systems, well, it's closed a new funding round at an over eight billion dollar valuation. We're going to talk to the CEO Andrew Feldman, because remember, this

(22:15):
is a company that actually wanted to IPO. Does that delay the inevitable? We'll talk the ambitions next. And what are you looking at, Ed?

Speaker 4 (22:21):
The private markets conversation aside, in equity markets and technology, there is some stuff going on. I mean, the broad story is that we are basically flat on the Nasdaq one hundred. Traders are in wait-and-see mode.

Speaker 2 (22:32):
Will we get some economic data? Will we not?

Speaker 4 (22:34):
But we know the big story out there on the single-mover side is CoreWeave, up thirteen percent, highest level in six weeks, on a fourteen point two billion dollar compute deal with Meta, which is down one and a half percent. So much more to come on the show, stay with us. This is Bloomberg Tech.

Speaker 5 (23:00):
Welcome back to Bloomberg Tech. Let's check in on these markets, because on the surface, some calm. We're currently off by just a quarter percent on the Nasdaq one hundred, anxiety about a potential government shutdown here in the US. What does that mean for key data on the jobs market, particularly on Friday? What does it mean for the Federal Reserve? We stand pat in terms of big tech, but underneath the hood, let's look at individual movers, because there are some big deals being done.

Speaker 9 (23:21):
Once again in the spectrum space.

Speaker 5 (23:23):
We are having the best month for EchoStar on record, up more than twenty percent, because it keeps on selling spectrum. This time we understand it's likely to be selling to Verizon. There's talk of a plan at the moment for AWS-3 licenses. Already they've been selling to AT&T and SpaceX. Let's have a little look at what's also on the move. Big deals being done in AI compute. We're going to delve into that throughout this section. We're

(23:44):
up twelve, thirteen percent on CoreWeave, a fourteen billion dollar deal with Meta. Yet more GPU access, more compute, but we've got plenty more when it comes to the world of AI infrastructure.

Speaker 11 (23:54):
Ed.

Speaker 4 (23:54):
Yeah, a lot more happening in technology in private markets. AI chip-making startup Cerebras Systems has raised a one point one billion dollar funding round. Cerebras, which aims to rival Nvidia, now has a post-money valuation of eight point one billion dollars. The company's CEO, Andrew Feldman, joins us for more. Andrew, good morning, welcome to Bloomberg Tech. You know, my assessment with Cerebras is that you

(24:18):
claim that the technology is as competitive or better than what Nvidia's systems offer, and you are aggressively building out that infrastructure, and you are doing deals with end customers. In the context of this funding round, what's your priority here?

Speaker 10 (24:35):
Sure, I think it's not just what we think. Every third-party benchmark has shown that we are on the order of twenty times faster than Nvidia GPUs for inference work. So that's for the using of AI. And so this is the largest and sort of fastest growing part of

(24:57):
the market. So we wanted to fuel our continued extraordinary growth.
And so we're going to use this money to double our manufacturing capacity. We manufacture in the US, and we're going to double in the US; to extend our footprint, more data centers so we can support more customers, and

(25:20):
those are in the US as well, okay; and to continue to invest behind our pioneering technology, which has sort of solved problems that were open for seventy-five years in the compute industry.

Speaker 4 (25:34):
Andrew, to what extent are you supply-constrained? In other words, you have all of these customers. Are you having to choose who gets first dibs when some capacity comes online?

Speaker 10 (25:45):
Right now, one of the most challenging components is to get access to data centers, and those are some of the massive contracts you discussed in your previous segment with CoreWeave that they're providing to the likes of Meta. Meta is also one of our customers. But the ability to

(26:09):
stand up and deliver data centers filled with our gear
so that our customers can enjoy the benefit of the
fastest inference through the cloud is one of the limiting factors.

Speaker 2 (26:19):
Right now.

Speaker 5 (26:20):
You've got, what, five new data centers just in the course of twenty twenty-five, looking at Dallas, Oklahoma City, looking at Santa Clara. What's really interesting, Andrew, is where are you building? Are you literally going out there building? Or are you just trying to win future relationships with a CoreWeave so that they take your compute rather than Nvidia's?

Speaker 10 (26:40):
Well, I think we are renters of data centers. We
are not builders of data centers, and so we partner
with those who own the real estate and those who
stand up the facility. We rent the facility and then fill it with our infrastructure. That infrastructure

(27:01):
is then used by customers around the world, either by the day, by the month, or by the token, to get their fast AI, and so I think that's the way we think about it.

Speaker 5 (27:17):
Andrew, we think about your customers, and you're mentioning the likes of Meta. I'm interested also in where the money has been coming from for this round, because I noticed that one of your key backers of the past isn't on the list, it seems: G42, a key investor in previous rounds, and of course Middle East-based relationships, potentially also with China, and that's been an issue for

(27:38):
CFIUS in particular if you wanted to IPO. Have you decided not to take money from them this time?

Speaker 10 (27:43):
No, that wasn't the situation at all. I think it is very common, in your dash to get to being public, to raise a late-stage round from public market investors. The round was led by Fidelity and

(28:07):
Atreides. It included Tiger Global, Valor, seventeen eighty-nine, it included Alpha Wave, all of whom have large and predominant public market practices, and so I think this was
a round that was aimed at a different class of investor.
We are continuing to collaborate with G forty two. They

(28:30):
are our strategic partner. We are building enormous clusters in
the US for them. We are training models, we are serving models, including one that they built in partnership with their leading university in the UAE, MBZUAI, so that
partnership remains rock solid.

Speaker 2 (28:52):
Andrew quickly.

Speaker 4 (28:53):
Then the obvious question is, does your ongoing association with G42 preclude you from pushing forward with an investment, based on the history of the CFIUS review?

Speaker 10 (29:03):
Well, not at all, not at all. I think each investor has their own appetite by stage, and what we found in this round was that the lead investors had a very large appetite, and so we were able to meet that appetite. We cleared CFIUS in March, and

(29:25):
so there is nothing blocking the IPO.

Speaker 5 (29:30):
So when, Andrew, when? You cleared in mid-March; when will you IPO?

Speaker 9 (29:36):
Dare we ask?

Speaker 10 (29:39):
Yeah, that's the big question. I appreciate you asking, and as you also know, I'm unable to tell you with an S-1 on file, but I appreciate it.

Speaker 5 (29:47):
We keep the dance going. Andrew Feldman, it's so good to have you, CEO of Cerebras Systems. We appreciate it.

Speaker 1 (29:51):
Lott.

Speaker 5 (29:51):
We're going to stick with AI infrastructure now, because spending has been a dominant theme in equities ever since ChatGPT turned everyone's attention to the technology, and of course its compute needs. But the proliferation of data centers across the United States is putting pressure on energy supply and sending wholesale electricity costs soaring for customers paying their bills. There's more on this with the Big Take. Let's bring in

(30:13):
Bloomberg power reporter Josh Saul and Bloomberg data reporter Leo Nicoletti. And I start with you, Josh. You go down to the individuals this is affecting; you particularly go to Baltimore. Can you tell us a little bit about how you found these individuals and what the effect on them is of, well, basically all the data centers being built pretty close to

Speaker 11 (30:34):
Them, right.

Speaker 7 (30:35):
So the headline here is that data centers and their
massive power demands are driving up power bills for everyone.
The people that we spoke to in Baltimore are some
we found because they had testified in a city council
meeting about their bills. Some we found because they were
eating eggs next to us at a diner. We talked
to a lot of people and everyone's really upset and
having a hard time paying their power bills because of

(30:58):
this effect.

Speaker 2 (30:59):
Leo.

Speaker 4 (31:00):
This is the latest in a series really of pieces
of data journalism from this newsroom about the impact of
the buildout in data centers. Just explain methodology and the
data set that we arrived at that gave us that headline.

Speaker 13 (31:14):
Yeah, so, you know, essentially, we dove into very granular data on wholesale power prices, which is an important component of what then makes up power bills, and we looked at tens of thousands of locations throughout the country, and we kind of related it to the location of data centers. And what we found, essentially, is that areas

(31:37):
located closer to data center activity are much more likely
to experience price increases than areas located far from data centers.
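As a rough illustration of that kind of analysis, and emphatically not the Bloomberg team's actual code or data, here is a minimal Python sketch using pandas: it computes each price node's distance to the nearest data center and compares price changes for nodes near versus far from one. The tiny synthetic data, column names, and proximity cutoff are all invented.

```python
# Illustrative sketch only: relate wholesale power price changes to data center proximity.
# The synthetic rows, column names, and 100 km cutoff are invented for illustration;
# the reported analysis used granular wholesale prices for tens of thousands of locations.
import numpy as np
import pandas as pd

# Hypothetical price nodes: location plus average wholesale price in 2020 and 2025.
nodes = pd.DataFrame({
    "node_id": ["baltimore_a", "baltimore_b", "rural_midwest"],
    "lat": [39.29, 39.31, 41.50], "lon": [-76.61, -76.60, -93.50],
    "price_2020": [30.0, 28.0, 25.0], "price_2025": [75.0, 62.0, 29.0],
})
# Hypothetical data center sites (stand-in for something like Northern Virginia's Data Center Alley).
centers = pd.DataFrame({"lat": [39.04], "lon": [-77.49]})

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = np.sin((lat2 - lat1) / 2) ** 2 + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))

# Distance from each node to its nearest data center, then compare price changes near vs. far.
nodes["nearest_dc_km"] = [
    haversine_km(r.lat, r.lon, centers["lat"].values, centers["lon"].values).min()
    for r in nodes.itertuples()
]
nodes["pct_change"] = 100 * (nodes["price_2025"] - nodes["price_2020"]) / nodes["price_2020"]
nodes["near_dc"] = nodes["nearest_dc_km"] <= 100  # hypothetical "close to a data center" cutoff
print(nodes.groupby("near_dc")["pct_change"].median())
```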

Speaker 5 (31:47):
That's what's so interesting about Data Center Alley in the Virginia region, and this is why you go to Baltimore.

Speaker 9 (31:53):
Can you, therefore, just break

Speaker 5 (31:55):
it down for us? In New York, are we going to be feeling the impact? How are you going to start seeing people's bills go up, affected by the region that they live in, and how certain local governments are going to have to step in and help in some way?

Speaker 7 (32:07):
Data centers affect power prices and your power bills in two main ways. One, they just use up so much power that, economics one oh one, it makes power more expensive for everyone else. And two, they require so much infrastructure, you know, new transmission lines, new power plants. The way utilities work is those costs are spread out among everyone. The reason this might matter for some

(32:27):
of our listeners here: maybe they pay their, like I pay my power bill, fine, that's okay, and some poor people have a hard time paying theirs.

Speaker 2 (32:36):
Okay.

Speaker 7 (32:37):
But it can really affect these companies, because if data center developers, big tech firms, have a harder time connecting to the grid, or if that's slowed down because of public anger, that slows down some of these AI plays like we were just listening to. And it can also hurt utilities if regulators tell them that they need to do something differently; that can affect everything

(32:57):
from their share price to their future planning.

Speaker 2 (33:00):
LEO.

Speaker 4 (33:00):
You know, data analysis can take you in lots of different directions. So the headline is up to two hundred and sixty-seven percent more. But were we able to rank, you know, in which geographies, which regions, this gain in price is most present, and what are the factors behind it?

Speaker 13 (33:18):
Yeah, so we looked at data, like I said, all over the country, tens of thousands of locations, and of course there are areas in the country that are more affected than others, especially on the East Coast, the Northeast, the PJM grid, Northern Virginia. But also, interestingly, Baltimore, which doesn't have data centers immediately nearby, but it is

(33:40):
actually very close to Northern Virginia, that is, Data Center Alley. And what we found is people and locations in the Baltimore area were experiencing some of the highest increases, sometimes three times as much since twenty twenty, in twenty twenty-five, as other parts of the country.

Speaker 4 (34:02):
It is today's Bloomberg Big Take: the impact of AI data centers on boosting the cost of electricity for consumers. Check it out on Bloomberg. Josh Saul, Leo Nicoletti, thank you very much. Now coming up: shares of Spotify are lower today as the company announces changes in the C-suite with the departure of CEO Daniel Ek, and it is moving the stock a little bit lower. We'll have more next. This

(34:24):
is Bloomberg Tech. Big changes over at Spotify. CEO Daniel Ek is stepping aside after almost two decades, leaving the

(34:47):
leadership in the hands of Chief Product and Technology Officer Gustav Söderström and Chief Business Officer Alex Norström, starting January first. The two officers have been co-presidents since twenty twenty-three and have been largely leading strategic and operational development. Bloomberg's Spotify reporter Ashley Carman is with us on set.

Speaker 2 (35:07):
You know, the stock tells a story.

Speaker 4 (35:08):
It's down more than five percent on the news, but we gave the context. After two decades, Daniel Ek is passing on

Speaker 14 (35:14):
The torch, right, and a founder. So that's obviously always going to be big news, and investors always have strong feelings about that. But the way they're portraying this is that it's kind of status quo. They're saying Alex and Gustav have been doing this work essentially since they took over as co-presidents, and that Daniel's kind of the visionary and he's still going to be very hands-on.

Speaker 9 (35:33):
That's what they're telling everybody.

Speaker 5 (35:34):
But what's not status quo is the world of music
and the way in which Spotify is trying to identify
new ways of doing audio books and trying to galvanize
people perhaps around AI development of music and whether or
not they want to be paying actual musicians royalties. How
will these two leaders navigate that?

Speaker 14 (35:53):
That's the big question. I mean, they're portraying this as a huge opportunity, right: AI music, a huge new frontier to go to markets that have never really adopted streaming, or haven't in the same amount that the US and Europe have.

Speaker 9 (36:05):
But these are also the big challenges.

Speaker 14 (36:06):
I mean, there's a world in which, if everyone can create their own music, do they need to have a streaming service? So these are the big questions that Spotify and Alex and Gustav are going to have to answer. Can they actually bring the value of AI, get these emerging markets to pay more, and then, on the video front, compete with YouTube?

Speaker 4 (36:24):
I think it's worth a health check on Spotify generally. Like, at the end of twenty twenty-two, this was a seventy-five, eighty dollar stock. It's now six hundred and eighty-nine dollars. Like, clearly something's going right there. I spend quite a lot of time on Spotify. I know Caro does too. They're still dominant in certain categories. Like, how do you see them as the beat reporter?

Speaker 14 (36:42):
Yeah, no, they absolutely are the biggest music streaming service,
so they have owned that category completely. Part of the
reason the stock was down at that time was because
they had been investing so much money in podcasting.

Speaker 9 (36:53):
Yeah, and the investors didn't like that.

Speaker 14 (36:55):
Now in this past year, I mean they switched to
a loss now, but they were profitable for a full year.

Speaker 9 (37:00):
They showed they can do this.

Speaker 14 (37:01):
Investors loved that, and I think they just have this new confidence that they're going to spend their money more wisely and, if necessary, flip the switch to turn to profitability.

Speaker 5 (37:10):
Now, that was a big message from Ek himself: I leave it in profitable hands, I leave it on a high, basically. What's he leaving to do? Because he's staying on as chair, he's going to be there for strategy. But he has been active in founder land, building new companies and new ventures and backing defense ones, health ones, exactly.

Speaker 14 (37:27):
So he didn't really speak about that. He really kept
the focus on Spotify. But it does seem like he's
going to spend more of his time in his investments,
probably focusing.

Speaker 9 (37:35):
On what they're up to.

Speaker 14 (37:36):
But he is again really saying he's going to be
involved in the day to day with Spotify. He's going
to be sharing his office as he does with Alex
and Gustav as well.

Speaker 5 (37:43):
Ashley Carman of Bloomberg, so appreciate the time on all things Spotify. Meanwhile, coming up, we speak to Microsoft Security Corporate VP Vasu Jakkal, as the company unveils Microsoft Sentinel, or improvements to it, about a unified focus on security.

Speaker 9 (37:58):
This is Bloomberg Tech.