All Episodes

February 4, 2025 52 mins

Send us a text

Jim McGregor of TIRIAS Research and Karl Freund of Cambrian-AI Research joined me to recap January 2025, another action-packed month in the world of semiconductors and accelerated and non-accelerated computing on the neXt Curve reThink Podcast series, Silicon Futures. The trio also shares their thoughts on where the industry and the tech are going in 2025.

We parse through the key semiconductor industry announcements and AI headlines of January 2025 and share our thoughts:

➡️ CES 2025 - AI everywhere (1:53)

➡️ Nvidia's supercomputing at the edge (3:00)

➡️ Nvidia's vision for the AI PC challenging the meaning of the AI PC (5:49)

➡️ The compelling use case of AI on the PC (8:14)

➡️ CES 2025 impressions by Intel, AMD and Qualcomm (14:48)

➡️ Jim's sparse trek on the CES 2025 floor - MIPS, Synaptics, NXP (18:18)

➡️ Arm's reinvention and their ongoing battle with Qualcomm (19:10)

➡️ CES 2025's hidden wireless gem (20:20)

➡️ DeepSeek and the implications on the AI industry (22:30)

➡️ Reasoning and agentic AI changing the slope of AI (28:30)

➡️ DeepSeek's semi-open kimono (30:34)

➡️ A middle finger to the U.S. AI diffusion rule, Stargate, and regulation (34:05)

➡️ The engineer who will design the first 8th generation tactical fighter (37:20)

➡️ The prospects and pivot of Quantum - Willow & Trillium (37:55)

➡️ Jim, Karl, and Leonard give their thoughts on AI and chips in 2025 (42:46)

Hit Leonard, Jim, and Karl up on LinkedIn and take part in their industry and tech insights. 

Check out Jim and his research at Tirias Research at www.tiriasresearch.com.
Check out Karl and his research at Cambrian AI Research LLC at www.cambrian-ai.com.

Please subscribe to our podcast which will be featured on the neXt Curve YouTube Channel. Check out the audio version on BuzzSprout or find us on your favorite Podcast platform.  

Also, subscribe to the neXt Curve research portal at www.next-curve.com for the tech and industry insights that matter.

Happy New Year!


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:06):
Next curve.

Leonard Lee (00:09):
Hey everyone. Welcome to this neXt Curve reThink Podcast episode, where we break down the latest tech and industry events and happenings into the insights that matter. I'm Leonard Lee, executive analyst at neXt Curve. And in this Silicon Futures episode, we will be talking about the crazy beginning of 2025.

(00:33):
I don't recall ever witnessing a January so insane in my life. But anyways, I am joined by the scaled-up Karl Freund of

(00:55):
Cambrian-AI Research and the scaled-out Jim McGregor of the famed TIRIAS Research.
Gentlemen, how are you doing?

Jim McGregor (01:01):
Very well.

Karl Freund (01:02):
Happy.
Well, confused, but very well.

Jim McGregor (01:04):
Yeah, I think he goes out of his way to try to redo the introduction or make it grander, ever more grandiose, at every

Karl Freund (01:12):
Grander? No, no, the proper word, actually, Jim, is ridiculous. There you go.

Jim McGregor (01:21):
Yeah, I told you I have the best words. So, yeah, you tend to make them up though. So you're rubbing off on me,

Leonard Lee (01:30):
right? So, yeah, this is just pure insanity. But before we get started, remember to like, share, and comment, you know, share your thoughts on this episode, and remember to subscribe to the neXt Curve reThink Podcast. I don't even know where to start, gentlemen, but why don't we

(01:50):
talk about, let's just start off with the beginning of the year, which always kicks off with CES 2025. I know, Jim, you were there. I was there. Well, let's start off with that. I think maybe we do things in order because, otherwise, we'll be all over the map. So let's talk about silicon at CES 2025.

(02:15):
What were some of the highlights that you think the audience needs to be aware of and continue to keep in mind as we go through the course of the year?

Jim McGregor (02:24):
Well, I think you have to be aware of the broader trend, and that is AI everywhere. And obviously, we saw that at CES, with them trying to put AI in everything, all the way down to a walking cane. It gets a little ridiculous, but Jensen has talked about these super cycles or super waves, and so has Victor Peng: how it's going to be the data center,

(02:47):
and that's going to be the enterprise, and that's going to be the edge. I think we're starting to see in 2025 that edge solution, but at the same time, the other waves are not stopping. The big news, especially on the silicon front at CES, was NVIDIA's introduction of Blackwell for desktop and mobile. New GPUs, AI GPUs, if you will, for desktop and mobile PCs, as

(03:11):
well as what they call Digits, this little 4x4 box where they basically crammed a DGX down into a mini box you can put on your desktop for doing AI stuff. And they worked with MediaTek on it. MediaTek actually did the SoC development and integration of it. It combined the NVIDIA Grace CPU and the Blackwell GPU into

(03:37):
essentially what amounts to a mobile SoC. So, it's not stopping. We definitely see that in the fact that AI is just pushing the boundaries of everything at this point in time.

Karl Freund (03:48):
I agree. The Digits platform to me was the highlight of CES. It didn't get the press it probably deserved just because it doesn't yet run Windows. I underline yet. I'm convinced it will run Windows shortly. I don't know when, but why would you go to that expense if you're not going to tap into the larger market? And that larger market is anxious to have something like

(04:09):
Digits. It's got like a hundred times the performance of any other AI PC out there, just in terms of AI performance. You say $3,000, that's, like, expensive for a PC. And I said, not for a PC that thinks it's a supercomputer; it's not too expensive at all. So, I think it's a game changer.

(04:30):
I think it's NVIDIA's way of entering the x86-dominated market for PCs, and eventually even laptops, I suspect. And it was so small. I mean, it looks like a DGX, too. The face of it looks like a DGX.

Jim McGregor (04:47):
The placard to describe it, which actually had the name on it, was four times larger than the device.

Karl Freund (04:55):
Yeah, I would imagine we'll hear more about it at GTC in another month. Yes. I think it will steal the show, especially if they can announce Windows.

Leonard Lee (05:06):
Yeah, I thought it was kind of funny that some folks thought that the shield that he brought out, Jensen brought out, was a new chip.

Jim McGregor (05:15):
That was funny. They said, wow,

Leonard Lee (05:16):
they have a wafer-scale accelerator now. I was like, no, no, no, no, no, no. That's not what he was trying to say. No, that was a joke. Yeah, yeah, yeah. No, they don't have, they didn't make wafers that big. It was definitely interesting to see them bring that whole supercomputing story to the edge. I think that's really what it was all about. Physically, it drew a

(05:39):
lot of attention to that.
But then, Jim, both of you guys are making a great point about NVIDIA bringing their AI PC story. And we've heard this in the past, Jensen saying, hey, look, we already have, like, whatever million AI PCs out there already. And this was even before, yeah, AMD even introduced the first quote-unquote PC processor that

(06:02):
had what you might deem an NPU. They didn't call it that at the time, but that was, like, what, almost three years ago now, going on two years, right? It's two or three.

Jim McGregor (06:13):
And we've had three generations. At least three generations of PCs have been released running AI on them, capable of a hundred to a thousand times the performance of the processor's AI capabilities with the NPU.

Leonard Lee (06:27):
It's interesting to see NVIDIA bring that angle to the AI PC story. It's not novel, but it's definitely getting, obviously, denser, and in a much smaller form factor that's represented by Digits.

Jim McGregor (06:42):
Unfortunately, it all boils down to Microsoft support. So when Microsoft starts supporting AI on the discrete graphics cards and on the Digits platform and everything else, that really opens up everything. But it's not just that. Also, it's support on those platforms themselves. I mean, like, if you use ROCm today from AMD, it doesn't run

(07:05):
on Windows yet. So, you're having to use the Linux platform, which isn't a big deal for a PC, but still,

Karl Freund (07:12):
It's a big deal for users. It's not a big deal for developers. Developers can just go and get Linux.

Leonard Lee (07:17):
Yeah, I mean, that's a great point though, right? And then it also begs the question of, well, number one, what do we mean by AI? Is it really the serious stuff, or is it the kind of answering, feature-type stuff that we're seeing, quite frankly, in Copilot+ PC, right?

(07:38):
There's nothing there, at the moment, that's really compelling. I think what we're seeing in terms of serious is still cloud-bound for the most part, right? And then it's moving its way to higher-power edge infrastructure. But then I think this is the first time we're seeing at least

(07:59):
Nvidia bring that supercomputing capability or architecture down to a smaller form factor now.

Jim McGregor (08:08):
I would have agreed with you going into 2025 that there aren't really that many compelling use cases for AI in the PC at this point in time, because most of it's in the cloud. However, I now see that a little bit differently, and that is, the compelling use case isn't necessarily these applications that are using AI. It's AI itself. It's the ability to download Llama, to download DeepSeek, to

(08:32):
download all these different models and be able to do development, and modification of models and integration of models, on your PC. I mean, yeah, that's still a smaller community than the broader ecosystem. But still, I think that is the killer app today. I think the killer app is the developer.

(08:53):
The killer app is the developer? Okay. The killer app is being able to do AI locally, for a developer, and just the power and the capability that we have to do that with.

Leonard Lee (09:04):
Sure. I mean, developers developing applications, and there's that broader question of monetization that's still lingering out there. Oh, yeah, it's only going to happen through these scaled-out and valuable applications. And I think that's really what I'm referring to. I don't argue with you in terms of the value for the developer, because that's where I think, actually, Nvidia might be

(09:27):
bringing some serious gain in the approach that they're taking, which is what I consider top-down. But then, maybe we don't dwell so much on Nvidia, although I think they made the biggest impression. But there was a lot of other stuff that happened, right? Obviously, we still have sort of that bottoms-up stuff that's happening with what Intel, AMD, and Qualcomm are doing, right?

(09:50):
And, I don't know, what are some of your thoughts there?

Karl Freund (09:53):
I think the world's still looking for the killer app. And to say it's the developers, true, Jim, but what I think people are looking for is, yeah, but I'm not a developer, so what does this mean for me? Right. And so that can go one of two ways: either you make that person a developer by giving them the tools they need to create real

(10:13):
value-added applications for themselves, or you realize that the real killer app isn't an app at all. It's the data. It's the data that's on your PC. It does not exist in the cloud except in a backup form. That's where the AI PC could find real traction, in making everybody more productive by

(10:36):
helping them find and use and summarize data that's on their PC today. It's not dependent on the cloud. Qualcomm and AMD and Intel, they all talk about, well, it's all about being able to access data and be able to, excuse me, process data on a free processor

(10:57):
because you already bought it. Yeah, but AI is becoming commoditized so quickly that cloud AI is becoming practically free. So I don't know if that's enough of an incentive for developers to create real value-add apps on the PC itself.

Jim McGregor (11:15):
Actually, I think gaming is going to drive that, because I think gaming is going to drive it to a hybrid model, like what we've seen in the past, where some of it's done in the cloud and some of it's on device. So I think gaming is one of those applications that's going to drive more and more of that AI processing on device.

Karl Freund (11:36):
You mean casual gaming or serious gaming?

Jim McGregor (11:38):
Well, there's a fine line there depending on who you talk to, but I would definitely say serious gaming. I mean, we're not talking Android games. We're talking anything that is at least a double-A or triple-A game.

Karl Freund (11:51):
Well, that's where I think NVIDIA has got a lot to contribute, right? I mean, Qualcomm's been very careful when they talk about gaming to talk about it as casual gaming, because they know you need an add-on PC card, excuse me, GPU card, to handle serious gaming. But with Digits, you don't. You don't need a PC, you don't need a GPU card.

(12:12):
You've already got a Blackwell on your desktop, and that's where I think it could go. Now, in terms of market size, it's a relatively small market when you look at the market for data center GPUs, but it is another differentiator, I think, for Nvidia.

Leonard Lee (12:28):
I struggle a little bit trying to figure out where else, other than, let's say, augmentation of non-player characters, generative AI in particular, that form, is really going to make a huge impact. Because if you think about AI and how it's actually

(12:51):
impacted gaming, game development, a lot of it has to do with DLSS and the rendering engines and improvements there that tend to be step changes, but in terms of the grand scheme of things, they tend to be incremental improvements. It's the dynamism. How do you leverage these large language models, generative AI, to

(13:12):
introduce dynamism into the experience,
especially in the form of these non-player characters, where I think there's differentiated utility. And that's always been the weak point of a lot of various gaming formats, whether it's first-person shooters or open-world-type fantasy

(13:34):
adventure games. So I think we'll probably need to do a lot more tactical thinking to figure out where generative AI really fits and where it's going to express this exponential or logarithmic value everyone is talking about. I just don't see it, even though there's a lot of great visual

(13:55):
representations or presentations of the impact. It's at a tactical level. I don't see it as being that broad. It tends to be a little bit more narrow than I think a lot of folks assume. But anyway, that's just my take now. Oh, what, did I stun you guys?

(14:17):
No. And we need to have that, what do you call it, BS card, right? Like the little sound. We need a little sound. No,

Jim McGregor (14:26):
no, we need the slap sound,

Leonard Lee (14:27):
you know,

Jim McGregor (14:28):
don't

Leonard Lee (14:29):
No, we don't. You really, I mean, I can add it later. Maybe we do use a sound, just, you know, like hand gestures for that, and then as cues, right? But, I love you guys. So, yeah, the other announcements. Obviously, one of the things on the AMD front was the announcement of, you know,

(14:52):
the Ryzen AI 300 Pro. Was it Pro? Yeah, Pro. And then Dell coming on board with, you know, going to enterprise or commercial with these guys. I think that was like a headline that really drew a lot of attention and excitement. Obviously, TIRIAS Research, you guys were at the Intel thing, right?

(15:14):
Oh, yeah. So what's your take there? We have Arrow Lake coming to market.

Jim McGregor (15:20):
Actually, the big takeaway there, obviously Arrow Lake's coming to market, but the big takeaway there was Panther Lake. They were showing off Panther Lake, which is the next generation, and it's on 18A, which says 18A is on track and everything else. And I've actually seen, over the past month, I've seen some of the yield numbers on 18A. It is definitely on track.

(15:42):
So when we start thinking about next-generation Xeon, we start thinking of Panther Lake, all those products, in terms of manufacturability, are on track for later this year. So that was the biggest takeaway for me, for us, at Intel's event.

Leonard Lee (15:58):
Yeah, okay. I thought they were pretty stealth. They didn't get a lot of attention, but they were everywhere. I was at a bunch of OEM events, like Samsung, Lenovo, and you saw Intel folks everywhere. And so I think their go-to-market motion is a little bit stealth. In previous years, they were pretty boisterous,

(16:19):
right? But it's interesting that I heard quite often, hey, where's Intel? They're so quiet. It's like, no, they're there.

Jim McGregor (16:26):
Well, and their Meteor Lake Core Ultra is doing exceptionally well in the market. You have to remember that when they introduce products, every time they introduce a new product, they have more than 100 OEM design wins out of the gate. Yeah, they are just there and they're cranking and they're ready to go. And Arrow Lake, obviously the big thing there, which Meteor

(16:46):
Lake lacked, is the fact that you can have a discrete GPU with it. So we're going to see those higher-performance mobile workstations, all those real STEM-type PCs and everything else, are going to be coming out using Arrow Lake as kind of the base CPU and using an AMD

Leonard Lee (17:03):
Yeah. Qualcomm went the opposite direction with what you just mentioned here, with the introduction of, I think it's Snapdragon X, right? $600 and below, lower tier, when

Jim McGregor (17:18):
they're building out their product family.

Leonard Lee (17:21):
Yeah, I think they're going off of the smartphone playbook and trying to expand the footprint and support of Copilot+ PC to the lower tiers. And actually, one of the things that I noted in one of the reports that I am putting together is that they're diffusing Copilot+ PC, with

(17:42):
Hexagon, which has the 40-plus TOPS of compute, much faster than what we've seen them do on the smartphone side of things, right? Historically. So that's an interesting dynamic and strategy that Qualcomm is executing on, I think, largely on behalf of Microsoft.

(18:05):
It's interesting.

Jim McGregor (18:06):
They're keeping that TOPS number up there, that NPU performance up there, and scaling the rest of the chip around it.

Leonard Lee (18:12):
Yeah. And so, anyways, I think that's CES. What else is happening?

Jim McGregor (18:19):
Unfortunately, I didn't get the chance to actually even walk the trade show floor, because some of the vendors took up entire days at CES with events. But still, there was a lot of really, really good information there. I did get to see some of the other stuff, especially some of the IoT stuff. Synaptics had their Astra platform, where they're being

(18:39):
able to put AI in just the smallest devices. Some of the things that we're going to see coming out are going to be leveraging companies like Synaptics, NXP. Even talking to the MIPS guys, which are now based on RISC-V architecture, and where they're going. They didn't announce anything at CES, but they are at Embedded World coming up. And the fact that they're pretty much supercharged around where

(19:01):
they think RISC-V is going to go and how it's going to compete, especially in this new dynamic where people are viewing Arm in a little bit different light after the lawsuit with Qualcomm in December. So it's going to be interesting, and good for them. The one thing that I'll note for them is the fact that they're not trying to be a pure IP company, because I don't think

(19:22):
you can survive as a pure IP company in our industry anymore. You have to provide complete systems; you have to provide the software, the platforms. And even Arm is going that direction, trying to provide chips for some of their key customers.

Leonard Lee (19:36):
That's where I think a lot of these companies run into an identity crisis and the challenges associated with it, right? Because they're positioned in a certain way in the ecosystem. So how do you reinvent yourself? And you've mentioned the challenges of doing that, especially the impacts on customers or licensees. But, speaking of the lawsuit, I know that you were there,

(19:57):
right? I was at trial. Yes. But it's interesting, Qualcomm filed what they call a post-trial brief. That was on the 29th. That's an interesting development, or follow-on to the trial. I'm really interested to see how this particular motion plays out.

Jim McGregor (20:16):
One of the things that I'm interested in is how companies are using wireless and stuff. And there's one company, and this was their second time at CES. It's called Morse Micro. They're out of Australia. And it's interesting because they're using older Wi-Fi technology, or lower-band Wi-Fi technology, to get longer range. We're talking like a half, we're talking like a mile range,

(20:38):
several thousand meters. I have one of their systems. I'm just, I'm hoping to test it this week, but that might be a game changer in terms of wireless. Cause right now, you're either using proprietary RF technology, where you have to have line of sight, or using Wi-Fi, where you're limited to a couple of hundred meters at best.

(21:00):
And having a long-range Wi-Fi, instead of having to do cellular and everything else, that could change the dynamics of the market. So I'm interested in testing this system out. It's kind of interesting to me to see what people are doing. And obviously, on the other end of the spectrum, there's other companies there that are pushing Bluetooth and Wi-Fi to lower

(21:20):
power levels. So, a lot of innovation around wireless technology, even outside of the cellular spectrum.

Leonard Lee (21:29):
So are they using, like, beamforming or something like that to be able to keep...

Jim McGregor (21:29):
Pretty much everyone uses beamforming, but they're also using lower frequencies. They're using, I think, 900 megahertz to, like, I'm not sure how high they go. You're limited to about, I think it's like 42 or 43 megabits per second, uh-huh, rather than, okay, broadband speeds. But you start thinking about security applications, you start

(21:52):
thinking about coverage of events and stuff like that. There's significance there. So it'll be interesting. I'm just seeing that, and we're starting to see NTNs. We're gonna see more NTN this year, especially as we get into Mobile World Congress, the non-terrestrial networks, especially as companies start launching new satellites and new satellite networks.

(22:12):
So there's gonna be a lot of innovation, I think, in 2025, around wireless technology. Okay. And this is coming outta CES, right? This is coming outta CES, and, okay, I think we're gonna see even more going into Mobile World Congress.

Leonard Lee (22:26):
Okay, well, let's then now dive into the topic that I know Karl really wants to jump into, which is this DeepSeek stuff.

Jim McGregor (22:36):
Is there anybody here who believes that they trained those models for $6 million?

Karl Freund (22:43):
No,

Jim McGregor (22:43):
no,

Karl Freund (22:45):
No, absolutely not. But that being said, it does show that the U.S. engineering community has been so focused on just getting stuff out the door, and so they kind of keep all their programming at this level. The Chinese engineers didn't have the chip that would allow them to stay at this level and get the performance they need. So they came down a level.

(23:06):
So it's a lot of hard work of optimization at the machine-code level that they did, and a lot of innovation at the model level that they did, to be able to get those costs down. I don't think they did it for $6 million, but they definitely got a lot of attention. All of a sudden, everybody, including NVIDIA, now supports

(23:26):
it, and people are playing with it, and enterprises are trying to figure out whether they can use it or not. And if they do use it, are there any security concerns that they should be aware of and try to mitigate? So, we haven't seen the end of this play out yet. But obviously, it's a double-edged sword. You bring the costs down, I need fewer chips.

(23:47):
But Jevons paradox says, if you bring the cost down, then more people will use it, so you need more chips. And we don't know where that dividing line is going to fall. We don't know whether it's net positive or net negative in the overall market. And from a volume standpoint, from a use standpoint, I think

(24:08):
it's a huge step forward to making AI more pervasive and helping lower the costs, and you will drive more use cases. Again, we just don't know yet. We'll have to see what happens.

Jim McGregor (24:21):
And that cost is the big thing. At their introductory price, they were lowering the cost over using OpenAI by 97%. 97 percent, to do an inference processing workload. Now, that's just an introductory price. They're raising that up. But still, if you reduce that price by, 80, 40, I think it's

(24:41):
going to be 80 percent or somewhere around there, but even at 40 percent or 50 percent, you cut the cost in half. That really cuts the legs out of the current AI business so far. And that gets to a very important point, the fact that we have to make it more cost-effective. And I agree with you, Karl. I don't see a change in demand. I don't see people stop ordering GPUs, because they still think

(25:05):
they're going to need them, and hopefully this is going to spur more demand more than anything else. But it puts a lot of pressure, even if they did illegally use other people's models. That's still being decided, but it still puts a lot of pressure on everyone else out there in the AI community, not just to get stuff out there, but they have to be

(25:25):
able to offer that inference processing of workloads on these models at a much more efficient price. Yeah,

Karl Freund (25:33):
and it may actually drive increased demand for older GPUs. I've read recently that there's a lot of demand they can't meet, even A100s, because they're cheap. And if you can get the job done using DeepSeek's model or a derivative of that, then you're going to save a lot of money, but that will increase demand

(25:55):
for older GPUs. I don't know that NVIDIA planned for enough wafer capacity to continue those product lines beyond what they would normally last, given the normal evolution of going from H100 to B100 on to the next generation beyond that.

(26:15):
So it'll be interesting to see how it shakes out. We'll know a lot more in probably six months, maybe.

Leonard Lee (26:21):
Yeah. Here's the thing that everyone needs to reckon with: the economic impact. I hear this whole Jevons paradox thing. I don't quite buy it. There's way too much association of empirical laws, scaling laws, all kinds of weird theories about what's going on and what will happen.

(26:42):
I think what we have to do is really look at the reality of what's in front of us. And to your points, they brought the cost of inference down ridiculously, to a level that, I think, for the current ecosystem, the global ecosystem primarily dominated by the U.S.

(27:05):
It is disruptive. It changes the economics, and I think that's what the industry is going to be grappling with probably for the next three months. And in terms of the game theory here, whether or not they took the data, these accusations need to be proven out. The game theory is, if they trained a certain aspect of the

(27:29):
model, or a phase in its training, whether it's pre or post, at that cost, and comparatively it is orders of magnitude more efficient and cost-efficient, right? And with quality parity, which apparently they've achieved, that

(27:49):
is the biggest risk to the U.S.-led ecosystem, and I think that's the thing that everyone should be really concerned about. And I think there's a lot of deflection and denial going on around that. And I think that, frankly, is dangerous. If the Chinese company DeepSeek actually proves that you can

(28:10):
do all this stuff that we assume has to be done with the super-massive data centers, right, that are going to create an energy crisis, in a fraction of that footprint, a fraction of the resources required, that's game-changing.

Karl Freund (28:28):
There have been two developments in AI in the last six months that I think change the slope. One is agentic AI, and the other is reasoning. Both of which, you read and watch demos of what this stuff can do, it's just mind-blowing. But what they don't tell you is, but you can't afford it.

(28:48):
You can't afford the power and you can't afford the cost of doing it. You can't afford it. I read an article today about OpenAI's new reasoning platform called Deep Research, which is kind of a web search capability. Yeah. But it would put Jim and me out of business, just because it can do what we do and probably do it better.

(29:09):
The problem is, it's so expensive, it's only available for the $200-a-month OpenAI users. And at that, I don't know that OpenAI is even making any money at it. If you apply the techniques that DeepSeek has applied, all of a sudden, those kinds of earth-shattering applications become much more readily available.

(29:30):
And that, to me, is the next phase of AI, where it's going to go, and it'll go this year.

Jim McGregor (29:37):
Yeah, there's no doubt. There's two key points here. One is the fact that we always knew that using recursive training and knowledge distillation and these other techniques was going to bring down the cost of AI and make it easier to make more focused models, more efficient models, and everything else, and obviously DeepSeek used some of these techniques. They had to. But also the fact that after they did that, they

(30:00):
made it open source. And that changes everything. When all of a sudden they're willing to give it away, to where you can download it, you can play with it, you can do your own research on it, you can do your own training or retraining and optimizations on it, that really changes the game. I mean, we've had a rush just to get stuff out there, but not a

(30:22):
rush to really enable it. And that's really what DeepSeek has done: enable it.

Leonard Lee (30:27):
Yeah,

Jim McGregor (30:29):
but,

Leonard Lee (30:30):
I want to get your impressions on this. My observation is that they didn't open the kimono entirely. What they open-sourced was a fraction of what I think folks would have liked to have seen and wanted. They open-sourced the weights. They didn't open-source the data set, or even let people know what

(30:53):
was actually used, even though there are references in their papers. And one thing that we do know is it was trained on data sets that were largely off of Chinese content or data. Yeah, go ahead, ask about Tiananmen Square. Yeah, yeah, yeah. No, none of it had anything about Tiananmen.

(31:15):
It's obviously biased. If it was entirely done on, let's say, English Western content, you know, maybe there would have been some injection of some undesired hallucinations, at least on the part of the CCP. The other thing is that they didn't entirely describe what the entire system looks like and how it comes together.

(31:36):
Right. What I noticed is a lot of people are taking the model and just running it on standard H100 instances, and they think that that's replicating everything. My impression, just reading the reports actually, and doing a little bit of deeper diving, is that those are things that

(31:59):
haven't been replicated yet. And so I don't think the full picture of what the optimization actually was is visible, or even replicable. I think they've kept that close to the vest.

Jim McGregor (32:13):
You're right.
They have, and obviously there could be legal issues there if
they are using somebody else's data set or models. So you've got to
kind of expect that. And once again, believing that they did
this for six million dollars? Nobody really believes that, at
least nobody that has ever worked on these systems believes
that. So obviously there's information we don't have about

(32:36):
it, but I think we're going to see... I think it's funny, because it
seemed like everyone thought that, oh, this is going to crash
the demand for chips.
No, this is going to raise up all these AI companies.
It drives the demand for chips, and it opens the door to
hundreds or thousands of AI companies that are going to look

(32:57):
at what DeepSeek did, theorize about what DeepSeek did, and try
to replicate it.

Leonard Lee (33:03):
Yeah, but you know what, I think we're in an
information digestion period.
I mean, all the indications that hyperscalers are continuing to
invest, that's a lagging indicator of the event, right?
What we need to see is what happens in the next three to six
months as we start to learn more about what actually happened

(33:28):
here and what, you know, what DeepSeek is actually
trying to do.
I think you're pointing out something really important.
The fact that they open sourced it is just really curious,
especially in light of what the Biden administration issued
before they left office, which was that, uh, what do they call

(33:51):
it?
It is a goofy name, uh, that interim thing of, AI

Karl Freund (33:55):
diffusion. AI diffusion, AI diffusion is the
terminology they use.

Leonard Lee (34:01):
Yeah, I mean, it was almost like open sourcing it
was thumbing their nose at that whole
construct, you know.

Karl Freund (34:08):
absolutely.
Well, open sourcing was designed to impact US competitiveness in
the marketplace, and really hurt the companies that are leading the research in
AI, whether it's Google or OpenAI or what have you.
So I think there were politics at play in that call, more than

(34:29):
some people believe.

Leonard Lee (34:30):
Jim, your thoughts,

Jim McGregor (34:31):
No, I completely agree.
I think that, uh, we've had AI diffusion.
We've had the Stargate announcement about,
yes,
and then we had the announcement of DeepSeek.
The first thing I'll say is that thinking that
any government, especially the US government or Western

(34:52):
governments, are actually going to regulate, impact, or do
anything otherwise in this segment is kind of foolish when
they can't even spell AI.
And yes, it's only two letters.
I, I believe they think it means Apple Intelligence.

Karl Freund (35:05):
1,

Jim McGregor (35:07):
That's one token, by the way.
The, the market, the industry, the technology is moving just so
rapidly. And it also kind of points to the fact that this is
a global ecosystem. Especially, we've spent the past
40 years making it a global technology ecosystem, and
thinking that we can segment this

(35:30):
and build barriers between countries and everything else to
stop this.
We can't.
Matter of fact, if China locked down rare earth materials, our
whole electronics industry would tank overnight.
If we didn't have assembly in China,
yeah, our industry would tank overnight.
We are a global industry, and how many companies, how many US

(35:52):
companies have developer resources all over the world, in
China and India and every part of the globe? You go where the
talent is. So to think that these tariffs, these regulations, or
anything else, or even investment, is going to have a
huge impact on the direction of the market,
the industry, the technology, is absolutely foolish.

Karl Freund (36:13):
I agree.
You can't contain this innovation, and the
regulations the US government has tried to implement, this just
shows that it didn't work.

Leonard Lee (36:23):
Yeah,

Karl Freund (36:24):
It shows that strategy is not working.
So let's not double down on it.
Let's figure out how we can get more talent in the U.S.,
not just through H-1B visas, but by investing in our education
systems and getting more students trained.
China's blowing past us just in terms of the raw wetware that's

(36:46):
being applied to AI, and that's a long-term threat to U.S.
competitiveness.
You're not going to stop that by passing regulations preventing
people from getting access to our technology.
That's not going to work.
We have to invest in the
intelligence ecosystem, which starts with people.

Jim McGregor (37:05):
I completely agree.
Matter of fact, we should be investing in education.
We should invest, be investing strongly in the next generation.
Now, I have two boys right now that are getting their master's
in AI and robotics.
So I'm doing my job.
What are you two doing?

Leonard Lee (37:23):
You know,

Karl Freund (37:23):
I have a, my daughter's a veterinarian, so
that's not gonna help.

Leonard Lee (37:27):
Yeah.
I have a five-year-old who I know is going to be designing
the eighth generation of fighters of the future, so, you
know, can be gen platform.
Come on.
Yeah.
I mean, he's already prototyping it right now, it's like ridiculous.
Yeah.

(37:47):
I'm doing my job.
Jim

Karl Freund (37:49):
Yeah.
There's one area we should touch on. We're going to run out of
time, but one area we should touch on briefly, and that's
quantum.
We saw a tremendous explosion in interest and investment dollars
being poured into quantum in November and December.
And then Jensen and Zuck said, nah, that's not real.

(38:11):
And just yesterday, Bill Gates came out and said, no, actually,
this is really going to happen.
It's going to happen in the next three to five years.
Meanwhile, IBM is quietly digging the trenches to fight
that quantum battle, which is the next big battle that will
come about.
So if you take a look at what Google did with Willow, you
combine that with Trillium, which is really an impressive

(38:34):
chip.
You start to get a feel that we are approaching an inflection
point where everything is going to change.
And you combine that with the
portability that DeepSeek brings in, and you end up with a very
different world than we've enjoyed for the last two years.

Leonard Lee (38:50):
Well, don't you think quantum would just
completely change everything anyway, especially as you're
looking at neuromorphic computing, and then maybe even
having to go to a different architecture for
AGI, quote unquote, if that's what you're trying to pursue.
I mean, I can't imagine that everything is just portable the

(39:13):
way, you know, I mean,

Karl Freund (39:15):
Quantum will coexist with traditional
computers.
You need to have traditional computers to do pre-processing and
post-processing and normal scalar operations.
But you combine the concepts of AGI and quantum, and that's where
the magic could occur.

Jim McGregor (39:32):
Just the amount of processing you can do and the
data you can process in quantum.
And everyone right now is talking about how AI will
impact, will benefit quantum, especially in terms of
developing circuits and doing error mitigation and correction
and everything else.
But no doubt, eventually, quantum is going to benefit AI

(39:52):
as well.

Leonard Lee (39:55):
Yeah, you mentioned IBM.
I know that they've always been kind of conservative about
expectations around quantum.
My concern is that there is this pivot to continue this AGI
narrative, and it's just another hype cycle.
I think the danger here is that you start to place expectations

(40:18):
on this,
as you start to see the economics of scaling the current
approach to quote unquote AGI kind of run its course and wear
itself out.
I think IBM, to their credit, at least seven years ago was the right
one, saying this is experimental. And AWS?

(40:38):
Their head of quantum said the same thing.
This is like research stuff, guys.
We're not ready to go commercial with this stuff, even though
there may be some, like, you know, applications, like in the
early days of supercomputing,
where you might be doing weather models with this stuff or

(40:58):
molecular biology research and things like that, or maybe even
drug discovery.
I mean, not that it's all coming now, but these are, like,
supercomputing themes that will continue to find a new home in
the next generation of computing.
But in terms of revolutionizing things on your smartphone?
Let's not get our hopes up too high, you know.

Jim McGregor (41:19):
It's a data center play, but I think that it
advances the data center significantly.
And, you know, NVIDIA would agree with that.
I think by 2030,
with the advancements we're seeing, especially with
error correction and with qubit development. Talking to Alice &
Bob about their cat qubits this morning, and how they're using a

(41:42):
feedback loop to reduce the bit flipping, this is truly, this is
physicist-level stuff.
But there's so many different qubit technologies
under development and under consideration at this point in
time.
But I agree with Karl.
I think by 2030 we're at a point where you have to be invested in

(42:03):
taking quantum seriously, because it raises the bar again.

Leonard Lee (42:09):
Yeah.
I mean, no doubt, when it happens.
I think it's always a matter of timing, right?

Jim McGregor (42:17):
Okay.
We ready for the

Leonard Lee (42:22):
Okay.
So here, let's do this.
This is the first episode of 2025, and we're all just completely
baffled, brain fried.
Oh, totally brain fried.
And, yeah, unfortunately, we're not going to do this five-hour
thing that Lex Fridman does. Even so, we're trying to
do something much denser, but I'd like to get both of your

(42:45):
impressions of what, what we can expect in 2025.
I know it's so cliché, but let's just do it.

Karl Freund (42:53):
Jim and I were talking earlier.
We were both working on our annual "here's what to expect in
the coming year" blogs, and I stopped writing mine because I
have no idea.
I really don't know what's going to happen this year.
I do think that we're going to see a shift in the return on
investment
of cloud computing companies and other challengers to NVIDIA,

(43:16):
including Intel, which has just cancelled their next big AI
chip.
They cancelled it.
Um, AMD, we'll see.
At least they've got a call tomorrow.
We're going to find out more tomorrow, I guess, but it's not clear
what market they're pursuing.
You've got NVIDIA and you've got all the cloud providers.
Where's room for another commercial semiconductor vendor

(43:41):
to come in and say, yeah, well, I've got a better solution here?
And that'll apply to Intel, apply to AMD, and all the other
startups out there.
And so they're all going to be scrambling to find a niche they
can fill and own.
I'm not sure where that's going to land.
I think at the end of the year, NVIDIA is strong, even stronger,
as probably the only viable commercial semiconductor

(44:03):
provider of AI accelerators.
What do you think, Jim?

Jim McGregor (44:07):
I would agree with you, in the fact that they have
the complete stack from the Jetson platform all the way up to
Blackwell, and they're looking at the whole solution.
They're looking at the complete software stack as well.
So I don't think anybody has the vision that they have.
And the fact that the entire industry right now is focused on

(44:27):
going towards agentic AI, and they're already looking at
physical AI.
I think they're doing it right, because they're the only player
in there.
There, there's no one else there.
And if you saw some of these humanoid robots at CES, they
were impressive.
I think 2025 is the year of practical AI.
I think we see a lot of AI start moving into practical

(44:48):
applications, whether that's for the chatbots or the productivity
tools for the enterprise, or whether that's actually for
embedded systems that add more intelligence, not necessarily
generative AI, but even just traditional AI.
Yeah.
Seeing practical uses of it.
Throughout 2025, I think we're going to see more innovation

(45:11):
around wireless technology that kind of builds up to the next
wave, which is going to be 6G.
It's going to be WiFi.
It's going to be Bluetooth.
It's going to be all of these, but they're expanding as they go.
So I think I am excited about some of that stuff.
I think that, quantum is, let's put it this way, I don't know
where the huge innovation things are going to happen, because

(45:35):
there's so many things going on, from silicon technology, to
software technology, to physics technology.
I would say that is probably the theme as we're going towards
2025 and beyond: we're now not just looking at
battling Moore's law.

(45:55):
We're not looking at scaling, you know, Dennard scaling and
stuff like that.
We're battling the laws of physics, and we're trying to
address the limitations that we have with physics.
And that's going to lead to a whole new wave of innovation,
whether that's this year or beyond.

Leonard Lee (46:12):
But hasn't that always been the case though?
That's the thing, right?

Jim McGregor (46:16):
I think we're going to a whole new level.

Leonard Lee (46:20):
it's an entirely different technology, in a way,
right?

Jim McGregor (46:24):
I think that 2025 is also going to be filled with
a lot of despair. We don't know what's going to happen with the
sociopolitical situation that we have between countries and
governments at this point in time.
And quite honestly, they could tank everything very quickly.
So I'm very concerned about that,
very concerned about

(46:45):
building up barriers between countries, especially when we
have a global industry. I'm worried about the distribution
of talent, which Karl brought up. Definitely worried about
trade wars
and tariffs.

Leonard Lee (46:57):
Yeah, definitely.

Karl Freund (46:59):
My company is Cambrian AI.
But what happened after the Cambrian era? A mass extinction, one
of six of them, depending on whether you believe we're currently in one
or not.
I don't know if there's going to be a mass extinction event of
companies developing AI, especially companies developing
AI chips, or not.

(47:20):
And I think, uh, Jim's point about the current
sociopolitical environment probably will have more impact on
whether this is a mass extinction event or a
continuation of the Cambrian explosion of AI, um,

Jim McGregor (47:32):
So, um, there's your, there's
your headline: Karl Freund predicts mass extinction.

Leonard Lee (47:40):
Well, actually, what I'm really curious about
is, are you going to be rebranding to a Cretaceous?
Devonian.

Karl Freund (47:46):
That's what I want to know.
Decavrian was Devonian.
Decavrian Devonian Extinction

Leonard Lee (47:55):
of

Karl Freund (47:56):
Evolution.
Yeah.

Leonard Lee (47:58):
What about you, Leonard?

Jim McGregor (48:00):
What's your prediction for 2025?
My

Leonard Lee (48:02):
prediction?
Well, I have thoughts.
Number one, I think generative AI supercomputing, or this AGI
supercomputing, is going to have to reconcile with new economics.
And these are realities.
These are real economic shifts that are happening, and it's
going to create opportunities in certain parts of the quote

(48:22):
unquote ecosystem.
But then it's also going to maybe be that extinction event
for others.
The other thing: I think the hype is going to settle this year.
We're going to find that the CPU and the principles of XP are
probably going to dominate and be more relevant for AI in general,
or at large, at the edge, than what we're seeing in the, you

(48:45):
know, what we, what the industry and the media and the broader
public think of as the GPU-oriented world of AI
computing.
The other thing: maybe a move toward custom training systems
and inference.
So, optimizing, because if there is credence in what the DeepSeek

(49:07):
folks have done, and there's this order of magnitude improvement
that you can get across the board, there's probably going to
be focus on that, especially, you know, like, when you look at
the hyperscalers looking to operationalize
a lot of these models and incorporate them into their
recommender systems and agentic platforms. It has to be cost
efficient, because a lot of this is going to be given away either

(49:30):
for free, or it's going to be monetized through a freemium
business model, like what
Meta does, right?
And then I think there's just going to be, like, on the
enterprise side, just simply a digestion issue.
The systems, you know, you guys know, I mean, we're on this
friggin',
what is it,
one-year cadence at the supercomputing level?
Think about what's happening across the edge environments

(49:53):
where
systems are evolving probably way too fast for a lot of
enterprises and consumers to really deal with.
And so I think that's a dynamic that the, the end markets are
going to have to grapple with, and obviously it's going to have
impacts up and down the ecosystem,

(50:13):
across the ecosystem and up and down the stack.
So those are my thoughts.

Karl Freund (50:19):
to think about.
It's gonna be a fun year.
That's for sure.

Jim McGregor (50:21):
Maybe we should be doing these every week.
No, I'm just kidding.

Leonard Lee (50:25):
No, I was thinking we should.
Otherwise, we make this like a five-hour thing, and then we'll
have to use AI to edit the whole thing and then maybe hallucinate
in between.
So anyway, but, hey, it's great catching up with both of you.
Hopefully we'll have a calmer
February. I doubt it, but I'm almost ready to get back on for

(50:49):
next

Jim McGregor (50:50):
week.
We have a lot coming up.
I mean, Mobile World
Congress is coming up, and embedded world, followed by GTC.
It just goes on and on.
March is ridiculous.
I don't think I'm home more than two days.

Leonard Lee (51:03):
Yeah.
Yeah.
All the more reason to watch our podcast here,
the Silicon Futures podcast. And gentlemen, thank you so
much.
Everyone, if you want to, if you want to get in contact with
Karl, make sure to reach out to him at Cambrian-AI
Research LLC.

(51:25):
And I think it's a .com, right?
But he's on LinkedIn.
Reach out to him.
And of course there's Jim McGregor of TIRIAS Research.
Just go to www.tiriasresearch.com and, oh,

Jim McGregor (51:38):
Jim at TIRIAS

Leonard Lee (51:40):
or Jim.
Okay, there you go.
And connect with both of these gentlemen.
They are thought leaders in the field. And, of course,
reach out to neXt Curve at www.next-curve.com.
And remember to like, comment, share your reactions to
this podcast episode, and subscribe to the neXt Curve

(52:04):
reThink podcast on YouTube as well as Buzzsprout, and we will
see you next week, or next month, right?
Back here to recap the month of February, which we hope is a
little bit more mellow.
Never.

(52:24):
Okay.

Karl Freund (52:26):
All right.
Thanks Leonard.
Take care.
Cheers guys.
Thanks for watching everybody.