Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome to the Nick, Dick and Paul Show. We're all here
this week. So there are people that get really hyped
up about certain topics, like you can bring up politics
and they'll get very upset. You can bring up the
best pizza in New York, the best burrito in San Francisco.
Speaker 2 (00:18):
Cat videos, cat videos.
Speaker 1 (00:21):
But if you say the next two words to Paul Kedrosky,
you will see the most passionate response from anyone on earth.
Are you ready? Data centers?
Speaker 3 (00:32):
You know what's funny about it is it actually pisses
me off that that pisses me off.
Speaker 2 (00:35):
It's one of these meta problems.
Speaker 3 (00:37):
See, like, it really pisses me off, but it pisses
me off that I even care, which just feels ridiculous, right?
I Mean it's like, I don't know, obsessing about climbing
gyms or something.
Speaker 1 (01:00):
So for people listening, they're going to think they're literally
about to turn this off because they're going to be like,
I do not want to listen to an hour-long
podcast about data centers.
Speaker 2 (01:09):
Right right, especially one that's going to probably run long.
Speaker 1 (01:14):
That's right. But so on our text
thread for the past — how long has it been, Dick?
Speaker 2 (01:21):
He's been sending these data center texts for quite some time, six months,
hundreds of them.
Speaker 1 (01:25):
Yeah, we have been getting hundreds of passionate texts from
Paul about data centers. Do you want to start this?
Just kick it off, just lead us in. All right, Dick and I are going to
take a nap and you'll just talk.
What is it that pisses you off so much about
data centers right now?
Speaker 3 (01:43):
I think it's — well, at least three things,
but I'll —
Speaker 2 (01:47):
Give you at least three. At least three. But that's
at the high level. There's a thousand at a lower level.
Speaker 1 (01:52):
Do we need to set anything up for listeners — to say what a data center is?
Speaker 3 (01:54):
Data centers are these giant air-conditioned buildings containing huge
numbers of GPUs, Nvidia chips, in giant server racks.
They've been around forever. They've been around for as long
as we've had web services — back in the early days of Amazon,
early days of Google, early days of Yahoo, we've had
data centers. So that's nothing new, right. The idea that
data centers are somehow new is of course just wrong.
(02:17):
The difference is that all of a sudden spending on
data centers went from being kind of like this, like
growing four percent a year to growing like twenty two
percent a year just in the last two and.
Speaker 2 (02:25):
A half years.
Speaker 1 (02:25):
Right, but that's because of AI.
Speaker 3 (02:26):
Right, Really, don't piss me off. So yeah, it has
to do with this AI stuff. But the point is
it's all of a sudden taken off and everyone's kind
of playing this game of trying to say, you have
one hundred megawatt data center, I can build a gigawatt
data center. You can build a gigawatt data center.
(02:47):
I can build a ten-gigawatt data center. And
everyone's piling into this business of proposing to construct giant
data centers all over the US, all over North America,
all over the world, and so on, and all in
large part because of this notional idea that it's the future.
These are the new engines of capitalism, that if I
(03:07):
don't have a data center, it's not just that I
can't fuel AI. I might as well just shut down
the economy because from a regional economic development standpoint, that's
what matters. And you'll hear this all the time from
officials in Georgia and Virginia and everywhere else. Why are
you so obsessed about these things. It's because it's an
annuity income for regional tax base, and it's because it's
the future.
Speaker 2 (03:28):
They keep being told that — they keep being told it's the future.
Let's talk about all the — first of all, ignoring
the climate change impact, which is like, yes — "other
than that, Mrs. Lincoln, how was the play" kind of ignoring —
but ignoring the climate change impact, there are multiple, multiple,
(03:51):
multiple — multitudinous — oh, endless. I mean, the first one, or
one of the most obvious, that's really only
starting to be — I mean, this morning we were looking
at the news and seeing that it's starting to be
traded this way — is the debt from some of these companies,
the temporal disconnect between the life of the asset, the GPUs, yeah,
(04:11):
and the term on the debt, of upwards of forty years
in Oracle's case. Yeah, you know, you've got a two-and-a-half-year
asset being, you know, financed — it's
supposed to be rolled over every year.
Speaker 1 (04:23):
Can you explain that to people? That — go for it. Wait,
hold on, you can explain it. You ran a company —
it had no debt, though. I like to defer to
Paul and then come in for the joke. Can you
explain what he just said in English?
Speaker 3 (04:36):
So data centers are expensive, yeah, but money has to
come from somewhere.
Speaker 1 (04:40):
Yeah.
Speaker 3 (04:40):
Companies don't want to fund them entirely from profits because
it's too much money, so they raise debt to do it,
and then they raise more debt later on because you
have to keep refinancing these things. But the debt has
a term, often thirty years — very long-term debt. And
you pay off the debt and make
interest payments based on the income from renting out the GPUs. GPUs
(05:02):
don't last very long for at least two reasons. One
is that technology changes pretty quickly, so you know, every
four or five years, you just turn them over because
there's some newer chip from Nvidia or whatever else. And then —
But the other reason is that if you use chips
for training, it's kind of like using a car for
street racing. You're running a chip flat out seven days
a week, twenty four hours a day, so that the
(05:23):
thermal degradation breaks the chip down fast. So it's not
just that the chips have to be changed every few years.
But if you use a chip for training, it's like
heat stress. It's just sitting there in this really hot
environment running flat out.
Speaker 2 (05:35):
So there are two things that companies do to gussy
up the looks of that — thank you, I went old
West there. Yeah, that's good.
Speaker 1 (05:48):
One of the two things.
Speaker 2 (05:49):
One, they take the debt and they move it off
the balance sheet and do it through some SPV.
One is reminded — one might say, isn't that what Enron did?
And you would be correct, that is what Enron did. Yeah — let's
not put this right here, let's move this over here
to a third party. And the second
thing they do is, of course, they don't mark down
(06:11):
the asset. They don't, you know, mark down
the life of the asset over two and a half years.
They're like, well, really, these things are going to last
like six to eight years.
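A rough, illustrative sketch of the depreciation point Dick is making — the fleet cost and useful-life figures below are made-up assumptions, not numbers from the episode:

```python
# Illustrative only: how the assumed useful life of GPU hardware changes
# reported annual depreciation (all dollar figures are made-up assumptions).

fleet_cost = 10_000_000_000  # hypothetical $10B GPU fleet

def annual_straight_line_depreciation(cost: float, useful_life_years: float) -> float:
    """Straight-line depreciation: the same expense is booked every year."""
    return cost / useful_life_years

aggressive = annual_straight_line_depreciation(fleet_cost, 2.5)  # hardware worn out / obsolete fast
generous = annual_straight_line_depreciation(fleet_cost, 6.0)    # the longer life some companies assume

print(f"Depreciation at 2.5-year life: ${aggressive/1e9:.1f}B per year")
print(f"Depreciation at 6-year life:   ${generous/1e9:.1f}B per year")
print(f"Reported operating income looks ${(aggressive - generous)/1e9:.1f}B per year better "
      "just by assuming the longer life.")
```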
Speaker 3 (06:18):
Which is which which was true in the old world
of data centers. So I was running like an S
three cloud for Amazon for storage. You could realistically say
most of the hardware was six or seven years. But
it was completely different because the more people who use
Amazon for storage, the more money I make is Amazon,
I don't have to keep adding chips for can I
can I still talk here?
Speaker 2 (06:36):
No question?
Speaker 1 (06:38):
That's really important for this, right? When you build a
data center, how much of the money — is it sixty percent
for the chips? So the chips are sixty percent,
so the building's only forty percent —
the air conditioning, the food trucks, all the other nonsense.
Speaker 3 (06:52):
There's fewer food trucks than you might think.
Speaker 1 (06:54):
Because there's not many people working there.
Speaker 3 (06:56):
Well, no, it's really tiny. Yeah, that's one of them.
But that's actually an important point. These things are not
huge employers. That was going to be —
Speaker 2 (07:02):
That's — they go into these communities and say, we're going to
create, you know, at least five hundred jobs. Even a
low number like five hundred — the community doesn't end
up getting even that.
Speaker 1 (07:12):
Well, this happened. My dad lives in Racine, Wisconsin, and
they were going to build a massive data center there.
I forget who it was — the guys who make the chips
for the iPhone. No, no, I forget the name of
the company. Anyway, so they had all these bids and
they offered all these deals, and they said they were
(07:33):
going to build all these houses, and it turned out
like they needed no housing. Uh and uh yeah.
Speaker 3 (07:38):
But it's even worse than that in some
ways, because if you take the total cost
of the data center and the tax abatements that people give —
they give them guaranteed power, guaranteed water, they cut the
real estate tax — in many cases, you're paying over fifty
million per job created.
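A back-of-envelope illustration of the cost-per-job arithmetic Paul describes — every figure here is a hypothetical assumption, not a real incentive package:

```python
# Illustrative only: rough cost per job created for a data center incentive
# package (every number below is a made-up assumption).

tax_abatements = 300_000_000             # hypothetical property-tax abatements over the deal term
power_and_water_subsidies = 150_000_000  # hypothetical guaranteed-rate subsidies
site_and_infrastructure = 100_000_000    # hypothetical roads, grid hookups, etc.
permanent_jobs = 100                     # data centers run with small permanent staffs

total_public_cost = tax_abatements + power_and_water_subsidies + site_and_infrastructure
print(f"Public cost per permanent job: ${total_public_cost / permanent_jobs:,.0f}")
# -> $5,500,000 with these assumptions; bigger packages and similar headcounts
#    push the per-job figure toward the tens of millions Paul mentions.
```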
Speaker 2 (07:52):
So just give me the money. Yeah, I don't want that.
I got a great way to save you twenty five
million dollars. Give me twenty five million dollars and don't
do that.
Speaker 3 (08:00):
Don't do the data center. Cut out the middleman.
Speaker 1 (08:03):
Okay, So let me just ask the obvious question here.
Speaker 2 (08:05):
Wait, there's more.
Speaker 1 (08:07):
Why — why should anyone who's still listening to this podcast —
which is maybe two or three people —
Speaker 3 (08:11):
Oh, you know what — actually, I'm gonna,
I'm going to flame out for a second.
I actually think it's ridiculous that people aren't paying attention
because it's probably the single most important thing that could
cause the next major financial crisis in the world. And
the reason is because all major prior financial crises have
been tied to one of four things. Right, It's either
an amazing technology story that gets people over excited, really
(08:32):
loose credit, real estate, or some kind of government backstop
for something that causes people to do too much of
a thing. And it could have been Fannie and Freddie
back in the financial crisis. It could have been the
South Sea bubble back in the eighteenth century. This nonsense
has all four — it's the first one that does. It's got a
real estate component. Data centers are essentially built on the
(08:53):
idea of scarce land — land that has water, power and
interconnect. I can't just build one anywhere I want to,
so there's a scarcity component that's causing people to speculate,
like a kind of mini Chinatown, about real estate. There's
a credit component. We're seeing that already with
all these private credit firms coming and saying, no,
pick me, I can create the SPV for you. This
will be exciting, right? So there's a credit component. There's
(09:14):
a government component, because the governments have now all decided
we have to win because China can't win, so we're
gonna — that's why we got into this big discussion,
I think this week, about whether or not —
Speaker 2 (09:23):
Yeah, that was the word backstop.
Speaker 3 (09:27):
I thought — I didn't mean that, but that's implicit in
a lot of what these companies are doing. They approach
other countries and say you need to build data centers
because don't you want to have a Norwegian AI. Don't
you want to have a Swiss AI, Mongolian AI. But
that's all this cynical idea of trying to wrap AI
in a flag. And so you need you need each
of these pieces, but you also need the technology story.
Speaker 2 (09:45):
You need a really great story.
Speaker 3 (09:46):
It's not enough to just say, like, you know, we'll
build small, free, you know, Tamagotchis or something. We need
some kind of great technology story. And so this has
one of the best technology stories ever because it actually
happens to be true. AI is incredibly important, It actually
works to change people's lives. So you put all those
four pieces together and there's no way this doesn't become
one of the largest financial bubbles in history. So the
(10:08):
idea that people say, oh, data centers are boring — screw you.
I mean, this is like the sort of stuff that
takes down economies. This has the potential to be —
Speaker 2 (10:15):
Larger than all the other ones.
Speaker 1 (10:17):
We're at, what, five billion or something this year? What's the number?
Speaker 2 (10:20):
It's going to be closer to three hundred.
Speaker 3 (10:21):
Billion. Jesus. And then probably two trillion over
the next two —
Speaker 2 (10:25):
So all of this is now going on while —
you know, you need power to plug in to make
the GPUs go, which we're horrible at. Our grid is
already trash. And, you know, compared to the grid in China —
they have ample power, they have ample power.
Speaker 3 (10:48):
And it's also that there's this perverse incentive to
agree to add a data center to the grid, because
regional politicians want it for all the reasons we've talked about.
The utilities want it because it's a
very predictable source of load — I know how much load there's going
to be, I can add it, it's not like speculating,
I know what it's going to do. But the problem
is adding it adds as much as gigawatts of
(11:09):
additional power needs to the grid, and where's that
going to come from?
Speaker 2 (11:13):
Right?
Speaker 1 (11:13):
In the one article I read that you sent — of
the seven hundred and sixty-two; I maybe read a
couple of them — I saw that one data center is equal to one hundred
thousand households in power.
Speaker 2 (11:25):
That's a small data center, but yes, a.
Speaker 1 (11:27):
A big one is, what, ten x that? Easily.
Speaker 2 (11:29):
Ten x, right.
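A rough sketch of the households-per-data-center conversion being discussed, using an approximate average US household load as an assumption:

```python
# Illustrative only: converting data center power draw into "equivalent households"
# (the household figure is an approximate US average, used as an assumption).

avg_household_load_kw = 1.2  # ~10,500 kWh/year for a typical US home ≈ 1.2 kW continuous draw

def equivalent_households(data_center_mw: float) -> int:
    """How many average homes draw the same continuous power as the data center."""
    return int(data_center_mw * 1000 / avg_household_load_kw)

print(equivalent_households(120))    # ~100,000 households for a ~120 MW facility
print(equivalent_households(1200))   # ~1,000,000 households for a ~1.2 GW campus
```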
Speaker 3 (11:31):
So the problem is, as
Dick was saying, the grid's not well set up to
add all that extra power. So you have these
fanciful ideas — there's
all kinds of great terms for it. People will say, well,
I'll add power behind the meter, and it's like, what
is that?
Speaker 2 (11:45):
Right, Well, that's the.
Speaker 3 (11:45):
Idea that I won't rely on the grid to provide
me with power because who knows when they'll add you know,
that nuclear plant that I needed, So I build my own, right,
So this is the idea increasingly in Texas, New Mexico.
Speaker 1 (11:57):
Is this Fermi and things like that?
Speaker 3 (11:59):
Yeah, so that's an example of one of them. But
there's lots of others — Google's doing this, Oracle's doing this.
There's lots of people who are doing the idea of
adding power behind the meter because I don't want
to wait for the grid to do it. So now
you get into a secondary problem. So where's the money
coming from to build your own private natural gas plant?
Well that's more debt. So now you've got debt for
the natural gas plant being paid for by GPUs that
(12:20):
are only — you're gonna have to turn over every four years —
thirty-year debt on a natural gas plant that probably
has a forty-year lifespan. So that's what they call,
like, a mismatch in the debt markets. You
have a thing that's generating money for only a
short period of time paying for an obligation
that stretches out forty years.
Speaker 2 (12:37):
This never works.
Speaker 3 (12:39):
This goes very badly over and over and over again,
over like two hundred years of financial history.
Speaker 1 (12:43):
So how does it play out? Like, how
does it end? Do we have — like, you know, you
drive up and down any city street today and
there's more "for lease" signs than there are — like, do
data centers just become ice skating rinks? Like, I say
laser tag — laser tag arenas? Can Nick
invest in a laser tag development business right now?
Speaker 2 (13:05):
Yeah, like that. But I think you're going to have
free space. One of the many things that pisses me
off about this subject —
Speaker 3 (13:11):
Is people all want to say, like how does this end?
And they just want to pick one thing, and.
Speaker 1 (13:14):
There's just lots.
Speaker 3 (13:16):
There's like seventeen different ways that this can break. The
only thing that's inevitable is it's not sustainable.
We'll either have overcapacity, or we'll have a debt spiral
that cracks all of this stuff. Or, you know, you
can even dive under the hood and say most of
these data centers are being created backward looking, and the
reason why is that they all imagine that if I
just run my training cycles long enough on top of
(13:37):
these data centers, I can push OpenAI out of
the market — which is a completely ridiculous idea. Right? We're
already seeing that the improvement in large language models has
really tailed off in the last two years. It costs
more money, takes more time, and I can't tell the
difference from the last one. So GPT five versus four
is a good example. And so the idea that I
can train my way out of this, which is what
(13:58):
floats a lot of these data center boats. It's
not about inference, it's about training new models. That's just
not true. It's all this backward thinking, this idea that
I can continue to run this old training model, and
it doesn't work that way. So not only are we
building too many, we're building it based on horrible logic.
Speaker 1 (14:14):
I'm going to just push back for a second. I
have no idea what I'm talking about, but I'm going
to pretend I do for a minute, and Dick is
going to back me up.
Speaker 3 (14:20):
It never stopped you before.
Speaker 1 (14:21):
So when the housing bubble crashed — and how am I
going to back you up? Just agree with me so
that we can argue with Paul.
Speaker 2 (14:32):
I know you are, but what am I? I do
get there eventually.
Speaker 1 (14:36):
Yeah, okay. So the housing bubble, when it crashed, it
was devastating for society because people's homes and their investments
and their money and their savings were all —
Speaker 3 (14:48):
Fine — and the global banking system, really, and —
Speaker 1 (14:50):
The global banking system, but it was, but it was
more so that it ended up affecting actual moms and
pops and people and so on. When the stock market
crashed in ninety-nine, the same thing — like, we had
all put money into, or a lot of people had
put money into, these companies, the —
Speaker 2 (15:06):
I know exactly where this is going.
Speaker 1 (15:07):
And then Dick's going to finish my question.
Speaker 2 (15:09):
Where's it going? So let's see if I can — let me
see if I can finish it. We are now
in — I think I can be Nick Bilton for a
moment here — we're now in a world where eighty
percent, eighty-five percent, eighty-seven percent of the shareholders
of, you know, these stocks, the Mag Seven, are only
(15:30):
the top twenty percent of the — you know, ten percent
of the wealthy, or already wealthy, in America. So it's private.
If I were Nick Bilton, I would
finish my thought by saying: so, yes, this is all
going to blow up, but it's just going to beat
up the billionaires and the rich people, which is — the rich,
thank you — the fat cats, to go with my earlier
(15:52):
term of gussy. Well, to go with the fat cats.
Speaker 3 (15:55):
Let's bring out all the nineteen-twenties terms.
Speaker 2 (15:57):
So it's just going to hurt the fat cats and
not mom and pop. Pot, kettle — another good one.
Speaker 1 (16:03):
Yeah, thanks. So he did it way better than I — really, what —
Speaker 3 (16:06):
Argument you're gonna make?
Speaker 1 (16:06):
That was what I was going to — thank you, Dick, thank you.
Speaker 2 (16:11):
That wasn't Dick Costolo, though, by the way; that was
Nick Bilton.
Speaker 1 (16:13):
Well would you not make that argument?
Speaker 2 (16:15):
No? No, but why?
Speaker 1 (16:18):
But who does it affect? I don't understand. How does
it affect me?
Speaker 3 (16:21):
Have you ever heard of this thing called the S
and P five hundred?
Speaker 1 (16:23):
Yes, I've heard of the S and P.
Speaker 3 (16:25):
What percentage of the S and P five hundred is Mag
Seven stocks — meaning the largest AI tech stocks? Ten percent?
Speaker 1 (16:32):
No, no, no, sorry, ninety percent?
Speaker 2 (16:35):
Did you say ninety?
Speaker 1 (16:36):
Yeah?
Speaker 2 (16:36):
No — do you get out of the house?
Speaker 1 (16:39):
I'm not an economist. I'm just — I'm piecing it together. I
read books, I do these things. I read Grendel, by
the way — it was fantastic. But we'll come back to that.
Speaker 2 (16:46):
Just to give you a retort to why — why is this
gonna — roughly, you're at thirty, thirty-five percent of the
S and P five —
Speaker 1 (16:53):
I was in the middle when I said ten and ninety.
Speaker 2 (16:55):
Yeah he was. He was narrowing.
Speaker 3 (16:57):
He was, mightily. My head's in the oven, my
feet are in the freezer — on balance.
Speaker 2 (17:02):
Yeah, yeah, no.
Speaker 3 (17:04):
Anyways, so roughly — what I was gonna say next — for
a single sector to be that large a percentage of
the S and P five hundred is unprecedented. That doesn't happen, okay,
at least in the last hundred years. So that's
incredibly important, because it means that all these average Joes are
exposed to the market via passive index funds, which is —
(17:24):
the largest chunk of the market now is passive funds,
and the largest piece of passive funds is index funds.
You're long AI. You're incredibly long. It's thirty percent of
your portfolio if you're largely in passive funds.
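A quick, illustrative calculation of what that passive-index exposure means for an ordinary retirement account — the weights and drawdowns are rough assumptions, not forecasts:

```python
# Illustrative only: why a "boring" index fund holder is long AI. The weights and
# drawdowns below are rough assumptions, not precise market data.

mag7_weight_in_sp500 = 0.33   # assume the largest AI-linked stocks are ~a third of the index
ai_drawdown = 0.50            # assume those names fall 50% in an unwind
other_drawdown = 0.10         # assume the rest of the index falls 10% in sympathy

portfolio = 500_000  # hypothetical retirement account entirely in an S&P 500 index fund

loss = portfolio * (mag7_weight_in_sp500 * ai_drawdown
                    + (1 - mag7_weight_in_sp500) * other_drawdown)
print(f"Hit to the 'safe' index portfolio: ${loss:,.0f} "
      f"({loss/portfolio:.0%}), most of it from the AI names.")
```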
Speaker 1 (17:35):
Oh, I just invest in laser tag.
Speaker 3 (17:36):
But yeah, yeah, So so the idea that somehow you
can dodge it, you're deeply exposed to this. You're incredibly
long AI by default.
Speaker 2 (17:45):
Paul's saying, everybody's IRAs that are just in the Vanguard —
Speaker 3 (17:48):
You know, half of the gains in the last two years
were those seven stocks. So it's not just that you
were exposed — if those unwind, half your gains in the last
two years go away. Okay, so it's a double whammy.
That's probably on the order of about eight hundred
billion dollars of capital being withdrawn that people think that
they're holding onto in these relatively safe, conservative funds. But that
(18:10):
doesn't stop there. It goes on. Like, for example, you
know, exchange-traded funds that are large holders of real
estate investment trusts, which are essentially real estate funds. In
the ETF market, something like fifteen percent of the assets
under management at the largest ETFs are data centers.
Speaker 2 (18:26):
Again another problem.
Speaker 3 (18:27):
So who holds REITs? Not people who are out there speculating —
these are typically people who are retired, they want
income from some investment. They're some of the most conservative
investors in the market. They're incredibly long this stuff, because
by default these ETFs are increasingly holding a huge percentage
of data centers. So the idea that somehow this
just has to do with private equity is, on the numbers, ridiculous.
(18:49):
It's actually mostly retail investors who will take it on
the chin, and then secondarily it'll spiral to debt markets.
And the debt markets in turn are collateral for other things,
so it'll contract spending, shrink the stock market, throw us
into a recession. That's not dodgeable.
Speaker 1 (19:27):
Okay, So Dick's gonna ask the next question for me,
which is, Okay, here's the question. We've all talked about this,
we all know it. Everyone's saying it ten years, five years,
twenty years, whatever the fuck it is we are most
of us are going to be replaced by AI.
Speaker 2 (19:47):
Oh.
Speaker 3 (19:47):
I thought you were gonna make a different, better —
I thought you were gonna make the argument that, well, it
really doesn't matter because in a couple of years, all
this stuff, no matter how much it all gets wasted,
some of it will turn out to be useful, like
the dot Com cycle. So even if it turns out
that all these data centers we've overbuilt them, it's pointless.
Speaker 2 (20:01):
Yes, that's a big argument right now, and that's an
emerging argument. Yes, it will be overbuilt, and
yes — but what will come out the back end
will be all these amazing you know, technologies that improve
our lives.
Speaker 3 (20:14):
Right, it's the dot Com example. You know, all that
stuff didn't work out then, but we laid the groundwork
with fiber and all these other kinds of things. It's
that argument, right, which is a really popular response to this.
Speaker 2 (20:23):
It really doesn't matter that.
Speaker 1 (20:24):
I wasn't making that argument, but go for it.
Speaker 2 (20:26):
But it's a very — it's a very sort of
current argument. Do you like how I've gone from old
West to current? Yeah?
Speaker 3 (20:36):
So that argument, though, is like a very common response
from the tech — the technorati — that
you just don't understand the dynamics of how technologies work in these
bubble cycles and —
Speaker 2 (20:47):
Everything. He has a certain response to that.
Speaker 1 (20:52):
Do you agree with all of this?
Speaker 2 (20:54):
Yes, I do agree with everything he's saying. No, I mean,
I mean, sure, there were one or two things
in there I disagreed with, but I generally agree with
everything you're saying.
Speaker 3 (21:03):
Okay, all right. Well, so the response to that point, though —
the ridiculous point about these cycles — is, one, you're
operating from a data set of about two or
three historical examples. Possibly the dot-com period. Yeah, maybe
you can go back to, like, the PC — maybe it
wasn't nearly as big. Possibly rural electrification in the nineteen-twenties,
maybe railroads in the mid-nineteenth century.
Speaker 1 (21:24):
We're going to go to rural electrification in the nineteen-twenties? That's —
Speaker 2 (21:26):
All you got, my friend.
Speaker 3 (21:27):
So if I was to launch a drug on that basis,
it wouldn't get approved. So the notion that that should
somehow convince me that this is some kind of law
of economic life for starters is just ridiculous on the face, right,
because we have a couple of anecdotes we like to tell,
and then that's all we have. Because even if you
take it back to the Industrial Revolution and say, well,
you know it all worked out, it took forty years
(21:48):
for the average person to get back to the same
standard of living as they had before. The industrial society
got wealthier; most people got poorer for forty years. Try
that in the United States today and see how that goes.
Speaker 1 (21:57):
And that's going to be in two years, not forty.
Speaker 3 (21:59):
It'll all be faster, more widespread, more consequential.
Speaker 2 (22:03):
And what's your sense about how quickly this starts? I mean,
it's kind of starting to unwind right now, people
realize, I think.
Speaker 3 (22:09):
I always remind — like, I remind myself that back in
the eighties or nineties, when the dot-com thing
was beginning to blow up — so four years before
it blew up, Greenspan's out there warning people about irrational exuberance.
It was four years later; people forget he was
warning people and saying, oh, you know, this is out
of control. I actually think it's going to sawtooth
considerably higher before it blows up. And at each stage
(22:31):
people will think, well, that's the end, and it won't
be the end — and then it'll be the end, you know.
So probably, you know, look out two to four years
is a good guess, but it'll sawtooth higher,
you know, catching people at each step all the way.
Speaker 2 (22:43):
The very end of these things tends to be the
most juiced — that sort of last six months, if
you will; it doubles again. It's like —
Speaker 3 (22:49):
The Druckenmiller moment. Famously, Stan Druckenmiller,
well-known hedge fund manager, was short this stuff, the
dot-com stuff, all the way up, and then about
six weeks before the top hit, he —
Speaker 2 (23:00):
Went like February twenty, February two.
Speaker 3 (23:02):
Thousand. Two thousand. He sold everything else, bought it all
and immediately blew up and the whole thing ended. And
he's a very smart guy. But the point is
that you reach this point of maximum pain where people say,
you know what, this is all ridiculous, but I can't
take being on the other side of this anymore. I
need to just go all in whatever I need to
do in terms of lending or anything else. And that's
the moment when things go, you know, catastrophically badly.
Speaker 2 (23:24):
I mean, we're not there yet.
Speaker 1 (23:25):
Let me ask you a logistical question. How many of
all the data centers, how many are like an AWS
type situation where people are renting space, GPUs and so
on and so forth, versus OpenAI and Facebook and
those guys building their own.
Speaker 3 (23:40):
Even when they build their own, they don't build their
own, obviously, right? I mean, these guys are not in
the construction business. So they're doing this in conjunction with
construction companies, and often in the context of these special
purpose vehicles and others, like Blue Owl and private credit
and what have you. So there tend to still be
operators and sponsors in the middle of it. They
just become the primary lessee of the data center.
Speaker 1 (24:02):
But given that, what are your — what's your blood pressure
at right now?
Speaker 2 (24:05):
Did it?
Speaker 1 (24:06):
Is it up to like one sixty over?
Speaker 2 (24:07):
Actually, I find it soothing — he's getting it off
his chest, he's vented so much.
Speaker 1 (24:14):
Let me do this little therapy session.
Speaker 2 (24:16):
I'm all good.
Speaker 1 (24:17):
No, but some of these guys, so we're all going
to be replaced by AI. This was gonna be my question, right,
everyone's saying this. It's going to happen. I know
for a fact that I'm going to be —
Speaker 2 (24:27):
You say everyone's saying this. Do you meet it in
the Donald Trump?
Speaker 1 (24:30):
"Everybody says" — we, or you? Actually, the other day — no,
but you're hearing it. It's — I mean, it's
so obvious. Yes, it may take a while, maybe it's
a year, maybe it's twenty, but most
people in society are going to be replaced by AI.
If that is the let's just hypothetically for one minute,
(24:52):
all agree that's going to happen.
Speaker 3 (24:53):
I want to see where this is going. What does this
have to do with the current financial problem?
Speaker 1 (24:56):
You are going to need these data centers. Aren't you going
to need data centers when AI runs the world and
humans are just sitting on their butts watching AI-created —
Speaker 3 (25:06):
Slop. Again, back to what I was saying earlier: the current
buildout of data centers is predicated on a terrible, backward-looking
idea, which is that — right now, sixty to seventy
percent of data center usage is not for making my
fake Nick do fake Nick things.
Speaker 2 (25:22):
It's for training.
Speaker 1 (25:23):
Costolo's for that.
Speaker 3 (25:24):
Yeah, yeah, that's true. We have him here for that.
It's for training new AIs — training new frontier models.
If we're not training new frontier models — and we
can come back to why that's not going to happen —
if you're not training new frontier models, we arguably have
enough capacity now in data centers to
handle inference through about twenty thirty-two. Like, we're already
(25:44):
overbuilt for doing fake Nicks. The whole model is built
on a broken assumption, which is that we're going to
continue training frontier models for the next decade, and we're
already at a point where the deltas between models are collapsing.
The models from different vendors are all converging. I defy
you to tell the difference between, like, Anthropic's Claude
and OpenAI's GPT or whatever. They're all the same.
(26:07):
They all feel exactly the same.
Speaker 2 (26:08):
For general-purpose frontier models. You can even see it in
the data — there are companies that are like, hey,
plug into us and we'll route your request to
whatever's cheapest. OpenRouter, yes, for example. Yeah, plug
your request into us and we'll route the request to
whichever the least expensive provider is right now, and
(26:30):
people don't care. What you quickly see from looking at
a month-to-month graph of OpenRouter API
calls is that, for general-purpose requests, nobody cares
which frontier model it's being routed to, which tells
you — and in fact, nobody owns more than,
like, x percent of that OpenRouter share. And it's
very fluid, and it changes, so you can
(26:51):
just sort of see in the graph that there's no
"I have to route it to this one right now
because it's so far ahead of these other ones." Yeah,
and these things change constantly, which shows you —
I mean, it's basically a commoditization graph. It's a
graph that highlights commoditization.
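A minimal sketch of the cheapest-provider routing behavior Dick describes — provider names and prices are invented, and this is not OpenRouter's actual API:

```python
# Illustrative only: "route it to whatever's cheapest" behavior for interchangeable
# models. Provider names and prices are made up for this sketch.

from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    usd_per_million_tokens: float

PROVIDERS = [
    Provider("frontier-model-a", 10.00),
    Provider("frontier-model-b", 8.50),
    Provider("frontier-model-c", 2.75),
]

def route(prompt: str, quality_floor=None) -> Provider:
    """Pick the cheapest provider that clears an optional quality floor.

    For general-purpose prompts the floor is empty, so price alone decides,
    which is what makes interchangeable frontier models a commodity.
    """
    candidates = [p for p in PROVIDERS if not quality_floor or p.name in quality_floor]
    return min(candidates, key=lambda p: p.usd_per_million_tokens)

print(route("Summarize this meeting transcript.").name)  # -> frontier-model-c (cheapest wins)
```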
Speaker 3 (27:09):
Totally shows it, yeah. And, I mean, in saying that,
you know, we're going to automate you out of existence
or whatever we're going —
Speaker 1 (27:13):
To do to you, it's going to happen.
Speaker 2 (27:15):
I mean, we are going to automate you out of
a job, specifically, as soon as possible.
Speaker 3 (27:20):
One of the mistakes people make, I think, in talking
about this stuff is they say, if you say there's
an AI capex bubble, somehow that gets turned into:
you think AI is nonsense. I think the exact opposite —
I think it's an incredibly important technology. It's like saying
I thought there was a global financial crisis about to
happen back in two thousand and seven, two thousand and eight. Oh,
you're against houses? I'm not against houses.
Speaker 2 (27:42):
Specifically, so you're against like retail investors getting in on
the housing or even people owning houses.
Speaker 3 (27:49):
Somehow I'm dismissing the whole technology of houses.
Speaker 2 (27:52):
Right.
Speaker 3 (27:52):
Oh, let's go back to lean-tos or something like this. And so this
and so this.
Speaker 2 (27:56):
Is the same for me.
Speaker 3 (27:57):
I find it a ridiculous argument that if you think
that there's a CAPEX bubble, that must mean that somehow
AI therefore isn't going to do all of this damage
in the economy. No, it's going to do immense damage
in the economy. But that's a different point. Right, We're
going to have two things happen at once. We're going
to have this incredible overbuilding of AI fueled by debt,
which will all collapse, which will have consequences for this
stock market, for the economy. And then we'll also, at
(28:18):
the same time, at the other end of this process,
we'll still have data centers putting out cheap tokens —
probably even cheaper, because they're all going to be excess capacity.
It's going to be like, you know, OPEC just
pumping the stuff out, and that's going to make it
easier to automate things.
Speaker 1 (28:32):
Okay, So let's just say that you're correct with everything, right,
Let's just say I'm correct. Let's just say that you're right.
Speaker 2 (28:39):
Okay.
Speaker 1 (28:39):
The question I have is for us to reach this.
Speaker 3 (28:44):
If you say, what's the trade, I'm going to turn
off my microphone.
Speaker 1 (28:48):
That's not me. The question I have is for this,
so we're going to get to this point where I'm replaced. Okay,
let's just pretend that that's true.
Speaker 3 (28:58):
Okay, just to be clear, this is a somewhat narcissistic
view of the world.
Speaker 1 (29:01):
No, it's not. I'm just saying. I'm saying most people,
me being one of them. Okay, most jobs. Let's can
we can we finally agree on one thing on this
podcast that a lot of jobs are going to vanish? Yes, okay, great, Okay,
so if that happens, I literally forgot my question.
Speaker 3 (29:22):
Yes, it probably wasn't.
Speaker 1 (29:24):
Okay, if that happens — so the models
have to get better for that to happen, right? They
don't, though, because the models — they feel like they're kind
of going backwards. Wait, you said this the other day.
Like you said, the —
Speaker 3 (29:37):
But that's a different problem. But half of people are
dumber than average.
Speaker 1 (29:41):
Okay, but the models still have to — the models still
have to get better to replace most people. So the
question I have — that's okay, but just hang with
me for one second — the question I have is,
what needs to happen with the technology to get there,
(30:01):
and don't the data centers need to exist to get
us to that point.
Speaker 3 (30:06):
No, it's back to the same argument, which is to
say that, yes, if we continue doing what we're doing
pre twenty twenty two in terms of these large data
sets and training in multiple cycles, these incredibly huge, these
huge frontier models based on you know, exabytes of data,
blah blah blah blah. Absolutely, but again that's like saying,
you know, well, there's only room for like, you know,
(30:27):
four computers in the world — which was the IBM argument in
the fifties — because, you know, these things are going to
be so big and take.
Speaker 2 (30:32):
So much power.
Speaker 3 (30:33):
It just doesn't work that way. That's an extrapolative way of thinking about
what's ahead, based on this really brief period where we
had all this free data to train on — and we've
already exhausted it. I mean, you can make the joke that it's
like the Saudi Arabia of data, and it's gone.
Speaker 2 (30:44):
Yeah. What Paul is saying is there are these
diminishing returns on training now — we've
consumed the web, the models have consumed the web — and
diminishing returns on training data, and the
incredibly vast amount of money you have
to pour in for the incremental bit of training —
Speaker 3 (31:08):
Is again to get only marginal gains.
Speaker 2 (31:11):
To get only marginal little gains in the
frontier models.
Speaker 3 (31:14):
So, to take it one
step further: the notion that therefore I need even
more data centers to make these things even
smarter — no. We've basically — we're frozen in place.
We're kind of in, you know, what's the
French term, a cul-de-sac. We're in a dead end
with respect to this stuff in terms of progress right
now, because of the architectural limitations of large language models.
Speaker 2 (31:37):
That's incredibly important.
Speaker 3 (31:38):
The architectural limitations of large language models, in conjunction with
the end of the Saudi Arabia of data, leave
us more or less where we are — very expensively, you
Speaker 2 (31:48):
Know, and so so.
Speaker 3 (31:51):
Things like — no, but if you think about, like,
this whole — you know, what was the outrage at the
end of the summer about? Sycophantic models, the
idea that they kiss ass all the time.
Speaker 2 (32:00):
Yeah, people treat it as this really.
Speaker 3 (32:02):
Superficial thing about oh you know, they're just trying to
create more engagement. Now, that was totally predictable from what's
happening right now. And why is it totally predictable? Because
if you can't get gains from what's called pre training,
which is what happens whenever you bring in new data
and you get smarter and smarter models, what do you
do to improve your model? You do what's called post training,
which happens after the frontier model has been created. You
(32:22):
do this reinforcement learning and mixture-of-experts stuff. But
what does that lead to? Well, that leads to people
giving you feedback on your model and saying, oh, I
like it when it does that, I rate it higher. So you
reward the models for doing things that make me happy. Well,
guess what — that leads to sycophancy. So sycophancy is a
natural outcome of an exhausted data set and models that
are no longer improving. It's totally predictable. So you just
(32:44):
get more and more sycophancy, and then you
can even go off the cliff, which is what we're
seeing now — models actually starting to become worse because
there's too much post-training, and suddenly it's like,
wait a minute, you're dumber than you were six months ago.
Speaker 2 (32:56):
That's why.
Speaker 3 (32:57):
But people — you've got to go under the hood. If you go underneath,
this is what's happening architecturally; that's what you're seeing.
What you're really seeing is not that, if only these guys were
good enough, they'd actually make the models intrinsically better — that's
not what's happening. It's this attempt to use post-training
to improve their models.
Speaker 2 (33:11):
So the model.
Speaker 1 (33:11):
Is choosing to become a sycophant?
Speaker 3 (33:14):
No, it's with reinforcement learning. You give people
the opportunity to respond, and then you incorporate that into the reward function
in terms of, oh, I like that, I don't
like that, and that in turn becomes the launched product.
But post-training inherently has this problem of leading
to these — what we've been calling the sycophantic models — but,
(33:35):
more importantly, it just leads to a performance collapse, because there's
no new data. You're just over-optimizing on
the thing that makes users happiest, which isn't the same thing.
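A toy illustration of the post-training dynamic Paul describes, where optimizing purely for user approval selects flattering answers over accurate ones — all the numbers are made-up assumptions:

```python
# Illustrative only: if the reward is "did the user like it?", and users like
# flattery, optimizing the reward selects flattering answers over blunt-but-accurate
# ones. Every number here is invented for the sketch.

candidates = [
    # (response style, probability a user clicks thumbs-up, actual accuracy)
    ("blunt correction of the user's premise", 0.35, 0.90),
    ("hedged but mostly accurate answer",      0.55, 0.75),
    ("'great question!' plus agreement",       0.80, 0.40),
]

def pick_by_reward(options):
    """A reward signal based purely on user approval picks the most-liked style."""
    return max(options, key=lambda o: o[1])

style, liked, accuracy = pick_by_reward(candidates)
print(f"Post-training favors: {style!r} (liked {liked:.0%} of the time, accuracy {accuracy:.0%})")
# The reward goes up while accuracy goes down — sycophancy as an optimization outcome.
```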
Speaker 2 (33:43):
At what point does someone like Saudi Arabia come in
and go, hey, great news, everybody, we have super cheap
inference compute here for you — the tankers
are hooked right up to the data center, we use ours.
But that's already happening. It's already happened.
Speaker 3 (34:00):
It's really interesting when you talk to people on that
side of things. They're like, here we are, we've got
all this energy. We've got the energy — leaving aside the cooling,
we can fuel it, we'll make it all happen.
That's already happening. Like they would cheerfully absorb a huge
fraction of the workloads we're seeing here. And so you know,
my expectation is, you know, a huge fraction of what's
going on in the US will.
Speaker 2 (34:19):
All move offshore.
Speaker 3 (34:20):
It's gonna move to places
like China, and it's going to move to the Middle
East places.
Speaker 1 (34:25):
I saw someone talking about building it in space. Thanks.
Speaker 3 (34:27):
Yeah, yeah, yeah sure.
Speaker 1 (34:30):
Sure. Do we have other questions, or do you want to
move on to duck ragu? Dick is working on a
really great duck —
Speaker 3 (34:39):
I just think the thing that people have to keep
in mind is this idea that it's not just another
tech bubble, right, It's really not.
Speaker 2 (34:46):
It's the thing that's.
Speaker 3 (34:46):
Super unusual is that you have all the pieces of everything,
all four, combined in one place.
Speaker 1 (34:52):
Have we ever had that before in history?
Speaker 2 (34:53):
Never?
Speaker 1 (34:54):
Never?
Speaker 2 (34:54):
Never.
Speaker 3 (34:55):
That's the thing that's really interesting about this one. And
so the idea that it's like, oh, you know, I
know how this plays out, that's like, no, you don't.
Speaker 1 (35:01):
But we've never had it before in history. But we've
also never had a moment in time where humans are
going to be replaced in the way that they.
Speaker 2 (35:09):
Will, yes, just make things worse.
Speaker 3 (35:10):
Though that's just like an accelerant, right? I mean,
if you look at the other side of this, what
are we going to have? Lots of underused data centers
pumping out tokens at very, very low prices, reasonably effectively.
That's going to make it much cheaper to automate people.
So the consequences on the other side are even larger,
and there are people wandering around making this argument better than
I can. But that's — I think that's
(35:31):
an important point: that leaving aside what happens in terms
of the next two or three years of capex frenzy,
at the other end it's really predictable. You have these
incredibly low-cost token-producing engines out there where everybody's
trying to cover their marginal costs because they've got all
this debt behind it. And guess what they're going to do —
they're going to drop prices as low as they can
and pump tokens out like crazy while they can, which
will have implications for further automation of the workforce.
Speaker 2 (35:53):
And for companies who are looking to replace jobs with,
you know — it will make it even more
cost-effective.
Speaker 3 (36:03):
Yeah, I can just waste tokens with impunity. And
so that's at the other end of this.
Speaker 2 (36:07):
That's absolutely a different — and if you think there's a
lot of video and music slop on Spotify and Instagram
and everywhere else right now, just wait, just wait, just wait.
Just wait till it's all free. Right?
Speaker 3 (36:19):
So, I just think people get stuck
on this thing of, like, well, it's just AWS,
or it's just another tech bubble or whatever else —
it's like they're completely lost.
Speaker 1 (36:29):
I'm not going to ask you what the trade is,
but I do know people are listening and thinking, well,
what do I do in this situation? Dick, do you
have any answers?
Speaker 2 (36:38):
Doge coin?
Speaker 1 (36:42):
I should have said that absolutely all right, there you go.
You have it you heard it here first.
Speaker 3 (36:47):
No, there's nothing. It's — this is like a planetary
object smashing into another planetary object.
Speaker 2 (36:52):
There's nothing you can do.
Speaker 3 (36:53):
Just — it's like that scene in Hitchhiker's Guide to the Galaxy.
You know, what should we do right now?
Speaker 2 (36:58):
Yeah, yeah. But it makes you feel better. Do you
think this is the — Michael Burry, for people who
are listening who don't know, that's the guy from The
Big Short, has been commenting a lot on this online lately.
One of his sort of cryptic tweets the other day was: sometimes,
you know, there are moments where you just see
it coming, and there's nothing you can do other than
not play the game. Yeah, yeah. And it seems like
(37:20):
he was referencing this. Yeah. I think so.
Speaker 3 (37:23):
And I mean, I think in a lot of ways
this current moment has, you know, echoes of some of
the stuff that he did well during
that particular crisis, in terms of recognizing this sort of
flywheel of debt creation. Because we haven't talked about this
idea that one of the reasons why things are spiraling
out of control is because some of the people who
are providing the capital don't give a rat's ass about
what's going on in AI. They just like the idea
(37:44):
that a really good credit, like Google or Amazon
or whoever else, is going to pay money for renting access
to a data center. And then at the other end,
I can take that income stream and repackage it and
sell it. Whether there's AI going on in the middle
or hide and seek, I don't care.
Speaker 2 (38:01):
It doesn't make any difference. And that's so good.
Speaker 3 (38:03):
So that capital just keeps going around and round and
round, because you're perfectly happy to just keep repackaging it,
and it doesn't matter what's going on.
Speaker 2 (38:08):
So from their standpoint, and that's the pattern he's recognized.
Speaker 3 (38:11):
That's the pattern he's recognized because from their standpoint, data
centers are basically just apartment buildings and a bunch of tenants.
They're all paying rent. Some of these tenants are really good.
I want more of that action because then I can
take the income stream that's produced and repackage and.
Speaker 2 (38:25):
Re-slice it into different pieces, resell it and resell it.
Speaker 3 (38:28):
And so this is the mistake the techies make
all the time: they get hung up on the
idea of, well, it's AI. No — that has nothing to
do with the credit side of this. As I said,
it could literally be, like, you know, securitized rented hide-and-seek
competitions. It doesn't really matter. And that's why
you're going to have so many more data centers created —
because on the other side, they love the
income stream, in particular given that they're like, well, this
(38:50):
can't fail because look it's Google making payments on this
or Meta or whoever.
Speaker 1 (39:18):
Well, given that my prediction was correct — when I said
the words data center to Paul, we got this response —
I think we should come to
a close with this podcast by saying something to Dick
Costolo that will equally get him upset. It is not
duck ragu. It's not data centers.
Speaker 2 (39:36):
It is — I'm not making a duck ragu. For God's sake, it's a rabbit.
One's a bird and one's a mammal.
Speaker 1 (39:42):
It's not — sorry — rabbit ragu. It is — are you ready,
Paul — travel and TikTok.
Speaker 2 (39:50):
Yeah, So it's really bad.
Speaker 1 (39:53):
It's so fucking terrible.
Speaker 2 (39:55):
So I was I was in uh Florence, Italy on
a cooking trip and.
Speaker 1 (40:02):
To make duck ragu or rabbit ragu.
Speaker 2 (40:04):
You know what, I'm just gonna pretend you're not here.
Speaker 3 (40:06):
For a little while. You're close enough to hit him.
Speaker 2 (40:08):
Yeah, I know. And there are two things that I observed.
One is the classic, uh, spots — you know, Michelangelo's David,
the Duomo. They're like — the lines are insurmountable.
I mean, the lines, the lines at this, near
(40:32):
the Duomo in the center of Florence — I mean, I'm guessing
three, three and a half hours. They're hundreds of
yards long, winding around, and, like, you can't
fit an infinite — you know, there are only so many
people that they're gonna allow in the church at a time,
and those people want to hang out in there and
take pictures and post their pictures. And you're thinking, like, okay,
(40:54):
you're you're in this town for some period of time
and you're going to spend four hours of it. Then
the second order effects of that are horrible. Second order
effects include, one, the entire piazza is now tchotchke salespeople
who are selling little trinkets, and, you know, the beautiful
little cafes and everything are being replaced by
(41:16):
A, the tchotchke shops, and B, the places that just
all serve the same pasta, because I have to go
to Florence and get, you know, carbonara or whatever. The
number two bad derivative consequence of these horrible —
you know — when I talk about the other thing that
happens, on TikTok and Instagram, of course, it is, you know,
(41:39):
the influencer goes, like, these are the top ten restaurants
in Florence, Italy. We're obsessed. We went — you know, number three, Cistanza,
we're obsessed with the butter chicken. So you go to Cistanza,
which is this great little trattoria on Gobi Street, and it
used to be like, you know, go there and walk in,
(42:00):
maybe get a table, maybe wait a little bit. And
now it's like you have a reservation or you don't
sit, because it's, you know, number three on the influencer's list,
and it's almost all Americans, and they
all only order the butter chicken. So you go back
(42:22):
in the back of the restaurant and talk to the
guy who owns the place, and he's like, I go
through five and a half kilos of butter a day.
I'm packed right now, which is great — people only order the
butter chicken. And there's going to come — and by the way,
all the locals now can't come here anymore because
(42:43):
it's booked way up in advance, and so they don't
even think about it. There's obviously going to come a
point — a week, a month, a year, whenever from now —
when people stop going to Cistanza for the butter chicken,
and the place will be empty, plates will be empty,
and, you know, he'll be like, I've got to
basically start over. I have this great little —
Speaker 1 (43:02):
Does the owner have any, like — is he
annoyed by it? Yeah, but —
Speaker 2 (43:07):
He's also like, what am I gonna do? Yeah, yeah —
what, not make my chicken? Yeah, I'm not — he's like,
I'm packed every night.
Speaker 3 (43:18):
Yeah. But there's a — can I abstract up a level, please?
There's a tremendous book that —
Speaker 1 (43:25):
You can't say the word data center though.
Speaker 3 (43:27):
So in terms of data centers. Actually this is connected
with data centers.
Speaker 2 (43:32):
What I was.
Speaker 3 (43:34):
There's a tremendous book from like eighty eight or eighty
nine by a guy named Steve Strogatz who's a mathematician
at Cornell, which is about this idea of the spontaneous
emergence of synchrony in complex systems. So the idea that,
for example, like, butterflies in Georgia — not butterflies, sorry,
fireflies — initially start off very chaotic. One
will start off signaling and it's very chaotic, but the
(43:55):
system rapidly converges until the entire group of fireflies is
all pulsing in unison, right? You see this
in natural systems all the time. And there are certain characteristics
of systems where that tends to happen, but among them
is this idea that everyone can observe each other. Right —
all the fireflies see what the other fireflies are doing
and spontaneously change to be in sync with each other.
In human systems, for a long time that was very
(44:16):
hard to do because we couldn't observe each other at
large scale. But now with social media, we can observe
each other at large scale. So guess what happens. Much
like the fireflies pulsing in the forest.
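A minimal, Kuramoto-style sketch of the firefly synchrony Paul is describing — arbitrary parameters, intended only to show how mutual observation drives oscillators into lockstep, not a reproduction of Strogatz's model:

```python
# Illustrative only: Kuramoto-style coupled oscillators. Each "firefly" nudges its
# phase toward the group's average flash timing; coherence near 1.0 means lockstep.

import math
import random

N = 50                # fireflies
K = 1.5               # coupling strength: how strongly each one adjusts to the group
dt = 0.05
phases = [random.uniform(0, 2 * math.pi) for _ in range(N)]    # start out of sync
freqs = [1.0 + random.uniform(-0.05, 0.05) for _ in range(N)]  # nearly identical rhythms

def coherence(ph):
    """1.0 means everyone flashes together; near 0 means chaos."""
    re = sum(math.cos(p) for p in ph) / len(ph)
    im = sum(math.sin(p) for p in ph) / len(ph)
    return math.hypot(re, im)

for step in range(2000):
    mean_x = sum(math.cos(p) for p in phases) / N
    mean_y = sum(math.sin(p) for p in phases) / N
    mean_phase = math.atan2(mean_y, mean_x)
    r = math.hypot(mean_x, mean_y)
    # Each firefly can "see" the group and drifts toward its average timing.
    phases = [p + dt * (w + K * r * math.sin(mean_phase - p)) for p, w in zip(phases, freqs)]

print(f"Coherence after simulation: {coherence(phases):.2f}")  # typically close to 1.0
```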
Speaker 2 (44:26):
We all ordered the butter chicken.
Speaker 3 (44:27):
We all order the butter chicken. So this synchrony emerges
in these systems in a way that never happened before, and it becomes
even more —
Speaker 2 (44:36):
Organized.
Speaker 3 (44:36):
So you'll have these point masses of people converge on
a single plaza or on a single restaurant. But the
difference isn't just the existence of influencers — it's that
these systems make possible something that was never possible before,
because now we can all observe each other in real time,
even if we don't want to. Humans are herding creatures.
They like to see what each other is doing and
(44:57):
kind of fit in with the tribe, and so they
all organize themselves in a way that causes exactly this.
And it really goes back to, like, Strogatz and
fireflies and all this stuff, and, like, you can
only see it getting more dramatic with herding.
Speaker 2 (45:08):
It'll be interesting to see what the reaction is from
towns that become fed up with it, where it's
just — I mean, there are certain streets in central Florence
you just can't walk through. This is in October; it's
not even high summer. You just can't — it's
literally playing Mario Kart just trying to get out of
the neighborhood. And you wonder if there'll be, like, hey,
(45:31):
we don't — you know, restaurants, they're like, we don't allow —
like, no phones in the restaurant. I mean, you
have to imagine they'll start to —
Speaker 3 (45:38):
They'll, like, frisk you to see whether you have
Instagram on your phone.
Speaker 1 (45:41):
It's like — dude, there was that pushback,
probably ten years ago, where
coffee shops were like, no computers, you know,
no Wi-Fi, whatever it is. And there
has to be, at some point — because I've seen it, it's horrific.
Speaker 3 (45:55):
I think it actually gets even more dramatic because it
extends across more systems, like you see it in financial markets, right,
you see people synchronize because I can now observe other
people's trades in Robinhood, I get a real-time
sense of what's going on. That's the exact same phenomenon
you get with prediction markets.
Speaker 2 (46:09):
Right, all of a.
Speaker 3 (46:10):
sudden, I can see what other people think is going
to happen and synchronize to it. So you see it on Instagram.
Speaker 2 (46:15):
By the way, speaking of — it's always funny
to me when people are shocked, shocked, to learn that
some of the actual performers in the sport are gambling.
Like, meanwhile, you're watching the sport on TV and every
ad is parlay this right now and blah blah blah,
if you want to bet on the second half. It's
just like begging people to do it. Yeah, of
(46:37):
course they're gonna. Of course you're going to be
like, what's my edge going to be
in this particular market? I'm going to get one of
the players or coaches or so-and-so — I'll find somebody.
Speaker 3 (46:45):
Yeah, it's the same story going back to, like, early boxing and
so on. But this — the synchrony thing — I
think is incredibly important.
Speaker 1 (46:51):
No, it really is.
Speaker 3 (46:52):
And it is — people don't understand the mechanics,
so they don't realize, in a sense, how they're being
led into it via these top-ten lists, which in
a sense is kind of like observing the other fireflies.
Speaker 1 (47:03):
This is the last couple of questions and then we'll
wrap up. But if there's anyone still listening — I don't know,
maybe an AI, you know, listening from the
data center — there was this study that was done recently,
like the last couple of weeks, where they gave all
these different AIs ten thousand dollars to trade and they
all lost, like, nine hundred — and, you know, all of
it, pretty much. Is that because of the same thing?
Speaker 2 (47:24):
They blew it all on liquor and horses.
Speaker 3 (47:28):
I did see that, which should come as no surprise.
They were trained on Reddit.
Speaker 1 (47:35):
It all fits. All right, do you have any last things,
Dick, before we wrap?
Speaker 3 (47:39):
This, though — like, even the stuff on the market — I'm just
Speaker 2 (47:42):
Gonna sit here and keep talking.
Speaker 1 (47:43):
Can we go to lunch now, please? The stuff on the markets —
Speaker 3 (47:47):
But even the stuff on the markets, like, I just
think it comes back to people misunderstanding like how these
models are trained, right? I mean, I joke
all the time that basically, instead of saying
"a large language model told me X," you should just
say "a thirty-seven-year-old dude on Reddit
told me X." Yeah, right. I mean —
Speaker 1 (48:01):
That's basically OpenAI's training.
Speaker 3 (48:04):
That's the median influence. So again, now thinking about
it in stock market terms: if some thirty-seven-year-old
dude on Reddit told me that I should be
long whatever, would it surprise me to find out that
trade doesn't work out?
Speaker 1 (48:16):
If the other fireflies are blinking, I'm blinking too. Yeah. But —
Speaker 3 (48:19):
I just think this idea of how
badly people misunderstand this thing that they've committed so much
time and energy to, and then they're surprised at the outcome,
is really, I think, important, because it shows you
why it's going to continue to synchronize behaviors and do
all these other things that people will be completely surprised
by because it's leading you to the median outcome all
the time.
Speaker 2 (48:37):
And people anoint it with more capacity than it deserves
because it's conversational, and so they think, like, obviously it's
thinking about this because it's having this conversational tone with me.
It's not just reposting: thirty-seven-year-old guy on
Reddit says these are the top five movies of the year.
Speaker 3 (48:57):
You know, it feels like someone — I know it likes
me because it will tell me things like, that
was a really sharp question. Like —
Speaker 1 (49:04):
I hate when it tells me anything good. I'm like,
tell me bad things — that's all I want
to know, again.
Speaker 2 (49:10):
You can pay extra for that.
Speaker 1 (49:14):
Well, this has been a fascinating episode. Uh, the Nick,
Dick and Paul Show will be coming live to a bunch of
data centers in the new year. And that concludes this episode.
Thanks, guys.