
May 8, 2025 92 mins

Tom Bilyeu is a billion-dollar brand builder, co-founder of Quest Nutrition, and the visionary behind Impact Theory—one of the world’s most influential personal development platforms. With over 4 million YouTube subscribers and decades of real-world experience scaling companies and mastering mindset, Tom is on a mission to help people thrive in a rapidly changing world. In this episode, we dive deep into the future of work, AI disruption, and what separates the 2% who adapt from the 98% who get left behind. Tom breaks down the truth about automation, the urgency of building mental frameworks, and why emotional control is the real superpower in chaotic times. We also explore the rise of billion-dollar AI startups built by teams of just 2–3 people, and why the next 24 months may determine your financial future. If you want to survive the shift and thrive in the age of disruption, this is the conversation you can't afford to miss.

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Every time a new technology comes into place, it ends
up creating more jobs, not less, even though it always
seems like it's about to put everybody out of work.
And then it just opens up all these new avenues
and the people with the talents flood in.

Speaker 2 (00:11):
Okay, so more jobs, yes, but are the people
able to go into those jobs? No.

Speaker 1 (00:15):
So this is where I want people to remember. Two
percent of people can make that kind of change. Ninety
eight percent.

Speaker 2 (00:20):
Trump's tariffs, markets crashing, AI's takeover, and the rise
of bitcoin. Now there's one hidden force behind all three
that could reshape the economy, and most investors aren't ready
for it. And if you get this one thing wrong,
you could be holding the wrong assets, working in the
wrong job, and.

Speaker 3 (00:36):
Watching your wealth disappear.

Speaker 2 (00:37):
But if you get it right, you have the chance
to be miles ahead of everyone. Today, I am sitting
down with Tom Bilyeu. Tom's built a billion-dollar brand,
interviewed over six hundred top performers. He's a leading voice
in the world through Impact Theory, his top point zero
one percent podcast, billion view media company, and Tom teaches
people how to master mindset, strategy, and adaptability, which are

(01:00):
essential skills in the world that we're going into. So
in this video, we're going to discuss Trump's tariffs, we're
gonna talk about the markets crashing, we're gonna talk about
AI's takeover and what that means over the next couple
of years, and the rise of bitcoin. Of course, what
he thinks you're gonna need to know to survive and
thrive over the next three to five years.

Speaker 3 (01:18):
So let's just jump right in with Tom.

Speaker 2 (01:21):
So, what are some mindset shifts that you
could give to people that might be useful for them today?
The ultimate framework is really simple.

Speaker 3 (01:29):
You at all times need to know.

Speaker 1 (01:32):
This is the big one that people don't understand about themselves,
which is that they trust their emotions.

Speaker 3 (01:37):
And they shouldn't.

Speaker 1 (01:38):
If people just run that all day, every day, they
can have all the success that their emotional stability, their drive,
and their level of intellect will allow. The big thing
I would think people should look at right now if
you really want to understand part of the game they're playing,
whether this is going to work because of how much
of the debt has to be refinanced: you've got to
get the yield on the ten-year Treasury down. We are in a cold war with very

(02:01):
high potential for being a kinetic war with China. They
can't control.

Speaker 2 (02:05):
Your manufacturing, so that has to change. What do you
see AI agents doing to disrupt sort of the world,
and at what speed? Is it less than twenty four months away
that you're seeing this kind of blinding speed?

Speaker 1 (02:19):
You need to be at the edge of it figuring
out how do I make this work. There's going to
be many billion dollar companies made by one, two or
three people. They get together, they deploy full AI orgs
all of a sudden, all of those employees that you
would have historically had just go away.

Speaker 2 (02:39):
All right, Tom. In one of your latest videos, you've
been talking a lot about tariffs and Trump, and you
called the tariff plan a global financial earthquake. Obviously, we've seen
markets crashing; it seems like it's heating up between the US
and China.

Speaker 3 (02:53):
A lot of people are on edge about this. Now.

Speaker 2 (02:56):
You've been interviewing a lot of people, macro
thinkers that I know, like Raoul Pal and Ray Dalio.

Speaker 3 (03:02):
Talking about the economic chaos that's happening.

Speaker 2 (03:05):
So what do you think investors should be paying attention
to or doing right now?

Speaker 1 (03:11):
That there's absolutely no way they know what's happening.
And boy do I warn people against being day traders,
so I would have people asking one question, what do
you do in times of uncertainty? At this point, it's
probably too late, to be honest, for them to make
any sort of flight to safety, but I certainly could

(03:33):
have seen somebody going, Huh, this is going to bring
a lot of uncertainty. Markets hate uncertainty, so I'm going
to move into T bills or something along those lines
to avoid it. But honestly, from where I'm sitting, I'm
always taking the perspective that the odds of me beating
the market are effectively zero, and so I'm going to
play the old adage of time in the market versus

(03:55):
timing the market. Nothing about my thesis about where
certainly American companies go has changed, So I've already got
and I've watched your videos on Ray Dalio, and I
know his historical records. But the reason that I still
buy into that strategy is because I'm partly my age,
so I still have a long time in the market,

(04:17):
so I'm not super worried about, you know, a ten
year period where it gets a little flat. I am
way in a protective stance, so all of my high
risk, risk-on stuff is all going to be in
the companies I'm building. So I would advise people in
a time of massive economic uncertainty unless you really feel
like you have some insider information — not insider trading, but like

(04:40):
you really understand that market just obscenely well — yeah, I
would be going somewhere that is going to ride the turbulence well,
so for me, it's my thesis on bitcoin has remained
the same, So I didn't touch anything there. My thesis
on the market over the long run that I'm invested
for is the same. So I've just left everything there,

(05:01):
and I'm already so defensive and have so much in
government debt that I left that there. But for me,
most of that stuff is really short term because I'm
building my own company. So my big concern is always
having your capital tied up somewhere. So right now, I
would tell people to make sure that you can move
at a moment's notice, that you've got enough capital that

(05:22):
you can deploy that if something really gnarly happens, like
if we go into a proper global recession, you want
to have enough access to capital that you don't have
to worry about your lifestyle for years and years and
years and you want to be able to if you
see like a real opportunity, like, for instance, if AI
and robotics just absolutely got hammered, I'd want to be
able to get into those in a super broad way,

(05:45):
very diversified across the sector, because I have no idea
who's going to win. But my thesis about the future
is so aggressively clear.

Speaker 3 (05:53):
I could be wrong, obviously.

Speaker 1 (05:54):
But it's so clear to me what the next call
it five to seven years is going to look like
that if I saw that segment depressed, then I'd be
pretty eager to get in at a discount.

Speaker 2 (06:07):
I love AI and bitcoin, and we are going to
dive into those two topics for sure, because we're gonna
have a big conversation around that and what the next
five to seven years looks like for sure. But what
I heard you saying was number one for your current position,
So you famously sold Quest Nutrition for whatever
billion dollar deal that was, and so you're in more

(06:28):
of a defensive state for sure. However, I would add
on to that: regardless of what state you're in,
even if you're still trying to make your money, I
would listen to what you said, which is you're focused
on your businesses because that's where you make your money,
and then you're trying to defend and keep the capital
that you have. And so I think people that are
thinking about the short term, trying to trade that to
make more money, you're probably gonna end up on the

(06:49):
wrong side of the timing.

Speaker 1 (06:50):
Yeah, And I mean, look, you know this better than
I do, but I would be so curious. I've heard
a data point that rings so true to me, which
is that something — and this is for sure true in
the crypto world — five percent of the wallets make ninety
five percent of the gains. So it's like there are
people that are in those WhatsApp and Signal groups. They're
betting on culture. They know how much they can influence

(07:13):
said culture, and they win and basically everybody else loses.
I'm just not into that. That game, to me, is so crazy.
So I get that I'm a very particular flavor of person
who talks about the market, but all the betting, all
the gambling, that stuff, to me is like a major turnoff.

Speaker 3 (07:30):
Yeah.

Speaker 2 (07:31):
Yeah, And unfortunately that's what the crypto space has turned into.
I mean, meme stocks are jokes, right, and so it's
turned into a crypto circus, you know, at this point — literally, yes,
literally jokes. I aggressively agree with that. Yeah, sad story: somebody
who was in one of my training programs — I'd consider
him a friend.

Speaker 3 (07:48):
Now.

Speaker 2 (07:48):
He had sort of lost a lot in the crypto space,
decided he was going to go all in bitcoin only
from here on out, built up a pretty big personal brand,
made a lot of money, did pretty good, had about
six million dollars in bitcoin. He's young, in his late twenties.
The crypto thing kind of called again. He started pumping.
He started a crypto trading group. Uh huh. He made

(08:10):
a lot of money: twelve-thousand-dollar memberships, of which he
sold like two hundred. He made a couple of million
bucks selling memberships, but they all lost a lot of money.

Speaker 3 (08:17):
So then he started a meme trading group. Oh god.
And it was about three weeks ago.

Speaker 2 (08:22):
He messaged me, sent me a text message and he said, Hey, Mark,
can we talk? I messed up really bad and I
need to talk because I have this famous story of
kind of losing everything in two thousand and eight.

Speaker 3 (08:32):
And uh.

Speaker 2 (08:32):
I got on the phone with him and he's like, man,
I chased the memes all the way to the bottom
and I lost all my money, six million US. And
not only that, but everybody that was paying him to
be in that meme trading group lost as well.

Speaker 3 (08:45):
So anyway, I.

Speaker 2 (08:47):
Want to dive deeper into tariffs. So I do want
to get into bitcoin AI. We're gonna have a lot
of fun with that. But if we talk about tariffs,
like I said, you've been all over it. It's the talk
of the talk of the town, if you will, all
over the news. But you talked about like Trump's tariff strategy.
Now a lot of people actually let me start with this.
You said that, I think on your intro you said
that nobody has any idea what's going on or what's

(09:10):
going to happen.

Speaker 1 (09:10):
Yeah, I don't think anybody can accurately predict how this
is going to play out.

Speaker 3 (09:13):
People have theses, and that's good.

Speaker 2 (09:15):
So certainly, in a complex system, the law of unintended consequences
is going to kick in. And so you start trying
to play with a couple pieces of a complex system,
and we have multiple systems together, certainly the unintended consequences
will win. But the narrative seems to be with mainstream
media that Trump and his team are a bunch of
buffoons and have no idea what they're doing.

Speaker 1 (09:36):
Yeah, yeah, I mean listen, at the end of this
half the world's going to look like morons and the
other half is going to have been right to some degree,
but I don't think anybody right now should have the
degree of confidence that I see people putting forward. Trump
really — the right way to think of him is
he is the CEO of America, and he's going

(09:58):
for Mount Rushmore, whether it's going to work or not.
I legitimately have no idea. I can articulate a very kind,
generous read on what he's doing, and I pay a
lot more attention to Scott Bessent and Lutnick than I
do to Trump. Trump is a chaos agent. Maybe the
chaos is exactly what the system needs right now. But
if you really want to understand again whether they're right

(10:21):
or not, they have a very easy to follow string
of logic, and it just becomes a question — to what
you were saying — cool, your first-order consequences, yeah,
there's a chance that those will play out just like
you're saying. The bad news is this is an insanely
complicated system where egos of other nations are getting involved.

(10:42):
This is like a huge gamble you're playing in with
the setup of the Thucydides trap, so it's you versus China. You're
fighting for the whole world at a time where Bretton
Woods is dead, so you've got to come up with
a whole new monetary system. And Trump has, like, pulled
the pin out of the grenade and tossed it and
has gone, Okay, this is going to break the mold,
and I've got to get everybody out of that mold

(11:03):
so that I can move them into something new. And
I'm perfectly willing to believe that they have a vision
for what that something new is; they've articulated it. There's
probably the part they're saying, and then there's my mind
read about the part they're not saying, which we'll see
if I end up being right about it. And for
people that don't know me, I am not emotionally invested
in being right. I am only interested — the only way

(11:25):
to make sense of all the words coming out of
my mouth is that I am trying to accurately map the
way the world works and so where it's going. Yeah, well,
so if you know how the world works, then you
have a sense where it's going. If you do not
understand how the world actually works, you're never going to
be able to figure out where this is going. I
try to follow everything from a cause and effects standpoint,

(11:45):
so I will routinely have people interface with me as
if I'm saying Trump is right, he has to do this,
this is wonderful, it's going to work. I'm not at
all saying that. I'm saying this is the logic of
what they're laying out. There might be a hidden layer,
and if these things are accurate, it should play out
like this. And now, like everybody, I'm just gonna watch

(12:06):
and see if my mental map is accurate. I've already
explained what I'm actually doing with my money, so people
don't need to guess about what bags I'm pumping or
anything like that. I'm just I'm trying to mentally map
the world and see if I end up being accurate
not so I can feel good about being right, but so
I can say, cool, this map of the world at
this moment had a high degree of predictive validity.

Speaker 3 (12:26):
That's it. That's all I'm going for.

Speaker 1 (12:28):
So yeah, I think that this system is so complicated.
They're running what I call the shoot me in the
ear bro strategy, where it's like — oh my god.
So when Trump first got shot in the ear, somebody
who I know and respect said he probably just faked
it and he was just doing it for attention, and

(12:50):
I was like, hold on, you actually think that he
said to somebody, go ahead, shoot this off my head?

Speaker 3 (12:56):
Sure, graze my ear.

Speaker 1 (13:00):
That's so dangerous, there's no way. Yeah, this feels like that.
It's so high risk what he's doing. If he pulls
it off and this ends well, and we completely reorganize
the world order in America's favor, and we now get
to take advantage of being the world's either first or
second largest economy, depending on who you speak to, and

(13:22):
we get all that benefit resounding to the US. We
get out of all of the problems that we have
with China controlling all of our manufacturing.

Speaker 3 (13:31):
That is crazy.

Speaker 1 (13:32):
We are in a cold war with very high potential
for being a kinetic war with China. They can't control
your manufacturing, so that has to change. But the way
that he's going about it with the just hyper aggressive,
ultra fast.

Speaker 3 (13:49):
Is shoot me in the ear.

Speaker 1 (13:50):
And so if you nick him and he just bleeds a
little and is like, the world's trying to kill me,
and now everybody rallies around him — amazing. But an inch
in either direction and he's dead.

Speaker 3 (14:01):
Yeah.

Speaker 1 (14:01):
So this is when I really go prognostication mode. I'm like,
it starts working, but not in time for the midterms.
He loses the midterms, it flips, they no longer control
the House, he becomes lame duck, and Republicans.

Speaker 3 (14:15):
Are out in four years.

Speaker 1 (14:16):
That's like, if I were gonna say, fifty one forty nine,
Because again, I really do feel like I'm guessing it's
just such a complicated system that would.

Speaker 3 (14:24):
Be I would give the edge to that.

Speaker 2 (14:27):
It just won't work in time, even if ultimately it
would have. One thing that I've seen: so the last
time we had this big of a drawdown in the
markets was in October twenty twenty two,
when Biden was president, and nobody was saying it was
all Biden's fault. Granted he didn't go throw a grenade
to your point or the illustration that he gave. But
today this morning in the gym, I saw in the news, right,

(14:47):
it's just like Trump Trump, Trump, Trump, Trump, and it's
all his fault.

Speaker 3 (14:50):
Yeah, and a lot of it is heated.

Speaker 2 (14:52):
He did move by executive order, so a lot of it
is sort of on him. But if they're going to
assign this whole crash to him, if he pulls it off,
they're gonna have to give him all the credit for it, and
they won't.

Speaker 3 (15:01):
So brace for that, and they won't.

Speaker 2 (15:03):
But I want to go back you said that the
risk is so high. I want to put a pin
in that and come back to that. But just going
back to we do have no idea of the future.
Tomorrow is not guaranteed for any of us. Certainly, it's
a complex system. But I want to talk about the
competency of the people. So I mean, I saw I've

(15:24):
seen many interviews of Howard Lutnick. I don't know if
you saw the one with him on the All-In Pod. Yeah,
I mean, very smart, very capable, the way that he's
looking at things, like he said, like you hear about
the debt of the country thirty six trillion, you're about
the deficit of the country. You hear about the income
of the expense of the country. But as a business owner,

(15:47):
there's a number that's glaringly missing out of that:
what about the balance sheet? And just hearing
an approach like that from a business mind, number one.
But then you have Bessent, and you said you listen
to him quite a bit — a pretty good interview on
Tucker recently.

Speaker 3 (16:02):
And I see a lot.

Speaker 2 (16:05):
Of people on Twitter and YouTube, people that I respect,
people that I know, and people that I align with
their worldview, and they're just saying, Oh, this guy's making
all these mistakes. He didn't understand this. He didn't understand that.
It's like, yeah, you're not smarter than Scott Bessent. So
Scott Bessent's pedigree, right: he worked for the GOAT, the
greatest of all time, Druckenmiller, and the second greatest
of all time, Soros.

Speaker 3 (16:26):
But specifically what he did.

Speaker 2 (16:28):
Working for those guys, right, specifically, what they did was
take down countries and manipulate currencies.

Speaker 3 (16:36):
That's what they did, dude.

Speaker 1 (16:37):
Take down the Bank of England. The Bank of England,
not a rando, like, small country that's hyperinflating their currency.

Speaker 2 (16:45):
The Bank of England. It's pretty crazy, the Bank of England.
So basically he was hired to do the opposite of that.
So understanding what makes the Bank of England vulnerable, what
makes other — they've taken down many other countries since the
Bank of England — what makes those countries vulnerable: being over
leveraged, being offsides. And now he's been hired,

(17:07):
as he said on Tucker, like to sell dollars around
the world, right? So he understands that part. So what
could we do to do the opposite of that? But
more specifically, I.

Speaker 3 (17:17):
Mean, you took down the Bank of England.

Speaker 2 (17:18):
But China is a pretty big game, if we
would call it that, right, So understanding what makes the
US dangerous and how can we de-risk the US? But
then specifically, what are China's leverage points? And and you know,
he said on Tucker a couple of times that he
sees China inter recession depression. Maybe we can box them in.
We have no idea, But to say that he doesn't

(17:42):
know what he's doing would be I think a mistake.

Speaker 3 (17:46):
Aggressively agree.

Speaker 1 (17:48):
Yeah, I mean, listen, I think that if you were
going to have anybody run the playbook that they're running,
you've got some of the greatest capital allocators on planet
Earth who have made unimaginable amounts of money. By saying
I understand how global economics works, and I'm going to
bet hugely that I'm accurate. And they've won and so

(18:10):
over and over and over. So many of these guys
that I look at that are just like the world's
greatest capital allocators, I'm like, huh, They're all like this
could work.

Speaker 3 (18:18):
Listen, take Ray Dalio.

Speaker 1 (18:20):
I think you and I may see him a little
bit differently, but nonetheless, Ray has made just unimaginable amounts
of money doing this game. Has this crazy historical perspective,
and he's very much in my camp, which is, you
have to do something because of what's going on with China,
you have to do something because of the debt, Like

(18:41):
these things cannot stand. So given that — given that
inaction is not an option, right, and that's what I
want to ask — then it's like, what do you do? Now again,
if you listen to Trump, it's like, that's way too crazy.
I don't want to do that. And maybe there is
some of that. Maybe he's doing way too much Art
of the Deal all that stuff, and he's pissing off
all of our allies and they're going to be very
necessary for us to weather this storm well. But when

(19:05):
you listen to Lutnick and Bessent, it's like, okay, I
at least understand what you guys are trying to do. Yeah,
so some of the early signs are there that again,
if you can graze the ear and not.

Speaker 2 (19:16):
Shoot the patient in the head, it could work.

Speaker 3 (19:22):
But we'll see.

Speaker 1 (19:23):
And to be very specific, the big thing I would
think people should look at right now, if you really
want to understand part of the game they're playing and whether
this is going to work: because of how much of
the debt has to be refinanced, you've got to get
the yield on the ten-year Treasury down, and they're doing
it now. The question is, like, the method by
which they're doing it. That's a different question. But they're

(19:44):
already driving it down.

Speaker 2 (19:45):
They could keep loading the front end like Janet Yellen did.
And just for comparison, Janet Yellen was an academic, a
Keynesian academic, an MMTer — an academic
like this, MMT, freaks me out, right? So she's an MMT academic. Again, back to Bessent:
Bessent literally ran the playbook for the two greatest investors

(20:06):
of all time. And, as you said, were successful. Stanley
Druckenmiller is the GOAT because — thirty years without a loss,
it's crazy. Without a loss, crazy, it's crazy. But back
to the risk. So the risk is really high, to
your point — do you graze my ear or take my head off?
But to your point, inaction really isn't an option
either. And Bessent sort of made that statement. So he said, like,

(20:26):
the top ten percent owns eighty eight percent of equities,
eighty eight percent, forty percent owns twelve percent of the market,
and the bottom fifty are in debt, so it's not
working for them, and so like, we have this massive
risk if you don't do something about the debt, all these things.
So I think that's interesting. I think at the time

(20:47):
of this recording, which is the seventh, the latest I
saw already eighty countries have already come together saying they
want to strike a deal. That's what the latest I
saw today, even the EU now, so they're ready to
strike a deal and go down to zero. So
it seems like it's coming together pretty quickly. Probably not
for China.

Speaker 1 (21:08):
Yeah, China's making very different noises with their face, so
we'll see.

Speaker 3 (21:13):
How it plays out.

Speaker 1 (21:14):
But yeah, I mean there, look, how much of this
is hearsay, how much of it's real? I have
not seen this coming from China directly, but I've heard
about it indirectly that they're prepared to.

Speaker 3 (21:24):
Go all the way.

Speaker 1 (21:25):
Yeah, where it's like, okay, you guys don't want to
play ball. Not only are we going to do retaliatory
strikes with tariffs, but we're now going to openly sell
your IP. Again, I don't know if that's true. I want
to be very clear. I am unfortunately right now spreading rumors.
But that is certainly a weapon that they have in
their toolkit that a lot of other countries don't. The

(21:45):
thing that they've gotten extremely good at. And this is
not derogatory. They've gotten very good at going cool. That's
how it's done. Now we're going to show you how
to do it at scale, hyper efficiently. And so they've
gotten just ridiculously good at that. And so they do
have — and this goes back to this whole idea
of choke points. They have a choke point for us

(22:06):
on manufacturing. They have because of that, they have a
choke point on us from an IP perspective, and.

Speaker 3 (22:14):
You can't let those play out negatively.

Speaker 1 (22:15):
And this is exactly how things escalate. This is how
you go from a cold war to something that threatens
to be kinetic really fast, because if.

Speaker 2 (22:24):
They do that with our IP we have to get aggressive.

Speaker 3 (22:27):
Yeah.

Speaker 2 (22:28):
So I mean, there's Taiwan on the table, you know,
potentially taking Taiwan. There's dumping US Treasuries, which is another
option; it would force QE. So there's a lot of
things that China could do. But I did see today
also that Trump said that I think by the tenth
that he's going to increase their tariffs another fifty percent, yep. Like,

(22:48):
so yeah.

Speaker 1 (22:48):
It could escalate, but that's retaliation for the retaliation, right,
So we'll see how that all plays out. But China
is not being conciliatory at all, and so this is
the part that worries me. So you have the two
players on the board that matter. You have the US
and China that are being escalatory with each other, and
now they're going to be fighting for allies. And so

(23:09):
you hear Elon making noises about, hey, we should basically
have free trade with Europe. We should be able to
go back and forth. I mean as if America and
the EU were all part of the European Union and
let anybody go where anybody wants. So I get where
he's coming from, and he sees America and Europe as
like a super-ally, because then we really — if we
were truly in lockstep with Europe, which we are aggressively not,

(23:32):
but if we were truly in lockstep with Europe, now
we've got enough economic might that we could really stand
up to China economically.

Speaker 2 (23:41):
But we'll see, yeah, lines will be drawn.

Speaker 3 (23:45):
I think.

Speaker 2 (23:46):
I think that's looking only at one component of the board.
If you look at other components of the board. So
it's not just the tariffs, right, it's making
America easier to invest in, so strengthening the dollar and
the Treasury asset, number one. If Bessent thinks
that China's boxed in, and potentially the yuan is going
to have to be debased in order to keep up

(24:07):
with this, China is less investible and a lot of
that comes back to the US. But also when you're
adding the massive tax cuts, the spending cuts, and then
the massive deregulation, I think, and then sort
of the drill baby drill energy dominance narrative, those are
other pieces of the board that I think weigh in.

(24:28):
But we'll see — we're just speculating, you know,
at this point; that's all we can do. That's
the fun, right, playing that.

Speaker 3 (24:35):
But I think.

Speaker 2 (24:39):
History books are going to talk about this period
of time, that is for sure.

Speaker 3 (24:43):
So we get the benefit of living through that.

Speaker 1 (24:46):
Is it the benefit? I'd really like to learn from it,
that is for sure. So moving forward I will be
armed with a lot more information. But yeah, there is
a supposed Chinese curse. May you live in interesting times,
and we live in interesting times, and humans don't like

(25:07):
change. In the volatility, there's certainly opportunity, but it will
also — the average person, if it goes south, the average
person just gets clobbered.

Speaker 2 (25:16):
Well, not the fifty percent of people who own nothing.

Speaker 1 (25:18):
Now, it'll be worse for them for sure, because they
have no protective mechanism. So the problem is when you
were saying that we'd have to turn to QE, the
good news is that would make the debt manageable. So you,
in my words, you steal from all Americans, which just
absolutely decimates the poor and working class. But you steal

(25:39):
from everybody by inflating the money supply, and now that
makes the debt worth far less and it becomes far
easier to pay off if you can get some economic growth.
So if you can both weaken the dollar keep it
the reserve currency, and through deregulation unlock GDP growth, you
actually can pull this off. But if you're doing it
through QE, then you're going to exacerbate the difference

(26:02):
between the rich and the poor, and that will get
ugly, and it really does become a 'let them eat
flat screens' moment, where sure, people have a lot of
cheap stuff, but that's how revolutions start. So yeah, that
is the one that I think is most immoral. But
there's a couple of thoughts on that. Like, one, Bessent talked
about on Tucker how last year, twenty twenty four —

Speaker 3 (26:24):
Summer of twenty twenty four, we had two records set.

Speaker 2 (26:26):
Number one record was more Americans visiting Europe in
the summer than ever on record, but we also had more
Americans going to food banks than ever on record at the same time.
And so that's sort of that divide if you will,
And unfortunately to your point, as the debasement continues, then
your wages buy less and less and more people go
to the food bank.

Speaker 1 (26:46):
Yeah, and I imagine your audience completely understands that. The
problem is that the average American doesn't. Yeah, so they
just do not understand how money actually works. They don't
understand the dangers of debt, they don't understand how it
could possibly be bad that the government is giving them money,
and so we end up in this death spiral where

(27:08):
they can tell something's wrong, they can tell they're getting screwed,
but they don't understand how. So this has been the
conundrum of the last three years of my life is
as I've learned about how money actually works. Because when
COVID hit, I thought, Okay, I've worked in the inner cities.

Speaker 3 (27:22):
I had a thousand employees.

Speaker 1 (27:23):
That grew up like hard, hard, hard, and when COVID
lockdowns happened, I thought, they're all going to lose their
jobs and they're just going to be toast. So I
started making financial content to help them out and to
hope that I could give them some understanding of how
they're going to weather this storm. And as I started,
because I've always been good at making money, but not
investing money, and so I wanted to learn about investing

(27:46):
so I could help them out. And as I started
learning about it, I was like, wait, a second, like
this does not work the way that I thought it did,
and realized, oh my god — the printing money. If I
was going to oversimplify my own stance, it would be: printing
money is immoral and is exactly how the rich get
rich and the poor get poor. I understand it's more
complicated than that, but when you simplify something down to

(28:08):
its essence, there's a clarity.

Speaker 2 (28:09):
Yeah — we're gonna talk about that, we're
gonna talk about AI, We're gonna talk about bitcoin, which
I consider the cheat code today. But I want to
go back to kind of what you just talked about there,
and for a minute, I want to kind of go back.
You talked about scrounging in your couch cushions to having
a billion dollar company, and so what I see,

(28:30):
kind of going back to this sort of trade war
bringing back jobs — the tale of two Americas, if you will.
It's not just that the US offshored its manufacturing base,
which it did, but those were low level jobs, and
so bringing back T shirts and sneakers to the US
for manufacturing is not an answer for that, right, So
nations have to grow their way out of it, and

(28:50):
we grew our

Speaker 3 (28:51):
Way out of it.

Speaker 2 (28:51):
Like technology takes low-level tasks, so we can work
on higher value tasks.

Speaker 3 (28:55):
But it's more about moving from.

Speaker 2 (28:57):
The industrial age to the information age. So, right, we
had those giant manufacturing facilities and everybody smart and dumb
worked equally on the assembly lines with this massive middle class.
But in the information age, now a person with a
laptop and AI.

Speaker 3 (29:11):
Can create a billion dollar business.

Speaker 2 (29:12):
I mean, we'll see that, right, And so then it's
like almost more like a meritocracy. So we have all
these people who were trained from an industrial school system
with an industrial mindset, but now they find themselves in
this information age, but they're not equipped with the mindset
or the tools in order to succeed in that world.
So they still think in terms of like just get

(29:34):
a job and that job will take care of me,
as opposed to I should learn, I should change my
mindset to learn skill sets and use those skill sets
to bring more value, to increase the amount of
money I make, because we have make money and then
we have invest money. So I'm curious, I mean, your
own mindset shift. Right, going from the couch cushions to
your point to a billion dollar company, you had to

(29:54):
learn that model. And is that something that you were trying
to teach — you said, to your workers, right — trying
to kind of teach how this works. Are you trying
to kind of give them that mindset? Okay, so I
think you have to face really two things.
I want to take a break real quick and just
say that there's only so much you can learn through videos. Yeah,
build your knowledge, build your skills, but you need to

(30:15):
build your relationships. Relationships plus skills equals money. So come
build your relationships and your knowledge at the Bitcoin Conference
May twenty seventh through twenty ninth in Las Vegas. I'm
gonna be there speaking for the fourth year in a row,
and lots of other people way bigger than me. Entertainment, politics, media, finance,
you name it, they'll be there. So come check it out.
Save some money with my code Mark Moss or I'll

(30:37):
put a link down below. If you use my code
to save some money, I'm going to do a private
meetup just for you and some of my friends. So
let me know, use that code, save some money, send
me a message and.

Speaker 3 (30:48):
We'll get you in the private meetup.

Speaker 1 (30:49):
And I hope to see you in Las Vegas. Generational
poverty is a mindset problem. It's not a money problem.
I expect that to be very inflammatory. It is every
time I say it, but nonetheless, having seen it up close,
you realize, wait a second, this guy is smarter than me,
and he's going nowhere fast. So good hardware, bad software.
The really bad news is, while it is very possible

(31:11):
for somebody to change their mindset as an adult, I
ballpark it. This is not a real number, but this
is so directionally accurate that if people form their worldview
with this, it will have high predictive validity. Only two
percent of people are capable or willing, however you want
to say it, to change their mindset once they're an adult.
So I learned pretty quickly that I'm a filtering mechanism.

(31:36):
I've given up on changing adults that are in the
ninety eight percent. Maybe there will come somebody who's better
than me and they can do it, but I've really
really tried so long before there was a camera on me,
long before I thought about YouTube any of that. I
told my employees, I will come in early, I will
stay late. I will teach you anything that I know
about entrepreneurship because I want you to be here.

Speaker 3 (31:57):
One.

Speaker 1 (31:57):
I want you to be the best person you can
be that benefits me. But I want you to stay
here not because you feel forced. I want you to
stay here because you know I care more about your
future than your own mother. So I started what I called
Quest University, and literally I taught them anything, to the
point that we had three of the people that the
sort of informal students that came through that started competitive

(32:17):
nutrition companies. So I was like, I will tell you anything, and.

Speaker 2 (32:22):
Three students started competing companies, out of how many? Oh god — thousands? No,
not thousands, but hundreds; hundreds for sure.

Speaker 1 (32:31):
And in all of that, there were also two students
that were in it, and one of them punched another
one in the face because he said, you've changed, you've
started reading. And I was just like, wow. So I'm
up against like really stupid cultural programming and the people

(32:55):
that actually did something, the two percent — some, obviously
not all, but some have gone on to start companies
that are still running today and I'll get text from
them and updates. I'm just like, this is so gratifying.
And then I remember how many people that I tried
to explain this stuff to and walk them through. So anyway,
that became the genesis of the YouTube channel, and I'm like, okay,

(33:15):
I want to impart this mindset advice and again realize
I'm the bat symbol in the sky. I throw it
up and the people for whom that already matches, they respond.
And a big part of the reason I started changing
my content is because my first few years on YouTube were
all mindset, That's all I talked about. And I just
realized that for the vast majority of people engaging with

(33:37):
that content, it was spiritual entertainment. And I'm like, I'm
already rich, so this does not make sense. This is not
interesting for me.

Speaker 3 (33:44):
What do you mean spiritual entertainment?

Speaker 1 (33:47):
I have a two hour declining arc of influence on people,
and I can make them feel like anything is possible, and they
and they.

Speaker 2 (33:54):
Feel very good, very capable.

Speaker 1 (33:57):
But then they have to come back for more, come
back for more, come back for more. Yeah. Because they're
not actually trying to gain a set of skills that
allow them to go execute in the world, in that
hyper uncertain world that we were talking about a few
minutes ago. That's not what they're looking for. What they're
looking for is the narcotic like feeling that they could
do that if they wanted to, and that feels so

(34:19):
good that they just never do anything with it, right? And
that's wholly uninteresting to me. So by temperament, I'm just like, Okay, well,
if that's how this works, then I'm not interested. So
I began reinventing my channel to get to what I
do now, which is talk about the most important things
you should be thinking about in your life, because if
you get it right, the consequences — financially, emotionally, longevity

(34:40):
of relationships, all that it's going to matter so much,
and if you get it wrong, your life is worse
on every measurable metric from wealth, health, emotional stability, progress,
which is a foundational pillar of human happiness, I mean
just literally every dimension. So that's the focus now. But okay,

(35:03):
so all of that is around just that idea of
how do you help people with a mindset thing. But
the second part that you have to face is that
some people do not have the intellectual horsepower, and no
matter what you teach them, they are forever going to
need a blue collar job. And I'll even say, maybe

(35:25):
it's not intellectual horsepower, maybe it's just temperament. Maybe it's
skill set that their skill set is so visceral that
they need to make things, build things, touch things, move things.
They need to live in the world of electrons and
move them around. And evolution for so long could not
over index on the thinkers, the intellectuals who now run
the world. But they certainly did not for millennia. So

(35:47):
I think what we're learning now is you hollow that
blue collar segment of your population out at your own peril.
First they start killing themselves and you can actually see
deaths of despair bringing down the life expectancy of men.
If you take out the deaths of despair, it bounces
back to normal. So it's so frequent that it is

(36:11):
impacting the national stat on the length of male life
in America. That's so insane to me. So I'm like,
you have to address that. So I bang the drum
for you've got to bring some manufacturing back to the US.
Because there are humans, by temperament or intellectual power, they
have to be in the world of physical electrons moving
them around, just period, full stop, end of story. And

(36:33):
unless we start genetically modifying people like you're going to
have a base of largely men that you either keep
numb or brace yourself for impact when they revolt.

Speaker 2 (36:44):
Yes, I would agree with that. Right, we have to
have purpose, we have to have meaning, we have to
have something to do. But does it have to be
dumb labor or can it be smarter labor? So for example,
you could talk about the feminism movement.

Speaker 3 (36:59):
And how many jobs it's taken from men?

Speaker 2 (37:01):
I was just recently thinking about my own company and
the amount of people we employ, and, you know, how the
majority of them are.

Speaker 3 (37:06):
Women, even in my own company, right, and.

Speaker 2 (37:08):
So like how many men are sort of put out of.

Speaker 3 (37:10):
The workforce based off of that?

Speaker 2 (37:11):
And those are physical jobs, I mean, you know, not
hard labor jobs, but physical office jobs, right, white collar jobs,
I guess you would say. But you know, when the
industrial revolution came, you had a machine that could do
the work of five thousand field workers.

Speaker 3 (37:24):
But those field workers.

Speaker 2 (37:25):
Eventually went on to do higher level things like science
and medicine and things like that. In this vision of
Trump re-onshoring manufacturing, Lutnick says, yeah, it's gonna be automated,
so someone's going to have to build the facilities. So
there's hard, you know, blue-collar labor there.
But someone's also going to have to build the robotics, program the machines,
run the machines, and stuff like that. So to your point,

(37:45):
I mean, we need the physical things, but could they
do those physical things or you think that people just
are incapable of even learning something like that.

Speaker 1 (37:52):
Intelligence is a spectrum, and there will always be
people that are going to be ill suited to an
information technology only world. Now, AI is a meteorite screaming
towards Earth. So the things I'm saying right now, I'm
just ignoring that meteorite for now.

Speaker 2 (38:13):
It is going to come back, yes, I have no doubt.

Speaker 3 (38:16):
It's one of my favorite topics.

Speaker 1 (38:17):
But nonetheless, you really have to bifurcate your conversation into
what do we do right now today versus then how
do we manage that transition period when it happens. And
I've written extensively in the realm of fiction on this idea,
but you have to deal with people that certainly do

(38:40):
not have the intellectual horsepower to do things other than
move electrons.

Speaker 3 (38:47):
You got to do something.

Speaker 1 (38:49):
So whatever that answer is — manufacturing certainly is one. Obviously
a lot of that's going to go to robotics. Now,
if you look backwards, and this is always a good
test when you look backwards, every time a new technology
comes into place, it ends up creating more jobs,
not less, even though it always seems like it's about
to put everybody out of work, and then it just
opens up all these new avenues and the people with
the talents flood in. So as an act of faith,

(39:12):
I choose to believe that that will happen here as well,
that there is something I just can't conceive of yet that.

Speaker 2 (39:19):
Will, if people can level up their skills. That's the problem.

Speaker 3 (39:21):
Well, that's a conundrum for you.

Speaker 2 (39:24):
Okay, so more jobs, yes, but are the people able to
go into those jobs?

Speaker 3 (39:27):
No?

Speaker 1 (39:28):
So this is where I want people to remember. Two
percent of people can make that kind of change. Ninety
eight percent won't. Evolution of culture or otherwise never cares
about any given generation. So evolution does not care if
we're about to introduce a utopia. But some people just
can't cross that threshold because they completely emotionally derail because

(39:49):
they can see I am getting left behind.

Speaker 2 (39:51):
I am a trucker. I'm not going to learn to code.
I don't want to work in a robotics factory like
that person. But those were things they refuse to do.
I don't want to learn how to code. I refuse
to work in robotics.

Speaker 1 (40:01):
But that's the fact that that's what you're up against.
Anything else is an abstraction. So humans are such that
if they've spent, if they're a forty five year old,
they are going to many, many, many of them. It
will never be one hundred percent, thank god, but many, many,
the overwhelming majority will not be willing to make that transition.

(40:22):
It will be too hard on them, and so they
will check out.

Speaker 2 (40:26):
Hey, small business owner, are you buried in all types
of work keeping you from the real thing that makes
you money? Well that's where just Works comes in. They're
the all in one platform that supports small business growth.
You can get all their tools that help with benefits
like payroll and HR and compliance with transparent pricing. Now
they help you hire top talent internationally, enter new markets, quickly

(40:48):
scale international operations without the workload. And for every how

Speaker 3 (40:52):
Do I do it? question,

Speaker 2 (40:53):
You can reach out to their expert staff. From sole
proprietor to

Speaker 3 (40:57):
A team of twenty.

Speaker 2 (40:58):
Just Works empowers all kinds of small businesses.

Speaker 3 (41:01):
With real human support.

Speaker 2 (41:03):
So visit Justworks dot com slash podcast to join the
thousands of small businesses that trust Just Works to take
care of payroll, benefits, compliance and more. Again, that's Just
Works dot com slash podcast. And those are the deaths
of despair you're talking about.

Speaker 3 (41:20):
That'll be one avenue.

Speaker 1 (41:22):
Depending on what age they are, it may also manifest
as active rebellion.

Speaker 3 (41:30):
Yeah.

Speaker 2 (41:32):
When I was a kid, I didn't want to eat
my vegetables, and my dad would tell my mom, if
he's hungry enough, he'll eat. And I think about, like,
you know the sign in the state parks don't feed
the animals because the animals become dependent. But humans aren't animals, right,
So like, if we're hungry enough, we're going to go
dig a hole, we're gonna go work, I
would think. So I'm afraid of you know, people are
calling for like the need for UBI to offset that

(41:53):
sort of, you know, problem that you're talking about.
But it seems like if we didn't have that safety net,
people would go work.

Speaker 1 (41:59):
Right, depends on how disruptive this moment's going to be.
I think the reason that so many people reach
into their brain and say, how do we solve this problem,
how do we re-educate people, all of that, is that
they don't understand the rate of change. Most people do not
understand the rate of change of AI. So think about this.
If Sam Altman is right and it's three hundred percent

(42:21):
year over a year, okay, first of all, that's more
than exponential growth.

Speaker 3 (42:27):
So that's like it.

Speaker 1 (42:28):
In fact, if you want to make it exponential, it's
exponential growth every five point nine months, So that is
almost a percentage improvement every day, and we're seeing it.
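A quick sketch of the arithmetic behind these figures, assuming "three hundred percent year over year" means capability multiplies by a fixed factor each year (4x if the 300% is growth on top of the base, 3x if it is the new level); the factors and the function names below are illustrative assumptions, not figures from the conversation beyond the 300% quote.

```python
import math

def doubling_time_months(annual_factor: float) -> float:
    # Months needed to double, if capability multiplies by annual_factor each year.
    return 12 * math.log(2) / math.log(annual_factor)

def daily_improvement_pct(annual_factor: float) -> float:
    # Equivalent compounded improvement per day, as a percentage.
    return (annual_factor ** (1 / 365) - 1) * 100

# Reading "300% year over year" as 4x total per year (an assumption):
print(round(doubling_time_months(4.0), 1))   # ~6.0 months per doubling
print(round(daily_improvement_pct(4.0), 2))  # ~0.38% compounded per day

# Reading it instead as 3x total per year:
print(round(doubling_time_months(3.0), 1))   # ~7.6 months per doubling
```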

Speaker 3 (42:39):
Yeah, it's crazy.

Speaker 1 (42:41):
So the rate of change is going to hit people
like a sledgehammer. Like even if they're able to adapt
to wave one, Wave two, those are going to be
like done in six months. So what do you do
when it's wave three, four five and they're like, I
can't keep up. I don't want to keep up anymore.
This is too confusing. People will get paralyzed by the
fear of all the change. And when you really start

(43:03):
thinking through this problem, eventually your brain hands you back
a null signal that says, I don't know what to do.
I don't know what this is going.

Speaker 3 (43:08):
To look like.

Speaker 1 (43:08):
This is what everybody calls the technological event horizon or
the singularity. So you're at a point where you can
no longer predict the future, not even a future that's
six months away. So at that point, I'm telling you
there will be massive distress. And when humans are scared,
they act crazy. So people are like, I can't have

(43:28):
a populace that's both scared and hungry. So it's a
dark view of what's going to happen. But I think
that's very real. We're going to go through a valley
of despair before we come out the other side, and
we're able to capture all the energy of the sun.
And if people haven't thought about this, this is not
an analogy.

Speaker 3 (43:45):
I mean this literally.

Speaker 1 (43:46):
Robots eat sunshine, and so the way that they stay
energized is we're going to be able to capture the
energy from the sun, which is absolutely plentiful, far more
plentiful than we need to run even billions of robots
to feed all the humans, all of it, all the
manufacturing you could ever want to do. It's falling on
the earth. I forget — for a single hour, if
you could capture one hundred percent of the sunlight that

(44:08):
is emitted from the sun, how long it would run
the Earth. It's a long time. So we have way
more energy than we need; we have a capture problem.
Once you realize that, unless AI hits an upper bound —
which I haven't heard anybody, except maybe the storage problem — yes,
but like again, these all are technological problems that show
no known reason why.

Speaker 3 (44:28):
We can't solve them.

Speaker 1 (44:30):
Doesn't mean there isn't one, because there are certain barriers.
Like right now, it's can we make the chips fast enough? Okay,
once we're able to make the chips fast enough, can
we get the energy fast enough. If we can get
the energy fast enough, can we get the transformers fast enough?
And so we'll keep pushing the problem down. But you're
gonna hit these bottlenecks. But nobody looks at it and says,
we won't be able to overcome them. And

(44:51):
so if there is no upper bound to intelligence, I
will draw a math equation for people. A moron
is defined as somebody, I think, with an eighty one IQ.
Einstein had like one sixty, whatever the actual numbers are.
The difference between a definitional moron and Einstein is two

(45:12):
point four x. So Einstein was two point four
times smarter than a moron, and a moron can't even
get in the army. So the Army's like, I can't
even put you forward to get shot. You'll create more problems.

Speaker 3 (45:25):
They're like, hold on, let.

Speaker 1 (45:26):
Me finish this, otherwise it won't make sense. So you've
got Einstein's two point four x smarter than that, and
he's like thinking his way to fundamental insights in physics
that bring about nuclear energy, nuclear bombs, GPS, like all
the things that have come of the atomic age. That's
two point four x smarter than a moron. People are

(45:46):
saying that AI isn't going to be ten times smarter
or a thousand times smarter or one hundred thousand times
or a million times. They're talking about there being no
upper limit, which means you get into something that's a
billion times smarter. So if you've got Einstein being two
point four x, what happens when you get to ten?

Speaker 3 (46:05):
So I don't even need people.

Speaker 1 (46:07):
To buy into like the crazy far fetched a million,
a billion times smarter, just ten x would be unrelatable.
It would be a world no one that's alive today
would recognize. And when's that going to happen? In
twenty years? You're really going to make me push it
out thirty years? But like, I plan to still be alive.
This is not some three hundred years from now problem.

Speaker 2 (46:29):
No, it's a now problem. I've been coaching my team today.
We had our calls and I said what I've been
harping on for the last couple of weeks is like, guys, look,
I need you all to be grasping onto these tools
right now, because you're either gonna use them or you're gonna
get replaced by them, and our company will get replaced
if we don't use these right now. I want to
come back to that. We're going to dig into the AI piece.
I just want to finish this last piece though. So

(46:52):
you started teaching mindset. You realize people need it, You
started teaching it. It started to become sort of a fool's
errand for you, at least on YouTube, but you still
do coach on mindset.

Speaker 3 (47:00):
I believe, right, but that's still sort of for the two percent
that are actually going to do something with it.

Speaker 2 (47:04):
Yes. So what are some mindset shifts or
tools that you could give to people that might be
useful for them today. Maybe someone who already has a
little bit high agency that's ready to sort of make
that jump.

Speaker 3 (47:16):
Do you have some frameworks or blueprints?

Speaker 1 (47:18):
One hundred percent. Mindset is extraordinarily powerful and extraordinarily simple,
but people don't do it because of a small handful
of reasons. Okay, uh, the ultimate framework is really simple.
You at all times need to know what end state
you are seeking. What's your goal? If you don't have
your goal, literally stop, nothing matters. You need to be

(47:39):
able to say your goal in thirty five words or less.
It needs to contain a what, by when, and how much?
So the how much is like, what's the KPI? How
am I going to know I achieved that? I want
to make the world a better place?

Speaker 3 (47:51):
Okay in what way?

Speaker 2 (47:53):
Like?

Speaker 3 (47:53):
How will you define it? Are you going to feed
like a million people, or whatever it is?

Speaker 1 (47:57):
If you've got that goal, now — cool. Goals make demands. So
now there's going to be a string of things you're
going to have to do successfully in order to achieve that.
But you have to overcome what I call the chaos machine.
The chaos machine is life. The second law of thermodynamics
is that in a closed system, everything moves towards disorder.
So if you want to bring order to a closed system,

(48:17):
you have to pour energy in — so, I mean, it's
not a metaphor, that's literal. So you're going to have
to be highly intentional, with very directional energy to overcome
the never ending set of problems that will be put
before you as you try to achieve your goals. Most
people break emotionally because it makes you confront something. This
is the big one that people don't understand about themselves,

(48:38):
which is that they trust their emotions, and they shouldn't.
Your emotions will lie to you all the time. Your
emotions are merely your body's way of trying to
express something to you from the subconscious. So it does
not speak in words. It speaks in feelings. But those
feelings are tuned to make sure that you live long

(48:58):
enough to have kids that have kids. Now, unless you
just told me my north star is to have kids
that have kids, then your emotions are going to be
out of step with that. So now what people have
to do instead of steering by emotion, which is I'm
telling you the vast majority of the planet steers by emotions.
They do the things that make them feel the way

(49:18):
they want to feel, and they avoid the things that
make them feel the way they don't want to feel.
But if you're going to make progress, you're going to
fail a lot. And failure is not going to make
you feel the way you want to feel unless you
pull a trick, which I'll get to in a second.
So the rule is, you have your goal, you know
exactly what you need to do. That goal now makes demands.
All of those demands are basically how you're going to
overcome the obstacles between where you are and where you

(49:40):
want to get to. And there's only one way to
do that, and that is to think from first principles,
using what I call the physics of progress. So progress
works in a certain way. Then you're going to take
the framework of the scientific method and you're going to
recontextualize it for life, for business, for relationships, whatever. It

(50:00):
is a minor tweak, but it's still the same idea,
and that is you can say, I know where I
want to go, and I know what the obstacle is
between where I'm at and where I want to get to,
and now I'm going to come up with my best guess,
my hypothesis on how to overcome that obstacle. And the
physics of progress is a me thing, so other people... basically everybody does the same thing. It's why I call

(50:21):
it the physics. Everybody calls it something different. The only
way to run it is to think from first principles.
So you have to get out from under your emotions.
You have to be asking yourself one simple question, how
does the world actually work? And then to understand that
your brain lies to you all the time. Your brain
is a shortcut machine. It is always looking for a
rule of thumb, like how does this work? I just

(50:43):
want to get the gist. The problem is, if your gist is wrong, then it's going to lead you nowhere fast. But because people build their self-esteem around being better, faster, stronger, smarter,
they focus on being right because it makes them feel
good and they want to do that to make them
feel the way they want to feel. And so this
is what you see in politics. That's what you see

(51:04):
everybody screaming over tariffs because everybody wants to be right.
And this is why you see me banging the drum.
I'm just trying to figure out if I've mapped the
world accurately. Now, why do I care about that? Because
my life is driven by what I just walked you through.
I know what my goal is, I'm trying to get there,
that goal makes a whole bunch of demands. I'm going
up against the chaos machine. But I know something that
seemingly a lot of people do not internalize, which is

(51:24):
that skills have utility. Meaning I don't read a book
so I can say that I read it. I read
a book because it gives me information. I turn that
information into a thing I can now do in the
real world that other people don't know how to do.
That allows me to overcome more obstacles, to outcompete them
and make progress. So I don't care if I embarrass
myself by saying this is what I think is going
to happen. There's a fifty one percent chance that this
all goes wrong and then it goes swimmingly and it

(51:47):
just looked like it was guaranteed, or it just completely crashes and burns and all the people that were like, see, I told you so. Like, yeah, I don't care about that.
You have completely misjudged what I'm trying to do here.
I needed to plant a flag, which is why people can get me to answer a question literally about anything, and I'll say, okay, here's what I'm thinking about it
now because I don't care if they make fun of
me in a year. What I know is skills have utility,

(52:08):
and that as long as I learn my lesson, I
can go ooh I thought this, but this actually ended
up happening. That means that I have a wrong vision
of how the world works. But this gave me a
little bit more information. And because I don't have an
ego about it, I'm just gonna be like cool, I'm
going to update my thinking now. I'm ready to move
forward in a more effective way. And so if people

(52:29):
just run that all day every day, they can have
all the success that their emotional stability, their drive, and
their level of.

Speaker 3 (52:38):
Intellect will allow. That's a really good walkthrough on that
mental model. Well, thank you.
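[Editor's note: the loop Tom walks through above, pick the goal, hypothesize a way past the obstacle, try it, and update your model of the world when reality disagrees, is the scientific method restated. A minimal sketch under those assumptions; the function names and stop condition are placeholders, not anything from the episode.]

```python
from typing import Callable

def physics_of_progress(
    goal_reached: Callable[[], bool],   # how you'll know the KPI is hit
    best_guess: Callable[[dict], str],  # hypothesis, given your current beliefs
    attempt: Callable[[str], bool],     # run the experiment in the real world
    max_iterations: int = 100,
) -> dict:
    beliefs: dict = {}  # your current map of how the world actually works
    for _ in range(max_iterations):
        if goal_reached():
            break
        hypothesis = best_guess(beliefs)
        succeeded = attempt(hypothesis)
        # Win or lose, the result is information: update the map, not the ego.
        beliefs[hypothesis] = succeeded
    return beliefs
```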

Speaker 2 (52:46):
Starting with the end in mind, living with intentionality, right, on everything. It's something that I come across quite a bit, somewhere I've spent a lot of my time thinking about lately. I have some high-end coaching programs and we coach some very successful people, and I find that they're not clear. They come to me with basic, very vague questions like, should I buy bitcoin? Should I sell this asset?

(53:10):
Should I sell my business? Should I whatever? Right?

Speaker 3 (53:12):
And they're never able to come to any type of decisions.

Speaker 2 (53:15):
And I found after years of doing this, it's because
none of them are clear on where they're trying to go.
They don't know what problem they're trying to solve specifically,
So then almost no path looks correct. It's the Alice
in Wonderland, right, like.

Speaker 3 (53:26):
Which path should I take? Well, where are you trying to go? I don't know.

Speaker 2 (53:28):
Well, then any path will do. What would you say,
because you've spent a lot of time thinking about this,
what would you say, if you were to go downtown and stop one hundred people, how many of
them could articulate in thirty five words or less their
specific goal?

Speaker 3 (53:43):
Zero zero.

Speaker 1 (53:46):
I have people in my program that have listened to
my classes. They come before me for the live part
and I'm like, cool, what's your goal?

Speaker 3 (53:55):
Yeah?

Speaker 1 (53:55):
They think they know. That's the terrifying part. So this is... and listen, everything I teach, I teach because I made all of these mistakes. I'm constantly catching myself falling into the same traps. So this is not me like, see, I get it and you guys don't. It just means I get what it's like to be a human. But we are all trapped inside of a

(54:16):
brain that uses emotions to make us act. And people
think they're making decisions based on logic, but they're not.
They're making decisions based on feelings that give them the
ability to act. And once I realized, oh, my emotions
are actually not good at detecting what I need to
do in order to make progress, and so I'll end

(54:38):
up in this eternal loop of doing what feels good,
protecting my ego, never wanting to admit that I'm wrong,
and that's going to go nowhere fast. So what if
I flipped it and said, I'm going to value myself
for my willingness to stare nakedly at my inadequacies, and
so I'm gonna be proud of the fact that I
can be laughed at longer than the next person, that
people can think I'm a fool. But behind the scenes,

(54:59):
my knowledge is growing and growing and growing. And so I take myself from scrounging in my couch cushions to find enough change to put gas in my car to go to a job interview, to ultimately building multiple companies, selling one of them for a billion dollars.
So once you do that, you realize whoa like. Actually
getting good at things is the name of the game.
But in a way where like the Kobe Bryant quote,

(55:22):
boos don't block dunks, meaning no matter how much somebody
hates you, if you're good enough, they can't stop you.
And so becoming obsessed with that idea is ultimately the
thing that frees people. But you have to get out
from under the how bad it feels when you realize
that you're wrong. So anyway, the whole idea is that
feelings will make it feel like the story you're telling

(55:44):
yourself is true instead of a hypothesis. And so they
have a vague sense of what they want to do.
It feels good, and so because it feels good, they
believe they have clarity, but they don't.

Speaker 2 (55:57):
The next question is I want to shift into AI.
But this piece is actually a segue that I think
is really important. So I just had this coaching program
here last week, and I brought in some high level
entrepreneurs and this time I decided to get a little
bit smarter, and I sent them pre work, and this
pre work was for them to do some exercises, spend
a couple hours to figure out really clearly, specifically a

(56:19):
SMART goal, what it is they're trying to achieve, so that we could spend the time here for a day and a half building on top of that. And they came back with the most vague ideas, even after I gave them very detailed, specific prompts, mental exercises, all these things, right.
And so what I find is that I often say
that the quality of your life comes down to the questions

(56:41):
that you ask, and when you ask vague questions, you
get back vague answers. And so they all came with
these very vague goals, and I'm like, okay, so then
we have to spend half the day just trying to
get clear on the goals, right. And so that's problem
number one. And so what I find from that is,
now if we take that into AI, and a lot

(57:02):
of people think that AI is going to make them
so much smarter, I don't find that to be the case,
because people can't think clearly, they can't think specifically, and
they ask terrible questions. So when you ask AI, write
me a book, what are you gonna get? Should I
buy bitcoin? Like, is my investment portfolio

(57:22):
de-risked? Like when you ask it broad, vague questions,
you're going to get back terrible answers. And so really,
these types of people who are unable to think clearly
and ask smart specific questions aren't really going to be
able to get the full benefit of a tool like
AI. Facts.

Speaker 3 (57:39):
Yeah, that is it.

Speaker 1 (57:42):
Well, so right now I'll say I'm kind of glad
because we're in this narrow window that I expect to
be very brief where we all still matter in the process. Right,
So my thing is that I so for people that
don't know anything about me or know my story. I
got into business because I went to film school. I

(58:03):
wanted to make movies and I just could not figure
out how to break into the industry. Finally I write a script, it gets turned into a film.

Speaker 2 (58:09):
It was a horrible experience. I was completely devastated.

Speaker 1 (58:12):
I had met these two very successful entrepreneurs and they said, listen, man,
you're coming to the world with your hand out, and if
you want to control the art, you have to control
the resources, so you should get into business.

Speaker 3 (58:21):
And get rich.

Speaker 1 (58:22):
And I was like, yeah, that's a brilliant idea. I thought it would take eighteen months; it took fifteen years. But it worked,
and along the way I realized all the things that
we're talking about here. Skills have utility, that you can
control your own life. But it was a very grueling
process of realizing all of the things that I try

(58:42):
to teach people, but everybody ends up getting locked in that.

Speaker 3 (58:47):
I'm lost, and.

Speaker 1 (58:49):
I don't realize I'm lost, and that's the problem. So
take taste. This is the biggest thing with AI. If
you don't realize that you're asking AI to do something
that you don't have any taste in, you're in trouble.
So right now, when AI hands me back an answer
on like a screenplay that I'm working on, I'll be like,
that doesn't make sense, that's cheesy, that's not working. And

(59:11):
then the AI is going to say, as it's trained
to do, you're right. Now, what if I'm not right?
And that's how you get into a death spiral. People
either take the first thing it gives them, not understanding
the AI is meant to zoom in on what's the
most likely thing someone will say, which means it's sort
of the most blah answer ever. You've got to get
into like a really narrow band over here where it's like, okay,

(59:32):
this is unique, this is fresh. I want to bring
these different ideas together and no, no, no, that's not
quite right. And then you can polish on top. And
so it basically becomes a way to cut down a
lot of work that would otherwise stall you out. Coding
is the easy one. It's like as soon as you start,
you know, to write that file that you've written a
thousand times, it always has to be like custom built.

Speaker 3 (59:53):
It just auto-

Speaker 1 (59:54):
Populates, right. And so now it speeds you up
whatever two, three, five, ten X, and it will obviously
just keep going, keep going and get better. But right
now it's a pretty magical window where if you have
built a set of skills, the AI will extend your capabilities.
I think of it like an exoskeleton. You can now bend over and pick up a much heavier weight

(01:00:14):
than you could before. But if you're already strong, then
it's amplifying a really good base.

Speaker 2 (01:00:22):
What I've found is I've been an entrepreneur investor my
career since I was eighteen. I started buying homes in
south central LA fix and flipping them, and I've built
dozens of businesses and exits and all these things and
this wide range of things, and I'm this massive generalist.
I'm the jack of all trades, master of none. And
for the first time that's to my benefit because AI
can make me instantly deep in each one of those pieces. Right, So,

(01:00:46):
all of a sudden, to your point, this exoskeleton, like
I just feel like I'm plugged in and we can just tap into all of this. But
going back to that for a second, So if the
majority of people, back to the mindset thing, don't know
what they want, can't think specifically, you can't think clearly,
don't know what questions to ask, the AI isn't going
to help them be any better at it. And so

(01:01:07):
what you're going to have is the small percentage, the two percent of people that you said, that are able to ask good questions and are able to think strategically. They're going to be able to tap into this technology, and those two percent of people will continue to widen this divide between the wealthy and the poor, right, where the two percent are going to excel and the majority of people don't know how to use the tool.

Speaker 3 (01:01:27):
And they're going to fall further and further behind.

Speaker 2 (01:01:29):
And so it sort of exacerbates that. You did an
interview with Marc Andreessen, and that was really good, and it warned about, you know, this AI information war, things like that. You talked about, you know, the impact on jobs, which we're sort of talking about. You said there's one job AI can't do. So is that the hope for some

(01:01:50):
people what AI can't do?

Speaker 1 (01:01:54):
That anything that we think AI can't do right now
will be short lived. I think ultimately, once AI is
truly agentic and embodied and smarter than any human alive,
there just isn't going to be anything that AI can't do.
Your only thing is it's not human. And some people
are going to care deeply about that. I don't think

(01:02:17):
that will be all benefit. I think that there will be violence from what I call the New Puritans, who really eschew, certainly, the technological augmentation of a human. They're gonna
rebel against that. We are already bringing back extinct species.

Speaker 3 (01:02:36):
I saw people are paying attention. Yep.

Speaker 1 (01:02:39):
So first the dire wolf, which hasn't been seen in
ten thousand years. Next up they're doing a woolly mammoth, and like actually bringing them back, not like simulations, like
you can go pet a dire wolf. So humans will
eventually start modifying our own DNA, and people.

Speaker 3 (01:02:57):
Are going to have a problem with that.

Speaker 1 (01:02:58):
Some people have a problem with that. We are already augmenting ourselves with technology. If you've ever seen anybody with a cochlear implant... you've now got the people from Neuralink that
can play video games with their minds, and the guys
that play it with their minds are saying, you're gonna
have to create a league for us because we're so
much better than people that actually have to use a

(01:03:19):
mouse and keyboard or controller. That's a real thing, yeah, right now,
this very minute.

Speaker 3 (01:03:25):
Wow, I mean it is.

Speaker 2 (01:03:28):
The exoskeleton, right, So we have seen robotics helping people
that are paralyzed things like that, and these are just
to your point, augmented technologies that sort of help us
do that. I saw today it was all over Twitter.
Was the Shopify ceo had an internal memo leaked.

Speaker 3 (01:03:44):
I don't know if you saw that. I didn't know.

Speaker 2 (01:03:46):
So there was an internal memo leaked from the Shopify CEO,
and it mandated that AI usage be a fundamental expectation
for all employees. So you won't get a job at
Shopify if you're not proficient in it. But even more than that, it integrates AI proficiency into performance reviews and encourages AI exploration in project prototyping. So now getting

(01:04:09):
the job is a requirement. Well, having AI skills is
the requirement to get the job, But in your job reviews,
if you're not growing in your skill and proficiency with it,
that will count against you and your job performance for sure.

Speaker 3 (01:04:21):
For sure.

Speaker 2 (01:04:22):
Now is that getting backlash? Well, what happened is it
got leaked and so it was like going all over Twitter,
and so the CEO came out and said, hey, here
it is Boom and just like put.

Speaker 1 (01:04:32):
It out there like that seems so self evident. And
you said at the beginning, like you're either going
to use it or you're going to get replaced by somebody.

Speaker 3 (01:04:38):
Who will. I mean, that's what I'm telling my internal team
one hundred percent. Everybody should be.

Speaker 2 (01:04:41):
It's crazy, right, Like if you're not saying that.

Speaker 3 (01:04:43):
You're going to get blindsided. You're going to get blindsided.

Speaker 2 (01:04:46):
And we have this period of time, you said, where we still matter. I was thinking of something different. The saying is that the future is not evenly distributed. And so I tell my team we
have about a year before everybody catches up, and so
if we go super hard right now, we have about
a year to really gap people. And I think that's
going to be super important.

Speaker 1 (01:05:06):
Man, we're having those same conversations inside Impact Theory too.

Speaker 2 (01:05:09):
Sure, yeah, now AI is amazing. But what I've been
really thinking about, and starting to act on, is I brought a person on just to start building out AI agents. And you mentioned that before. So you think about AI agents... there was a book that I read probably fifteen years ago, it was maybe, yeah, about that long ago.

(01:05:31):
It was called A Whole New Mind by Daniel Pink
and it talks about how we have a creative side
of our brain and analytical side of our brain. And
now, you know, several hundred years ago they thought we didn't need the creative side of our brain, and they were doing lobotomies and all these things, and so the entire world has been built up for the analytical thinker. So schools are built for analytical thinkers, and SAT scores are for analytical thinkers, and that has been the right path. My

(01:05:52):
mom wanted me to go be an engineer, right, that
was the path she wanted me to go down, and
I didn't want to do that. But he said that
the Internet commoditized technical workers because what happened is.

Speaker 3 (01:06:02):
It opened up the world.

Speaker 2 (01:06:03):
So now I can go on to Upwork or Fiverr and I can hire a coder or a programmer or whatever. And now they're in Pakistan or they're in India or wherever they are. And so it also commoditized these technical workers. And so he said
that the new world is going to be driven by
the creative thinker, so not the technical worker. More like
a conductor of an orchestra. I don't know how to

(01:06:24):
play the instruments better than any of you, but I
can make you make beautiful music together.

Speaker 3 (01:06:28):
Right.

Speaker 2 (01:06:29):
And so you have all these countries basically coming out of poverty with those skills, right. And so they're commoditized technical workers, whether that be research assistants, whether that be paralegals doing briefs, or coders or accountants or whatever it is. And so if I were to hire somebody on Fiverr to go do SEO for me, that's

(01:06:50):
an autonomous agent. But now we can just program AI
to do that. And this is gaining speed at a rapid, rapid,
rapid rate.

Speaker 3 (01:07:00):
Curious what your thoughts are on that?

Speaker 2 (01:07:02):
Well, what do you see AI agents doing to
disrupt sort of the world and at what speed?

Speaker 3 (01:07:10):
Before I give my thoughts.

Speaker 1 (01:07:11):
The speed will be extremely rapid. So by twenty twenty seven,
you'll have agents, you'll have AI basically helping to write
AI improvements, running a lot of the experimentations, and so
you'll have decades of advancements in a week. And that's

(01:07:37):
from the very shrewd minds in the space. The guys that have been making these predictions that have been coming true are the ones saying that. That's not me making
that number up.

Speaker 3 (01:07:47):
So it's twenty.

Speaker 1 (01:07:49):
Twenty five already, we're rapidly approaching the middle of twenty
twenty five, So this is less than twenty four months
away that you're seeing this kind of blinding
speed what they call the intelligence explosion. So that's coming
very fast. That will create a work environment for sure

(01:08:11):
that nobody can conceive of. You are at that point a fish out of water. Like, twenty twenty eight is beyond the technological event horizon. Nobody knows what twenty twenty eight is going to look like. So the fact that something three years from now is like, I can't even begin to tell you what that's going to look like.
So in the workforce, the only way from where I'm
sitting to make any sense of this is what you

(01:08:32):
said at the beginning, which is you need to be
at the edge of it. You need to be deploying it,
using it, improving your skill sets, figuring out how do
I whatever it is that I'm doing, how do I
make this work? And the people that are going to
pull away are the people that have taste, that understand
when something's good and when it's not, people that have
something that they want to build and they have the
courage to go build that thing. I think about it

(01:08:55):
a lot. It's like, if you take the US
economy and it's like a big sheet of glass, and
there's like a few really big companies and then you know,
sort of a handful of little medium ones, and then
for the most part everything else just these small companies.
You're about to drop that, and those big companies
that make up the vast majority of this sheet of

(01:09:16):
glass are just going to shatter into a ton of
small companies, and I can't remember if you said it
before we started rolling, but there's going to be many
billion dollar companies made by one, two or three people.
They get together, they deploy full AI orgs, and they
just run. They give the AI agents like money, the

(01:09:41):
ability to move, to buy things, to do whatever they
need to do, and all of a sudden, all of
those employees that you would have historically had they just
go away. And so the question becomes what does it
look like when let's say even thirty percent of the
total workforce now starts their own company, Like what does

(01:10:03):
that do? How many things will that commoditize? How far
will that push innovation? And then how long will it
last before it's like, actually, human, you're just kind of
slowing me down and I'm the AI and it's like,
you know, how about you just tell me what you
want done and I go do it and.

Speaker 3 (01:10:20):
It sort of sounds like a utopian dream. Yeah, I
mean sort of.

Speaker 1 (01:10:23):
So there were two books written about the future that were hyper-prescient. One has already come true, one may be about to come true. One is Nineteen Eighty-Four, the surveillance state, as terrifying as advertised. Then you've got Brave New World, which was a world of plenty where everybody does drugs to numb themselves. And in a

(01:10:47):
world where you don't have to work for anything, finding
meaning and purpose is not going to be easy.

Speaker 2 (01:10:53):
Yeah, if you think about it, the base law of economics is based off of scarcity, and so when you think about work and labor and knowledge, it's also built on scarcity. So an attorney is able to charge a premium because they have a scarcity of knowledge that they've been able to acquire. Or a doctor has a scarcity of knowledge they've been able to acquire. But when knowledge no longer

(01:11:16):
has scarcity, how does that change things? And to your point
about the agents, I saw this, so we have, like
I said, I brought somebody on and we're building out
these agent workflows, and I saw this breakdown of somebody.

Speaker 3 (01:11:27):
He put it online and it was basically the org chart.

Speaker 2 (01:11:31):
And then all the officers of the company had agent shadows,
and the agent shadows would just shadow all their calls or meetings, all these things, and the CEO could call, on a phone call, the agent shadow and discuss all the things that have been going on. And now they've got each of the officers' agent shadows communicating

(01:11:51):
between themselves.

Speaker 3 (01:11:52):
That's awesome.
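[Editor's note: a rough sketch of the "agent shadow" org chart Mark describes: each officer gets a shadow that accumulates notes from calls and meetings, can be briefed on demand, and can pass messages to the other shadows. Class and method names here are hypothetical; nothing below reflects the actual system he saw posted.]

```python
class ShadowAgent:
    def __init__(self, officer: str):
        self.officer = officer
        self.notes: list[str] = []   # everything shadowed from calls and meetings
        self.inbox: list[str] = []   # messages passed over from other shadows

    def shadow(self, event: str) -> None:
        self.notes.append(event)

    def brief(self) -> str:
        # What the CEO hears when they "call" this shadow.
        return f"{self.officer}: " + "; ".join(self.notes[-3:] or ["nothing new"])

    def send(self, other: "ShadowAgent", message: str) -> None:
        other.inbox.append(f"from {self.officer}: {message}")

# Hypothetical org: every officer gets a shadow, and the shadows talk to each other.
shadows = {name: ShadowAgent(name) for name in ["CEO", "CFO", "CMO"]}
shadows["CMO"].shadow("Weekly growth call: CAC up 12%")
shadows["CMO"].send(shadows["CFO"], "flagging CAC increase for next budget review")
print(shadows["CMO"].brief())
```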

Speaker 2 (01:11:53):
I mean, just like that, right, and we're just a couple months into this. But when you think about an AI agent, like, go out and find the top five business models and just instantly duplicate them for me, and go hire the other agents that you need, and go build out this agent workforce to go duplicate this, and so instantly disrupt every online company.

Speaker 3 (01:12:10):
I hear about MCP.

Speaker 1 (01:12:12):
Yeah, the MCP servers, oh my god. So the protocol that they're going to be using that will let these AIs talk to each other, this is where it's going to get crazy, because right now agents will dead-end pretty quickly. But with the MCP protocol,
they're going to be able to go in. AI will

(01:12:34):
be able to talk to all other AI and do their things. And so now, whether it's accessing your calendar, booking a flight, checking hotels, I mean, I can't even imagine, going through your Tinder profile for you.

Speaker 3 (01:12:49):
I mean it literally.

Speaker 1 (01:12:51):
Anything that deploys the standard. And Anthropic and OpenAI have already agreed to it, so that's two of the big boys. I'd be surprised if more people don't. This is like the TCP/IP moment for AI, where now it's just like you can just hyperlink and it all connects.

Speaker 3 (01:13:08):
That's gonna be cool. Yeah.
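[Editor's note: for readers who haven't seen MCP (Model Context Protocol), it is a JSON-RPC 2.0 based standard for letting a model discover and call tools exposed by a server. A rough sketch of the shape of a single tool call is below; the tool name and arguments are made up, and a real client would use an MCP SDK and handle initialization, transports, and errors rather than building raw messages like this.]

```python
import json

# Roughly what an MCP "call this tool" request looks like on the wire:
# plain JSON-RPC 2.0 with a tools/call method.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "book_flight",  # hypothetical tool a server might expose
        "arguments": {"from": "LAX", "to": "AUS", "date": "2025-06-01"},
    },
}

# A server would answer with a result (or an error) carrying the tool's output.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "Held seat 14C on flight 1234."}]},
}

print(json.dumps(request, indent=2))
```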

Speaker 2 (01:13:10):
So this kind of takes me to the next subject
that I want to jump into is bitcoin. So when
I think about bitcoin: I gave the closing keynote at the Bitcoin Conference in Abu Dhabi a few months ago, and I talked about sort of this bitcoin forecast, twenty thirty, twenty forty, twenty fifty, and talked about how we have these fifty-year repeating
tech cycles and if you put bitcoin into that, it

(01:13:30):
sort of gives us a roadmap. And a lot of
people think that bitcoin has failed because it's a store
of value but it's not a medium of exchange and blah blah blah. But they don't understand the monetary evolution that has to happen and the time frames it happens in. So it hasn't failed. It will get there eventually. But Gresham's law states that bad money drives out good, so we don't have pre sixty-five quarters and dimes in circulation anymore, because they're silver. You wouldn't spend that, and

(01:13:52):
if you had it, you definitely wouldn't spend it.

Speaker 3 (01:13:53):
You'd save it.

Speaker 2 (01:13:54):
So you spend the bad money the post sixty five,
you saved.

Speaker 3 (01:13:57):
The pre right.

Speaker 2 (01:13:58):
So you wouldn't want to use bitcoin to buy a cup of coffee when I can just spend fiat to do that. So then you start thinking, well, how does bitcoin evolve from that store of value to a medium of exchange? Well,
we would want to use it only for things that
it could only be used for, and so then you
start thinking, okay, what are the things that it could do that fiat can't? Like micro transactions, high-frequency transactions,

(01:14:21):
things like that, and you start to think about AI agents.
So AI agents can't have a bank account because you
have to have KYC and AML; you have to be
a person to do that. So they can have their
own bitcoin address, their own bitcoin wallet, and they could
do micro transactions, fractions of a penny, back and forth at
the speed of light a thousand times if they wanted to.

Speaker 3 (01:14:38):
And so we're.

Speaker 2 (01:14:39):
Already seeing there was an AI agent that was programmed
to go build a business, find names, business plans, find
the domain names and went and bought a domain name
using bitcoin over the Lightning network. So we're already seeing it, Truth Terminal or a different one, I'm not sure which one, it did that on Truth Terminal, correct. So it's already happening.

(01:15:00):
And so it seems like with the AI agents they're
going to need some form of payment to hire each other.
Pay them out the cost of compute plus energy not to.

Speaker 1 (01:15:09):
Not to derail though. So on bitcoin, I am so nonplussed
the argument that this has to be a medium of exchange.
So tell me if you think I'm just naive and
I'm missing something. But to me, the thing that would
cause bitcoin to fail is to have a currency that

(01:15:29):
can't be inflated. So if there was a currency that
couldn't be inflated, I wouldn't even think twice about bitcoin.

Speaker 2 (01:15:35):
It's about, I put my money in whatever that thing is. Bitcoin is that.

Speaker 1 (01:15:39):
So right now, in a world where your money is
being inflated, you have to have somewhere to go where
your money can't be inflated. So now the only catch
with bitcoin for me is that people love its volatility.

Speaker 3 (01:15:50):
I hate it.

Speaker 1 (01:15:51):
I want it to just be a static place I
can put my money and be like, cool, this is
going to be worth the same in ten

Speaker 3 (01:15:56):
Years as it is now. That's gold, you've got gold. Gold also works.

Speaker 1 (01:15:58):
Yeah, but in a world where it becomes more and
more virtual, I don't know, like when I think about
a kid who grows up and they were let's say nine,
two years ago. For them, bitcoin just exists. It's just
a thing. People take it seriously. It's an option, Like
they don't have any sense of like, oh it was
born and there was a time where people thought it

(01:16:21):
was weird. They're just like, oh, yeah, bitcoin. It's like how my daughter thought everything was a

Speaker 2 (01:16:25):
Touch screen, touching every computer. Yeah, of course.

Speaker 1 (01:16:28):
So I think my thesis anyway with bitcoin people can
think what they will is that the world is going
to be more virtual tomorrow than it is today. And
one of the most important things to virtualize is money.
I certainly don't want it to be a CBDC. A stablecoin helps a lot, but that's still, at least as imagined right now, going to be backed by a fiat currency.

(01:16:48):
So that doesn't solve my problem of inflation. So having
an inflation-resistant thing that I can self-custody

Speaker 3 (01:17:00):
Solves that problem.

Speaker 1 (01:17:00):
So I've obviously heard people now starting to talk about
this has failed because it's not a medium of exchange.

Speaker 3 (01:17:07):
That just doesn't even make my radar.

Speaker 2 (01:17:09):
Gold can be, but it can't... like, I hold gold, but gold can't be in the digital age. And yeah, gold requires centralization to get the velocity, which is why it failed.

Speaker 1 (01:17:18):
So why don't people complain about that? Because they're saying it's used in, like, electronics.

Speaker 2 (01:17:22):
Yeah, no one's arguing that with gold. But back to
the point. So if you think about bitcoin and the
attributes, and so in the monetary evolution, you go from a collectible: it's a cool rock, feather, seashell, baseball card. Most collectibles don't, but some collectibles become a store of value. Some people put a lot of wealth in
their baseball cards or Pokemon cards or whatever, right art

(01:17:43):
and art. And then some collectibles, if they have the right... I'm sorry, stores of value, if they have the right attributes, could become a medium of exchange, which would be portable, durable, divisible, recognizable, fungible, et cetera. And do you accept that that's an important part of Bitcoin's future? Well, let me answer that
by asking you a question. So if you imagine the future,
you said, there's two books that were written that tell

(01:18:06):
us the future that are very clear, and one of
them was nineteen eighty four. And why is that one
so scary? You said, it's an authoritarian state. So the
government, if you read, you know, Anatomy of the State or Bastiat's The Law, you understand the role of the
government is to always grow and to always retain power.

Speaker 3 (01:18:28):
And so governments are going to try and do two things.

Speaker 2 (01:18:29):
Number one, continue to inflate away the currency and number two,
continue to take more power.

Speaker 3 (01:18:33):
And more freedom away.

Speaker 2 (01:18:34):
That's the role of the state, of the government.

Speaker 3 (01:18:37):
And so.

Speaker 2 (01:18:39):
In North Korea, you're not allowed to have money. Let
me give you another example. So bitcoin has lots of attributes.

Speaker 3 (01:18:46):
Permissionless.

Speaker 2 (01:18:48):
There's over a billion adults in the world right now
today who are not allowed to use the global financial system because they don't have permission to join. A
billion adults. There's eight billion people. I'm talking a billion adults,
so they don't have permission to join. So think about
the mind share that the world has lost. If we
brought a billion adults into the world to bring more

(01:19:10):
solutions and solve more problems and create more value.

Speaker 3 (01:19:13):
But they're out.

Speaker 2 (01:19:14):
They left a war-torn country. They don't have proper documentation.
They are a fifteen year old kid that happened to
be born in the wrong country.

Speaker 3 (01:19:20):
They're out, so think about that.

Speaker 2 (01:19:22):
They don't have permission in the current system of medium
of exchange, of currency. It's also censorship resistant. It's also immutable. So
for example, immutable, why is that important? Well, in India
in twenty sixteen, they canceled all the big bills and
you had a month to bring them all in and
claim them and show all the paperwork of how you

(01:19:43):
earned the bills, or you lost them. But India is
a high cash country, so a lot of people had
this money saved and they lost it all.

Speaker 3 (01:19:49):
It wasn't immutable.

Speaker 2 (01:19:50):
Why censorship resistant. The lady that cuts my hair is
from Afghanistan. She was there when she was a girl.
When the Biden administration pulled out in twenty twenty two,
the disaster, the Taliban took it over. She's like, Mark,
I just feel so bad. I know these ladies back
in Afghanistan, they need so much help. I'd love to
send them money if there was a way, but I can't
because the Taliban took over the bank. So it's censorship resistant.

(01:20:12):
So in this future world, I believe, and to the point, I think you believe, nineteen eighty four wasn't supposed to be a playbook, right, an instruction manual, but it seems to be that way. And in North Korea and in Afghanistan... and the US dodged a curveball, while Europe is trying to launch a CBDC. Now in that world that we're rapidly going

Speaker 3 (01:20:32):
Into, we will probably need, I

Speaker 2 (01:20:34):
Mean, people today need, a censorship-resistant way to transmit value.
And so the question that you would ask yourself is
do you think governments print more money or less money
in the future, and do you think they become more
authoritarian and less authoritarian in the future.

Speaker 3 (01:20:47):
Now, you mentioned.

Speaker 2 (01:20:48):
Earlier, you know what AI is doing in the rise
of or the divide between the rich and the poor
and potentially leading to some sort of mass civil unrest.
I know Ray Dalio is super big on that topic. So
in that if that were to happen and we got
more civil unrest, then what do governments do. Do they
take more power, become more authoritarian? And if so, do
we need a way to transact outside of the state.

(01:21:10):
Yeah, it all strikes me as a very good idea.

Speaker 1 (01:21:13):
Now I'm a much bigger believer in the wrench attack,
I think than a lot of people. If the government
wants your bitcoin, they're going to get your bitcoin.

Speaker 2 (01:21:20):
They won't. And here's why. So have you read the book The Sovereign Individual? I have read, like, the first two chapters. So that's a no, and you gave up on it. I gave up.

Speaker 1 (01:21:33):
It was interesting, but other things, Yeah, it took its
place and I never went back to it.

Speaker 3 (01:21:37):
So if you think about, like, the state, a state.

Speaker 2 (01:21:41):
The country, the government has a monopoly on violence obviously, right,
But like all things, there's trade offs and there's returns. Right.
So like in nineteen thirty-three, when the government seized the gold, it was very easy for them to seize the gold, because all the gold was in the bank, right.
So all the gold went in the bank. They gave
you an IOU claim for the gold, so we could
speed up the velocity of money while they held the gold.

Speaker 3 (01:21:59):
So it was very easy.

Speaker 2 (01:22:00):
For them to put a bank holiday in place. Banks
were closed for a week. When they opened the banks
back up, you were no longer able to get your gold. Now,
if they had to ride out in the nineteen-thirties across the plains to ranchers and farmers with guns, they had to go house to house to house and ask, where's your gold, go through, where's your treasure map,
and they had to go dig it up.

Speaker 3 (01:22:21):
Would that have been realistic?

Speaker 2 (01:22:22):
The return on violence would have been way too low.
Every one of those ranchers with guns would have fought
back, and the chance of them finding one or two gold coins...

Speaker 3 (01:22:29):
The return on violence would be too low to do that.
But because it was in the bank, it was very
easy to do. Well,

Speaker 2 (01:22:34):
Trying to go door to door to door and try to recover people's hardware wallets, and hang people by their toes and tickle them until they give up their private key, the return on violence is just way too low. It's
just impractical for them ever to try to achieve something
like that.

Speaker 1 (01:22:46):
To cast a vote that the government won't do it
is very different than to say that they can't do it.
So my argument is simply anybody that's taken a lot
of solace in that, I would say, don't. If for whatever reason you're the one that they target and come after, they will get your bitcoin.

Speaker 3 (01:23:03):
So there's a couple things about that.

Speaker 2 (01:23:04):
So Number one they could say, hey, we'll kill you
on sight if you don't turn it in.

Speaker 3 (01:23:08):
Yep.

Speaker 2 (01:23:08):
So they did make gold illegal, right, so not only
do we take your gold, if we catch you with gold,
it's illegal, So they could certainly do that. I would
say with that a couple of things, Like number one,
when the government makes things illegal, they typically become.

Speaker 3 (01:23:21):
Bigger and more useful.

Speaker 2 (01:23:22):
So like the war on drugs: since the seventies, drugs are a bigger problem today. Now, when the government tells
you not to do drugs, it doesn't make you want
to do drugs. But when they tell you you don't have a right to store your wealth in a way we
can't steal and confiscate from you, this sort of makes
you want to.

Speaker 3 (01:23:35):
Just like every time they talk about imposing.

Speaker 2 (01:23:36):
New gun laws, what happens with gun sales. So number one,
you have human psychology and motivation. Number two, with drugs,
drugs have to be... they're physical items that have to be grown and cultivated and packaged and shipped and smuggled and distributed. Whereas bitcoin is completely digital, peer to peer, and can't be traced

Speaker 3 (01:23:52):
Or tracked. So how do they stop that? They can't
even keep drugs out of a prison. So I think
it's just impractical.

Speaker 2 (01:23:59):
I think what happens is technology moves faster than the state. So you have 3D gun schematics. Well, they can make those illegal. Well, now they're on the bitcoin blockchain and they're persistent there for all of humanity. How does
the government do that? So when they try to, I
think what happens when the government tries to move on
technologies that are moving and advancing faster than they are,

(01:24:19):
it makes the state look irrelevant and incompetent. But you've
interviewed a lot of people about bitcoin. My good friend Robert Breedlove, I know you've had on. I had Saylor on a couple of times as well, and he calls bitcoin digital energy.
He has some really good metaphors. I mean, he's an
amazing mind. I love the part where he told you

(01:24:40):
you said.

Speaker 3 (01:24:41):
But what about all these other cryptocurrencies?

Speaker 2 (01:24:42):
He says, how many chairs do you have? You're just
sitting in one chair? Right?

Speaker 3 (01:24:46):
Yes? The meme.

Speaker 2 (01:24:48):
Yeah, but when you think about it like that as
digital energy, and then you think about like storing that
energy for a long period of time, especially in today's
day and age, or going into this AI age. There
was this sort of DeepSeek moment, that was April twenty-seventh, was it? Where all of a sudden DeepSeek was dropped into the world and it sort of

(01:25:10):
caused Nvidia to crash. And it sort of made the
whole world kind of step back and go, are we
pricing assets properly? Because couldn't a bunch of wealth leave
the Nasdaq and go to China, and if they need less Nvidia chips, do we have Nvidia priced right?
And it sort of made everybody sort of look at
everything differently. And I'm curious, through your talks on bitcoin with Saylor and others, do you think it eventually reprices

(01:25:34):
assets like investment assets as we know them. It's a
good question on a long enough timeline. I honestly don't know.
That isn't the way that I think about bitcoin. To me,
I take a very simple approach, which is that you've
got inflation. Inflation is the problem of the modern era,
like the problem. The number of things that I think

(01:25:54):
emanate from the fact that governments deficit-spend and then print so they don't have to engage the voting public with their choices is deeply problematic. So to me, bitcoin solves
that incredible problem, and it solves it in a moment.

Speaker 3 (01:26:12):
Where as it's being.

Speaker 1 (01:26:16):
Effectively repriced by the wild inflation of the dollar, combined
with the growing awareness and demand for bitcoin, you get
this incredible run up for people that are able to
withstand the temporary volatility. So I look at it in
that perspective. I ultimately want bitcoin to become an incredibly
boring asset. I want it to be a place where

(01:26:36):
I can put my money and I know that the
amount of output I got from a unit of my
time in the.

Speaker 3 (01:26:43):
Past holds that value.

Speaker 1 (01:26:45):
Because then, and this is a big part of my
thesis right now, the reason that the average American doesn't
invest in the stock market, and if they do, they
certainly don't invest very much, is because they don't understand it.
It's just complicated enough that they're never going to understand it.
And so I think a government has a moral obligation
to give their citizens an opportunity to just save their

(01:27:09):
money in a way where it's not going to go
down in value. Once you do that, I think a
lot of problems go away.

Speaker 3 (01:27:15):
Now.

Speaker 2 (01:27:15):
People that are fiscally responsible, they're going to save their money.

Speaker 1 (01:27:17):
It's not going to be eaten by inflation. There's going
to be a benefit for them to do that. So
that's the thesis, the lens through which I look at bitcoin. Now,
whether it ends up gobbling up more and more of
these assets, I don't see how it couldn't. For somebody
like me, who might otherwise consider rolling the dice on
a piece of art, I would much rather be in

(01:27:39):
certainly my future vision of bitcoin, where it becomes far
less risky. So people that are trying to build wealth
right now are going to go for bitcoin, but hopefully
one day that's like a way more stable play, and
so that's not what they're focused on. And that way,
we really begin to have assets that are low risk,
and we have assets that are higher risk, and the
people that want to accept the consequences of getting that wrong,

(01:28:01):
they can go do it. I think people should be
able to spend their money however they want. But I
have not put a lot of thought into like how
much do I think that this just keeps gobbling up
the world. Because I didn't buy it because I thought,
oh my god, this is going to go up an
insane amount of value. I bought it because I thought
tomorrow would be more digital than today, and that you
have to have an escape from inflation.

Speaker 2 (01:28:23):
You have to. Yeah. You've interviewed over six hundred high performers, congratulations. That number, by the way, generated over a billion views. And you've pulled yourself from scrounging in couch cushions to a billion-dollar business, exiting a billion-dollar business.
You showed on Twitter, on X, your top ten
favorite interviews of all time, and so they weren't ranked

(01:28:45):
in any particular order. But I noticed that like five
of those people were money guys and two of those
were like future tech guys. So I was going to
ask you if you had to distill everything you've learned
into a single mind-blowing insight that we might be left
thinking about for a few days, what would that be?

Speaker 1 (01:29:05):
Nothing matters other than thinking from first principles, that is.

Speaker 2 (01:29:11):
To formulate your own devastating.

Speaker 3 (01:29:15):
Yes it.

Speaker 1 (01:29:16):
First principles is a little deeper than that. First principles is: the world works on a set of rules. We'll call them the laws of physics, and everything runs on that. Yes,
they are the axioms that we cannot disprove. And once
you understand that life is a big chain of cause

(01:29:36):
and effect that we can't go all the way back
because we don't fully understand physics. But man, you
can really think like an engineer, which is the right way,
even if you're playing a game of psychology. The human
mind works in a certain way. It's complex, a lot
of variables, but it works.

Speaker 2 (01:29:53):
In a knowable way.

Speaker 1 (01:29:55):
And so if you focus obsessively on mapping that, well,
then you can get as close to having a crystal
ball as you're ever going to get. And it's almost
the guests that I have really fall into two gigantic camps.

Speaker 2 (01:30:14):
There are people that are just worried about the.

Speaker 1 (01:30:17):
Laws of physics in their own mind. Like, here are the things that I'm up against. David Goggins: I figured
out how to master my own mind. Cool, You've got
the physics of your mind, and given that, you've been
able to lead an extraordinary life because you understood how
to deal with your own limitations, your cognitive biases, all
of that.

Speaker 3 (01:30:34):
Then there are.

Speaker 1 (01:30:34):
People that do an external thing where they master money,
they master business.

Speaker 3 (01:30:41):
It's the same game.

Speaker 1 (01:30:42):
You're trying to figure out, how does this thing actually work?
How do I get my emotions out of the way,
and how do I think from first principles? And I
have a feeling no matter how many people I interview,
they're going to use different words, but they're all going
to be people who did that thing either on a
really granular subject. Take Annie Jacobsen, I believe is her name,

(01:31:03):
and she's walking you through the first principles of nuclear
war and what it looks like, and like why we
have to avoid it and all that stuff again just
cause and effect. Eric Weinstein is literally walking you through
the laws of physics. Michael Saylor, how bitcoin is financial energy,
and he's literally using the language of physics. Ray Dalio saying,

(01:31:25):
I figured out how this game worked, and so I
can point you five hundred years backwards to explain it,
and I can tell you, given these cycles, what it's
going to look like moving forward. It's like all of
the really just incredible banger guests are people that I'm like, oh,
you know where you're trying to end up, and you're
just thinking from first principles with an understanding that you
can master a set of skills that allow you to

(01:31:45):
do something in your own mind or in the real
world that works better than the next person.

Speaker 2 (01:31:50):
Yeah, and everyone should figure that out for themselves.

Speaker 1 (01:31:53):
I mean, it's really universal, but you have to figure
out how to get out of your own way, with
your own cognitive limitations, biases, distorted frame of reference, so
it will be an n-of-one experiment, but you're
seeking the truly universal.

Speaker 3 (01:32:12):
Got it, all right?

Speaker 2 (01:32:14):
I think this might be my longest interview ever. So my apologies, only because I kept talking. It was

Speaker 3 (01:32:18):
Good. It was really good, it was really good. So we'll end it with that, Tom. Obviously, we'll link your stuff down below. I'm sure everybody already knows where to find you, I hope so: at Tom Bilyeu, wherever you go. All right, thanks. Thank you.