Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:15):
Pushkin.
Speaker 2 (00:28):
Hey everyone, we are off for the holidays, so we wanted to reshare one of our favorite episodes from last year. When Nate's book On the Edge came out, I had a chance to interview him about it on the podcast, and I thought it would be really fun to take a look back at that interview where Nate was in the interviewee seat. Then I wrote a little review recap
(00:52):
of the book on my Substack. If somehow you are listening to Risky Business and you have not read the book, please read the book. It's wonderful, Nate. You did such a great job and I really enjoyed that conversation.
Speaker 3 (01:04):
Yeah, no, look, maybe I should go back and read that book. It's prescient. I think it's talking about all the kinds of topics, from, you know, prediction markets to AI to crypto to gambling scandals, that are only becoming more pertinent today. But thank you for that interview, Maria. And yeah, maybe I'll take a listen to myself.
Speaker 2 (01:26):
Yeah, so enjoy everyone.
Speaker 1 (01:32):
So, first of all, Nate, congratulations. I know that this was a long time coming, a labor of many, many years.
Speaker 3 (01:40):
Yeah, no, I spent basically three years of my life where I'd say this was my main professional focus, talked to two hundred people, and, you know, I would recommend it: email really smart people and ask to have a conversation with them and pretend it's for a book, and then don't write the book. That's probably still worth it in some sense. But no, I actually went ahead and wrote the book.
Speaker 1 (02:00):
And we're very glad you did. So I'm going to start with this, Nate. In the book, you make this distinction between two communities. You have the village and you have the river. Talk me through what exactly those mean.
Speaker 3 (02:17):
Let's start with the village, even though it's not the focus of the book, just because it's a little bit easier to define, and it's a term that's more conventional. The village is basically the kind of liberal, progressive East Coast establishment. So it's the New York Times, it's Harvard University, it's think tanks and nonprofits, it's, you know, government when you have a Democrat in office at least. It's kind of
(02:38):
the expert class, the professional managerial class, that whole cluster of attributes. By contrast, the river is kind of the world that I suppose I inhabit, or I imagine myself inhabiting. It's the world of calculated risk taking, where people are some combination of being very analytical and extremely, wildly competitive. So poker is an archetypal, maybe the archetypal, river activity.
(03:02):
But as I discovered this world in the book, you know, I found that the venture capitalists and the effective altruists and the hedge fund people and the sports bettors, even the crypto people, they all have a common vocabulary. They all talk about things in roughly the same ways. And so it's not a place, it's not a discrete community.
(03:24):
I mean, in Las Vegas and Silicon Valley you might get closer to an actual network, but they're just these people that I think are often poorly understood by the mainstream media and yet are incredibly wealthy and powerful. Many of the richest people in the world have some of these river tendencies, and, surprisingly to me, would
(03:45):
take my emails and my phone calls because they kind of recognized a fellow traveler a little bit. And then what I discovered is that the kind of conceit of the proposal, that actually there are commonalities in how these people think, was more literally true than I had thought, and that there's more literal crossover — poker players who become effective altruists, or hedge funders who host poker games, or
(04:07):
venture capitalists who do sports betting — not just a metaphor for some of these things. So it's a real place, or a real group of people, that I think, Maria, are in our circles, maybe not the only circle we run in, but certainly one common type. And I felt like I had the opportunity to tell a really good story about these people. And I think I got lucky.
(04:28):
I think I'm running in my, like, ninety-eighth percentile run good for how some of these topics really blew up. I mean AI in particular, although that was a fairly incidental thing in the book; effective altruism; and in particular Sam Bankman-Fried, who I had talked to first before the collapse and then got to talk to afterwards as well.
Speaker 1 (04:48):
Yeah, so you draw some really interesting parallels between worlds that I didn't necessarily connect. Like, obviously poker and sports betting — yes, you know, I get why those things are related. Then you go into finance, you go into venture capital
(05:08):
and the hedge fund world and Silicon Valley, you go into AI and the people who are kind of the AI researchers. And I actually love your little aside where you have the physical risk takers — yes — where you have the astronauts, and I love that you have a female astronaut. By the way, did you do that on purpose? Like, did you try to have two women, or did that just so happen?
(05:29):
You know, because, as you know, it's something that is interesting to me, and when you're talking about risk takers, people often assume men, and so I was very curious if that was a deliberate choice or if it just so happens that the most interesting stories happened to be about women.
Speaker 3 (05:45):
No, there was a deliberate choice made to — I mean, look, it's a careful balance to strike, because it is a world that's very male. Also a world that — poker is a little bit more diverse, but the rest of the river is quite white and/or Asian. So there was a deliberate choice to not sugarcoat
(06:08):
this and pretend it's all fifty-fifty. But also, the — I call them characters — the people I talked to, like Kathryn Sullivan, this astronaut, is amazing, and she has amazing stories. She was studying, like, oceanography in Nova Scotia and beat out, like, ten thousand other people to become an astronaut. I mean, it's an amazing story. Katalin Karikó —
(06:29):
I had to learn how to pronounce all the names correctly for the audiobook, and now I've forgotten them — who was this woman who, like, smuggled money in a teddy bear to move out of Hungary and then was just shat upon by the establishment while she was working on mRNA vaccines, and eventually joined BioNTech and won the Nobel Prize — co-won the Nobel Prize — for her work on mRNA
(06:50):
vaccines. I mean, they have amazing stories. Or someone like our mutual friend, one of the best poker players in the world, I think, Maria Ho, who has great stories to tell too. Or Vanessa Selbst is another person that we know who is just an amazing person. So yeah, I mean, when you find people like that, you want to tell their stories.
But there are also sections that directly tackle it.
(07:12):
There's one section in the poker chapter about why aren't there more women in poker, and one section in the VC chapter about why are there so few Black and Hispanic and women founders. So you have to take it on directly, without it being a woke book, because I'm not super woke, but there are real shortcomings here. I mean, the VCs in particular, I think — they claim to want people who have grit, who have
(07:33):
overcome odds, and who are high variance, who are different than the consensus, and yet they have this prototype of, like, the on-the-spectrum, you know, white or Asian twenty-something nerd who dropped out of Harvard or MIT as a sophomore. And Sam Bankman-Fried played into that stereotype. And they're not actually, I think, trying to diversify their portfolios, even in a purely
(07:53):
not moral but, like, financial sense of wanting high-variance founders.
Speaker 1 (07:59):
Yeah. No, I think one of the things that I enjoyed about the book is that it's about how to take risk well. But you also talk about some of the shortcomings of the people of the river and how they think about risk. But first, let's talk about kind of the good elements. So what does the river have in common when it comes to evaluating risk?
(08:20):
What can we take away from them when we're thinking about, okay, I am, you know, Maria, everyday person — how do I kind of think about risk in my life? What do I take from these people who take risks professionally so that I can just make better decisions and better risk assessments myself?
Speaker 3 (08:42):
So let me take the two axioms that you hear over and over and over again in Silicon Valley, which make it very successful financially despite having all these flaws. I mean, some of these guys are difficult people. It is mostly guys. It's not very diverse. There are some big flaws, right — I think their political instincts are often poor. But they do two things, one of which is they understand expected value, which is kind of the theme of
(09:04):
this show. They understand that if you can make a bet that has a ten percent chance of a hundred x payout, that's an extraordinarily good bet. And if you can make a lot of those bets by aggregating different companies into a fund, then you're kind of guaranteed to make money, and maybe make a lot of money. And it also applies to things like evaluating nuclear war risk
(09:24):
or AI risk, things that might be, you know, a two percent chance over some timeframe, or a five percent chance, whatever else, but have very bad to infinitely bad negative payouts. The second thing that Silicon Valley knows is that having a longer time horizon is really worth it. We're a country that likes to make money, but we're
(09:45):
a get-rich-quick country and not a get-rich-slowly country. When you're investing in early-stage startups, they often have a long time horizon. You know, SpaceX took thirteen years, I think, to make a first profit, nine years before it would actually even launch a rocket successfully, or something like that. So I think you almost always benefit from having a longer time horizon than other people. People just discount the long-term future way too much, even in
(10:10):
their personal lives, I think. And so those, I think, are kind of the two foremost lessons, and why the book is, you know, kind of ambivalent-to-positive on that Silicon Valley ethos, even though the individual characters there are complex.
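A minimal sketch, not from the book or the episode, of the expected-value arithmetic Nate describes above: a single bet with a ten percent chance of a hundred x payout has an EV of ten times the stake, and aggregating many such bets into a fund makes losing money overall very unlikely. The fund size, trial count, and payout figures are arbitrary illustrative assumptions.

```python
import random

def simulate_fund(n_bets=50, p_win=0.10, payout=100.0, stake=1.0, trials=10_000):
    """Monte Carlo sketch: a fund making many independent 10%-chance, 100x-payout bets."""
    multiples = []
    for _ in range(trials):
        total = sum(payout * stake if random.random() < p_win else 0.0
                    for _ in range(n_bets))
        multiples.append(total / (n_bets * stake))  # multiple of the fund's capital returned
    avg = sum(multiples) / trials
    p_lose = sum(m < 1.0 for m in multiples) / trials
    return avg, p_lose

ev_single = 0.10 * 100.0  # expected value of one bet, in multiples of the stake
avg_multiple, p_lose_money = simulate_fund()
print(f"single-bet EV: {ev_single:.0f}x the stake")
print(f"fund's average return: {avg_multiple:.1f}x of capital")
print(f"chance the fund loses money overall: {p_lose_money:.2%}")
```

With fifty such bets, the fund loses money only when every single one misses, which happens well under one percent of the time — the "kind of guaranteed to make money" effect Nate describes.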
Speaker 1 (10:24):
So one of the things that you stress over and over is that people who evaluate risk correctly, and take risks that tend to pay off over the long term, not only can understand the upside, but they can also protect against the risk of ruin, right — the risk
(10:44):
of not being able to take these risks going forward. So first, I'd love for you to talk a little bit about this idea of risk of ruin, because you talk about people who do it correctly and then people who might not. And also, how do you balance that, and how might that be different from person to person?
Speaker 3 (11:03):
Yeah, I mean, we have this abstract concept in poker of a bankroll, which is: how much money can you lose at poker before you're kind of poker broke, or at least you're severely limited in the size of the games you might play, and you might have to pass up positive EV opportunities. So to some extent this is related to the idea of opportunity cost.
(11:25):
There's a famous formula from sports betting called the Kelly criterion, which tells you how much you can bet to maximize your expected value without enduring a very high risk of ruin. Now, I can get into the technicalities of why that formula might make you a little bit more aggressive than you should be. In general, though, like ninety percent of
(11:45):
people in the world are, I think, too risk averse about this kind of thing. This goes back to, as you'll know, some of the Kahneman and Tversky work on prospect theory. It may also have an evolutionary basis, where we weren't living in the time of abundance that we are now, and so, you know, you only got one shot, so it made sense to protect
(12:06):
your household, it made sense to be very cautious, because if you got an infection, you might die. And I think people have not adapted very much to this world of abundant but confusing opportunity that we have. Annie Duke wrote a book called Quit, and she's in the book too, also a former poker player, of course. She cites studies, like by Steve Levitt of Freakonomics, that when they
(12:26):
have people flip a coin to make a decision — a major decision, not like am I getting Thai food or Indian for dinner tonight, which I do flip a coin about sometimes, by the way — but things like should I leave my partner, or should I quit my job, or should I move across the country to a different part of the world, people are happier on average when they make a change. And so, you know, the ninety percent of people who are risk averse maybe could take
(12:48):
some messages from that book. But then you also meet the SBFs and the Elon Musks of the world.
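A minimal sketch, not from the book, of the Kelly criterion Nate mentions: for a bet that pays b-to-1 and wins with probability p, the Kelly fraction of your bankroll is p minus (1 − p)/b. The 55 percent example and the half-Kelly convention below are illustrative assumptions, not figures from the episode — half-Kelly is one common way practitioners address the point that full Kelly is more aggressive than most people should be.

```python
def kelly_fraction(p_win: float, net_odds: float) -> float:
    """Kelly criterion: fraction of bankroll to stake on a bet that pays
    `net_odds`-to-1 on a win and loses the stake otherwise.
    Returns 0 for negative-EV bets (don't bet at all)."""
    fraction = p_win - (1.0 - p_win) / net_odds
    return max(fraction, 0.0)

# Example: you judge a team 55% to win against even money (1-to-1 odds).
full_kelly = kelly_fraction(p_win=0.55, net_odds=1.0)  # 0.55 - 0.45/1 = 0.10
half_kelly = full_kelly / 2  # a common, more conservative sizing convention
print(f"full Kelly stake: {full_kelly:.1%} of bankroll")
print(f"half Kelly stake: {half_kelly:.1%} of bankroll")
```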
Speaker 1 (12:54):
And let's actually unpack that last bit a little bit — the SBFs and the Elon Musks — because you write about the fact that, you know, obviously they are people who... people don't necessarily lump them together, right? Because Elon Musk is someone who is the richest man in the world,
(13:15):
and then SBF is going to jail. However, they do have this one characteristic in common. So I would love for you to expand on that.
Speaker 3 (13:23):
Yeah, and I should be clear — I mean, I am very, and the book is very, unsympathetic to SBF. I'm not saying I'm sympathetic to Elon, but Elon has accomplished a lot of things, and I think Elon's a real person whose career wasn't based on — I mean, I don't think Sam's whole career was based on fraud, but, you know, Elon has not committed ten billion dollars' worth of stealing people's crypto deposits, basically. Yeah,
(13:48):
they are the two people that really, really, really go balls to the wall, I think we can use that phrase, in terms of risk.
Speaker 1 (13:54):
Absolutely.
Speaker 3 (13:56):
If you read the Walter Isaacson book on Elon Musk,
his poker strategy is literally going all in every hand
until he runs out of money or wins it all.
Speaker 1 (14:04):
I didn't know that. That's crazy to me.
Speaker 3 (14:07):
It makes sense. So I've played in the All-In Podcast poker game a couple of times with some of Elon's buddies, and that's a very aggressive game, shall we say. I'm forbidden to reveal details, but there's a lot of blind, naked aggression there. You know, SBF is even more unambiguously, I think, irrational, in taking a very
(14:30):
literal approach to utilitarianism, where, you know, he has said repeatedly — he told his co-founder Caroline Ellison this, he told the economist Tyler Cowen this — that if he could flip a coin to determine the fate of the world, and either half the time the world is destroyed, but the other half the time the world is, like, two
(14:51):
point oh one x as good, then he's calculating the expected value. He's so committed to that that, hey, you know, one point oh oh five is greater than one, therefore you take the coin flip, even though you might destroy the world. And in fact, you take that coin flip repeatedly, over and over again. This is called the Saint Petersburg paradox, where it's a bet that has infinite expected value but
(15:11):
is all but certain to leave you in ruins. So what should you do? Probably not take the bet. It's not that hard a dilemma, actually, but to SBF it was.
And in my interviews with him, you kind of find a pattern of him saying — I mean, the first interview I did with him, when he's riding high, January twenty twenty-two, Bitcoin still near its all-time peaks — he's like, if you're not really, literally willing to ruin yourself, not
(15:34):
the fake ruin, like if Elon Musk has to sell Twitter and he's only worth one hundred and seventy billion dollars instead of two hundred and twenty billion dollars, that's not what I'd describe as being ruined. And Sam was very clear: I don't think you should protect your downside. You have to be so risk-loving that you should literally be willing to destroy your life and your reputation if it's positive expected value. Which I think
(15:57):
is insane. I think it reflects — I mean, not a clinical diagnosis, but I think he's a somewhat defective person emotionally in some way. And, you know, the fact that he was enabled by so many supposedly smart actors is a core question that's asked by the book. I mean, you know, the New York Times review
(16:20):
understood this, which is that the book is actually pretty unsparing to some of these characters in the river, even though it's not a book where I'm going to be like, Peter Thiel is evil and Elon Musk is evil and a fascist — that's not my style of writing. And I think they're interesting people who deserve to have their stories told by someone who kind of understands the personality type. But SBF was a really dangerous
(16:43):
figure and, on top of everything else, also a fairly bad calculator of risk. His decision to go take the witness stand at his trial after the government's testimony was just incredibly devastating. They had him dead to rights, contradicting everything he told people publicly, everything he told me in interviews that were on the record, although
(17:03):
with an embargo — I mean, they couldn't be used until the book was released. You know, the willingness to lie, plus the constant miscalculations he made, where he probably could have avoided jail time by conceding defeat and not taking the witness stand — not all of the jail time, but ten years or five years and not twenty. You know, why did so many people vouch for him is a
(17:25):
question that I think has to be pondered.
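A minimal sketch, not from the book, of the repeated coin-flip logic Nate describes earlier in this answer: each flip either destroys everything or makes the world 2.01x as good, so the expected value of repeating the flip grows without bound while the chance of surviving every flip collapses toward zero. The flip counts shown are arbitrary choices for illustration.

```python
# SBF's repeated coin flip, as described above: heads, the world is 2.01x as good;
# tails, the world is destroyed (value zero). Expected value per flip is
# 0.5 * 2.01 = 1.005 > 1, so repeating it keeps raising the EV,
# while the probability of surviving every flip shrinks toward zero.
P_HEADS, UPSIDE = 0.5, 2.01

for n_flips in (1, 10, 50):
    expected_value = (P_HEADS * UPSIDE) ** n_flips  # 1.005 ** n_flips
    p_survive = P_HEADS ** n_flips                  # chance the world is never destroyed
    print(f"{n_flips:>2} flips: EV = {expected_value:.3f}x, "
          f"survival probability = {p_survive:.1e}")
```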
Speaker 1 (17:29):
We'll be back after a quick break. You actually touched on something there which I think is important to unpack a little bit more, which is that if you're taking these huge gambles, you better be pretty damn sure your
(17:51):
calculations are accurate, that your percentages are accurate. But we're living in a world where you can't do that, and so I'm curious. You know, you build models, right, that try to model things that are uncertain, that are unknowable, that have this margin of error. And so I think you understand more than most people that,
(18:11):
you know, if you say you're sixty percent certain, or there's, like, a sixty percent chance of this, there is a margin of error there, right? Like, if it's fifty-one percent likely that you're plus EV and forty-nine that you're not, what if you're wrong? Right? Like, what if it's forty-nine that you're actually negative EV and fifty-one that you're going to destroy civilization?
(18:34):
So I'd love you to walk us through that thinking, and through kind of the hubris that you both need to succeed in some ways in this area, but that also might make you overconfident about these sorts of percentages where you really, really can't afford to be.
Speaker 3 (18:49):
Yeah, look, we've seen bad election models. We saw models that gave Trump a one percent chance, or a zero point one percent chance, of winning in twenty sixteen. We saw models earlier this year that gave Biden, even after he had tanked in the debate, a fifty-fifty chance of winning, which didn't make any sense. Even building models for these kinds of close
(19:10):
problems like elections or sports forecasting is pretty tough. You have a lot of choices to make as a modeler, your biases might creep in, and other things. So to then kind of take these very back-of-the-envelope, informal models and try to apply them to every problem
(19:30):
in the world is, you know, obviously prone to going wrong to some extent. And, like, in effective altruism, they're very literal about this. I talked to Will MacAskill, who was kind of one of the founders, or at least the brand name, of EA and wrote a book called What We Owe the Future. He calls himself a longtermist, meaning he's concerned about the very, very, very far future.
(19:52):
The first time I talked to him, he was like, okay, so how do we weigh, like, a human's life against an animal's life? And he's like, well, it should be based on the number of neurons in the animal's brain, or maybe the number of neurons to the one-third power or something like that. It's a good approximation, but that leads to weird things — elephants are actually worth more than human beings by that calculation,
(20:14):
it turns out. But you also face some uncomfortable decisions. The book uses a story of a time when a dog named —
Speaker 1 (20:23):
Yeah, I didn't realize — the poodle — like, I didn't know this story, so please, yeah, please.
Speaker 3 (20:27):
There's a poodle named Dakota that gets loose from its owner in Prospect Park or some park in North Brooklyn and dashes into the subway at the first stop inbound on the F train from Manhattan to Brooklyn. And this is happening at, like, three thirty on a weekday, so you're starting to get the rush hour commute. And the question is, should we shut down the entire
(20:49):
F line in order to search for and rescue Dakota? Which they did, and Dakota popped up at some other subway station an hour or so later. But that's a case where — I'll put it this way, this is how I put it in the book — if you had, like, a human baby who had fallen onto the tracks, no one would dispute that we should halt every train until
(21:12):
that baby is found. If it was a squirrel, then we would just run over the squirrel, right, no questions —
Speaker 1 (21:17):
Asked. We would, we would.
Speaker 3 (21:19):
I mean, I hate squirrels. And the dog is intuitively kind of a close decision. But the fact is that you have to make a decision, and we had to make these decisions in COVID, where, you know, what's the cost of shutting down schools versus the cost of preventing some degree of sickness and death? Right? I mean, that's a real trade-off.
(21:41):
Or things that are more intangible: what's the cost of undermining people's happiness because they can't — or at least they're not supposed to — see their friends in person during COVID, or they can't go to a baseball game, or go to a restaurant, or see their dying relative in a hospital, or go to church, all these other things? Like, what's the cost of that? And we make
(22:01):
these decisions with, you know, different heuristics and different rules of thumb that are often very clumsy and maybe orders of magnitude wrong. I think the decision to keep schools shut down after vaccines were available is, like, wrong by ten x or one hundred x — unambiguously a terrible decision that some blue states made even after vaccine availability. So you kind of get this dilemma
(22:25):
where you have to make decisions, you have to make some calculation. But I think people underestimate how back-of-the-envelope these are. And when you start to put numbers to things, people sometimes stop asking questions about models where they should ask more questions.
Speaker 1 (22:42):
Is there something that you would recommend? Because this is something that I've come across, you know, many times — a wall I hit in a lot of my research. Is there something that you have seen in people who are successful, where they are able to modulate, or at least self-check, their overconfidence and their hubris? Because
(23:02):
it seems like most of the people on the river don't do that, right? Like Elon Musk, absolutely not. Like Peter Thiel, absolutely not. Like all of these men who you talk about, they don't seem to have a switch where they can be like, okay, you know what, let me take a step back and self-evaluate. Did you find anyone who was able to do that? And if so, how?
(23:22):
Like, how do you do that?
Speaker 3 (23:25):
I mean, in some ways, some of the poker players I talked to seem like they're more at ease with themselves and can modulate it somehow. You know, Jason Koon, for example, is an interesting poker player who had a rough upbringing and has some very aggressive tendencies, but I think has some type of perspective: there's this very competitive part of him that he understands can be unleashed, but has to be, you know,
(23:48):
meted out in doses somehow. Look, there is some bias in the book in the following sense, which is that the people who are quote-unquote crazy, or the people who really, you know, like to talk about themselves and take crazy risks, make for better stories. So yeah, I mean, I'm thinking about, like, you know, Michael Moritz, a Sequoia Capital venture capitalist who's been a very successful
(24:10):
early investor in Google and many other companies. He's a guy who I think is probably pretty measured, actually — a former journalist. Vinod Khosla is another one, who invests in kind of alternative things like alternative energy or, you know, what's it called, artificial meat, things like that. I mean, he seems to be reasonably well balanced. Patrick Collison, the founder of Stripe —
(24:34):
these people are all in the book, but because they're less bombastic, they maybe get both a little bit less of a write-up from me and then kind of have some lost-in-translation problem too. But yeah, look, overconfidence is a really big problem for people in the river, because by definition these are people who probably won their first couple of big bets,
(24:57):
sometimes by skill, sometimes by luck — I'd say often more luck than skill, but usually some of both. And when you're on a winning streak, then you can think that you're God's gift to whatever venture you happen to be involved in, and you can be overconfident in that. Because there is, like, fruitful territory here — I mean, the village leaves a lot to be desired for how it assesses risk and does other things, and so, you know,
(25:20):
in some ways the critiques that these two communities have of one another are both pretty smart. I mean, I think the river's critique of the village — that these guys are too risk averse, that they're partisan, that they're too concerned with social perception — I think that's basically right. But also some of the Silicon Valley founders and VCs are total arrogant pricks.
Speaker 1 (25:42):
Yeah, that's one way of putting it. And I think that gets at something that you just hinted at, which is going to necessarily be a problem when you're trying to figure out who to interview for the book. In this industry — well, in the world in general — we have this survivorship bias, right, where we see the people who are successful. How are
(26:05):
you going to go out and interview all the people who took these risks where it didn't pan out? One of the things that really struck me as I was reading your book was when you actually did some of these simulations, right, and you actually wrote out what the numbers looked like if you had different strategies. And some of them, I was like, holy shit, right — like, this crazy risky strategy, yes, you die ninety-five
(26:29):
percent of the time, but look at that five percent. But that five percent is mostly who you end up interviewing, right, because they become the Elon Musks of the world. So what does that do to kind of our perceptions of some of these traits?
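A minimal sketch, not one of the book's actual simulations, of the survivorship-bias point Maria is making: a strategy that ruins you 95 percent of the time can look brilliant if you only ever measure the survivors. The payoff figure, trial count, and random seed are arbitrary illustrative assumptions.

```python
import random

def risky_career(p_bust=0.95, jackpot=1_000.0):
    """One career under a go-for-broke strategy: usually ruin, rarely a jackpot."""
    return 0.0 if random.random() < p_bust else jackpot

random.seed(0)
outcomes = [risky_career() for _ in range(100_000)]
survivors = [o for o in outcomes if o > 0]

print(f"average outcome across everyone:  {sum(outcomes) / len(outcomes):.1f}")
print(f"share who went bust:              {outcomes.count(0.0) / len(outcomes):.1%}")
print(f"average outcome among survivors:  {sum(survivors) / len(survivors):.1f}")
```

The survivors' average is twenty times the population average here, which is roughly the distortion you get if the only people available to interview are the ones the strategy didn't ruin.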
Speaker 2 (26:40):
You know?
Speaker 3 (26:40):
Yeah, and so, look, this became a little bit easier — I'm just speaking kind of behind the scenes as a writer — when the SBF thing blew up; then that became easier. There was originally going to be, like, a chapter on failures, but that meant more people that, like, you know, won a poker tournament and then developed some type of problem afterward or whatever else. But SBF kind of
(27:03):
served enough of that role where, narrative-wise, I thought it was okay. I mean, there is one story about a person named Runbo Lee, who was a smart guy, university in Toronto.
Speaker 1 (27:13):
Oh, that was kind of a heartbreaking story. Yeah, go ahead.
Speaker 3 (27:15):
And he came to me — he just kind of cold emailed me — and he said, you're writing this book about the successful people, and I'm somebody who got really burned by day trading, by day trading options in particular, getting into all the Wall Street Bets kind of communities. He had lost a million dollars on options trading, and he wanted to tell his story, and I'm like, this story has to go in the book. There's also,
(27:37):
you know, the chapter on Las Vegas, which is chapter three, which is about how casinos, particularly through slot machines, kind of manipulate their patrons into spending more. I mean, the stories I heard from Natasha Schüll, who is kind of an amazing woman — she's an NYU anthropologist who had, like, never been out of the East Coast, and she's like, Las Vegas is an exotic place, I want to study
(27:58):
Las Vegas as an anthropologist. She did her dissertation about it, then wrote a book about it, and that's a story about problem gambling. And although I didn't talk to that many problem gamblers, really, you know, it's very dark about slot machine addiction.
Speaker 1 (28:14):
Yeah. Well, one of the things that really was interesting to me there, because I hadn't read Schüll's book, was this insight that you go into, which is that a lot of these problem gamblers, the people who play slot machines, know they're losing and they don't care. They just enjoy, like, they
(28:35):
enjoy that experience even though they lose. And to me, that was actually an aha moment, because I always thought that at least they thought they could win, right? Like, one of the allures was that you might be a winner. But you even said that some of them don't like winning, because it brings them out of that flow state.
Speaker 3 (28:52):
Yeah, you're in the machine zone — Natasha Schüll calls it the machine zone — where it's just you and you're pressing the reels. And, you know, I don't have much of a compulsion to play slots, but I've tossed in a few bucks here and there when you're frustrated after busting out of a poker tournament, and some of the games are very fun and compelling, right? You got, like, gorillas, or what's the big one? Not gorillas, but, uh, Buffalo.
Speaker 1 (29:15):
I've never played slots in my life.
Speaker 3 (29:18):
I'm going to force you to. I'm going to give you money to play slots. Then, Maria, you can no longer make that brag, which I find nitty, by the way. But yeah, they want to shut out all the other distractions of the world and just channel it into, like, one thing, which is part of the problem with problem
(29:39):
gambling, which is very different than, like, the table games players. The blackjack or craps player more often fits the stereotype of somebody who is bored because they don't have enough risk tolerance in their life, enough risk in their life, and so they use this as a way — and some of this is gendered too, if you want to go there, but, like, you know, men feeling the need... If you've been at enough craps tables — craps
(30:01):
I do find fun, I don't play it much — if you've seen enough groups of guys at a craps table, they'll encourage one another to make bets — you know, every craps bet is negative EV — but also to play outside their means and gamble bigger so they'll have stories to tell. I mean, that's, you know, a certain type of young man who is more likely to do that kind of thing, and it might substitute for other ways to prove
(30:22):
their bravery. But, like, the slot machine players, and then the skill game players — the poker players, the sports bettors, the handful of advantage players, which means people who try to count cards in blackjack... There are conditions — please do not try this yourself — there are conditions under which playing
(30:46):
slot machines can be plus EV, because of contingencies in the machines, or bonus payouts or jackpots that become sufficiently large. But those are one percent of the gambling world. I mean, sometimes I feel like you and I are — you know, some of the prestige resorts — generally speaking, the prestige resorts in Las Vegas and elsewhere have the best poker rooms, the Wynn, the Aria, et cetera, Resorts
(31:08):
World now. And, alas, I sometimes wonder if, like, we're used a little bit to whitewash the fact that ninety-nine percent of people in that casino are doing negative-expected-value gambling.
Speaker 1 (31:20):
No, I mean, I think that's true, and I do overlook that on purpose, because I don't like paying attention to it, right? But yeah, I think you're right. And I don't want to judge anyone for, you know, playing slots or doing whatever they're doing, but I do find it a little bit depressing when I see people who don't have an edge and who
(31:41):
are just kind of pressing buttons. Yeah.
Speaker 3 (31:44):
Look, I comfort myself by saying that poker, at least, I think, is plus EV in any utilitarian calculus for society, just because I think it is such a unique teacher of certain critical life skills, in an environment where you have something real at risk. I just think that playing poker
(32:04):
makes you more adept at handling real-life risky situations a lot of the time, you know. Apart from that, I mean, I think there's a lot in the book about sports betting. I think there are more papers coming out saying that sports betting increases bankruptcies among exactly the classes of people that you would expect. Like, you see, oh,
(32:25):
an uptick in bankruptcies in males aged twenty to thirty-nine — I wonder what could have caused that when sports betting went online in X and Y state. You know, look, I think the slot machine stuff is probably quite bad from a utilitarian standpoint. I think the state lottery is actually maybe the worst of all, on kind of a per capita basis, where the odds are very bad for the player, and the socioeconomics of who plays the
(32:48):
state lottery are — it's basically a tax on, like, lower-class people for the most part. So there's a lot of hypocrisy.
There are also lots of arguments about the arms race: hey, if we don't legalize this game, then maybe somebody else will, right? I mean, if I had my — you know, if I could perfectly calibrate it, then maybe I'd have something where you have a casino that offers
(33:10):
poker and table games but not slot machines. And maybe you have sports betting, which can be done in person at a facility but not online, where the addiction seems to be higher, and they have fair rules where you don't limit winning players, et cetera, and you have limits on how much you can take from whales. Like, there's a world which I would be, I think, more content with.
(33:31):
And again, I'm not a moralistic person. The book is not going to be scoldy, and I believe in showing and not telling. But it's not like the gambling industry comes out looking great. It comes out looking smart. It comes out looking like it understands how to model its customers. And one thing to say counter to this is that if you go to the mid-to-high-range resorts in Las Vegas,
(33:54):
my subjective view is that people seem to be having a pretty good time, and Las Vegas gets a lot of repeat business, so there's some type of win-win in that deal. But I might draw the line a little bit higher than, like, a total free-for-all, which is kind of where we're headed.
Speaker 2 (34:07):
Now.
Speaker 3 (34:09):
We'll be back in just a minute.
Speaker 1 (34:26):
I have a few questions, but first I'm just curious: are you still doing sports betting? I loved that you picked it up in a serious way for the book, although, holy shit, I had no idea that you'd wagered over a million dollars for your research, Nate. I didn't know.
(34:48):
But yeah, tell us about that, and tell us what you're doing right now.
Speaker 3 (34:49):
So I wanted to have skin in the game, to borrow another author's phrase, and see what it's actually like when you're going through the grind of sports betting an NBA season. So I bet almost the entirety of the twenty twenty-two, twenty twenty-three NBA season — all the regular season and about half the playoffs — and then FiveThirtyEight blew up, and so I had other things to do, and I didn't bet the end of the playoffs. And
(35:11):
I learned — I mean, it's probably what I should have expected, but I learned that it's pretty hard. I went on a huge heater at the start of the NBA season where I was up, like, seventy thousand bucks. I'm like, man, I'm really fucking good at this sports betting stuff. But then things changed. One thing that changed is you start to get limited by some of the sites, especially if you're trying to do things like: oh, I
(35:32):
just saw — because I'm a Twitter addict, I follow a lot of NBA writers —
Speaker 1 (35:36):
Oh.
Speaker 3 (35:37):
Damian Lillard of the Portland Trail Blazers is injured today, but the line at sportsbook X does not yet account for that. So let me go and bet three thousand dollars on this bet, which is now hugely positive EV. You can do it the first time, maybe the second time. The third time, your hands are cut off, basically, kind of literally. So when you aren't able to take advantage of things
(35:58):
that you know, in principle, you should be able to — I mean, if they're posting a line, they're taking action on it. And the fact that, like, in the NBA regular season it's all about, is this player injured, who is trying to actually win and who's not — the models that you have at the start of the season don't work as well when you're in this, like, triage phase of the trenches of the NBA regular season. And, you know, you talk to other
(36:20):
bettors, and they're like, yeah — Spanky Kyrollos is a guy who we interview in the book — I have three guys whose entire job is just to track injury data for me. Meanwhile, I'm trying to, like, do some Twitter search for, is this player in the warm-up? You know, you're competing essentially against the best players, the best bettors, in the world. So I laid about
(36:41):
one point eight million dollars in bets, and believe me, that's just — like, on average, the bet size is like twelve hundred bucks; you're making fifteen hundred or twelve hundred dollar bets or whatever. Over the course of the season, I made about five thousand bucks, and probably spent about five thousand hours on this, so I made less than minimum wage, actually, with my sports betting side hustle.
Speaker 1 (37:03):
I love it. I love it. So I want to ask two more things before we wrap up. The first is kind of a theme that — I think I picked up on this because, you know, it's something that I've obviously thought about a lot — and it's the strand that is woven throughout the book, which
(37:24):
is that even though the book is about skill, about finding edges, about how to make plus EV gambles, there is this thread throughout it of luck, right, and of the fact that you also do need to get lucky. And you actually point to a lot of these moments where things could have turned out so differently — like,
(37:44):
you know, to pick a random example, the car crash that Peter Thiel and Elon Musk were in, right? Like, what happens if the car lands differently? And I don't know if you want to talk about that specific example, but this is something that you're clearly very aware of, and I just want to raise it and talk about it a little bit.
Speaker 3 (38:03):
Yeah, there is a survivorship bias problem when you're writing a book about successful people, and literally so in the case of Elon Musk and Peter Thiel. Elon had sold his first company, bought, like, a million-dollar McLaren F1 or something, a sports car, basically almost like a, you know, Formula One-caliber car, and they're driving it
(38:23):
on, I think, Sand Hill Road or somewhere in Palo Alto, and Elon floors the accelerator while trying to change lanes, and it kind of spirals up into, like, doing a little helicopter motion and miraculously lands on its wheels, and Elon and Peter are okay, and they actually hitchhike to their VC meeting, and I guess keep PayPal alive with
(38:46):
that hitchhike. But so I asked Peter Thiel — you know, I asked him in a way that was nerdier — I asked him, if you ran the world a thousand times, had a thousand simulations, then how often would you wind up in this position? And he objected to this question and went on a whole thing about determinism or probabilism, which is interesting, but we don't have time for that today,
(39:07):
I don't think. But yeah, life is very contingent. We've talked on the show about the assassination attempt against President Trump, where that was a very close call, or things like the butterfly ballot in the two thousand election, where in one county in Florida in two thousand, the ballot had a funky design and some people who would have wanted to vote for Al Gore
(39:27):
wound up voting for Pat Buchanan instead, and that was enough to cost Gore the election. On top of that, the recount stuff in Florida wouldn't have mattered if this ballot design had been better. So life — whether it's literally, metaphysically random or not is a philosophical question — but it's random relative to our ability to determine it in a lot of ways. And it's just very hard to
(39:49):
acknowledge the role that luck plays in your life. It's very hard, even for me, or for you — maybe you're the exception, Maria — it's very hard not to tell yourself a clever narrative. You know, even in writing the book, I think I've gotten very lucky with a lot of things. To kind of have this SBF story, like, fall into my lap,
(40:12):
for example, was, I think, pretty lucky. And having a book that covered territory like AI, which kind of became a much bigger deal, and sports betting, which really blew up. I mean, you know, I think I've gotten very lucky in a lot of things recently — the way that, like, you know, I didn't know that having this independent newsletter would actually be much better economically than working for a big company. I didn't know that, and I could easily have taken a deal with another big company
(40:34):
and wound up worse off for it in a lot of respects. And so, you know, I know the role that luck has played in my life, including just things like being basically healthy, and having a supportive partner and supportive friends, and being born into a country where it's okay to be a gay man, and a country where you have some downside protection. I mean, those are all things that are very lucky. So, you know, the
(40:55):
next time that I take a cooler in poker, or lose set over set or something, then I can't complain too much.
Speaker 1 (41:03):
Well, I had more questions, but that seems like such a beautiful place to end that I think I have to not ask my remaining questions. But yeah, I mean, obviously this is something I've thought about a lot, and it was kind of the big theme of The Biggest Bluff. But it's nice to see that this
(41:23):
is something that you're thinking about as well, because I do think that it's easy for us to get overconfident. And I think that as we, you know, go through Risky Business and all of these endeavors, we do want to maximize our skill, but we do need to always remember that we are insanely lucky to even be in this position. So, Nate, I hope that you continue
(41:45):
to get lucky, and that this book sells an insane number of copies and does incredibly well, and that you and I can continue making this show together, because we're both continuing to be lucky and yet take plus EV edges whenever we can.
Speaker 3 (42:03):
Thank you so much, Maria. And, you know, maybe we'll have to have you on the show again sometime. You know, it's fun — we have a good chemistry.
Speaker 1 (42:10):
I think — I agree, I agree. So yeah, you're welcome back. How's next week looking for you?
Speaker 3 (42:16):
Actually really fucking busy, but we'll make it work.
Speaker 1 (42:30):
Let us know what you think of the show. Reach
out to us at Risky Business at pushkin dot fm.
Risky Business is hosted by me, Maria Konnikova.
Speaker 3 (42:38):
And by me, Nate Silver. The show is a co-production of Pushkin Industries and iHeartMedia. This episode was produced by Isaac Carter. Our associate producer is Sonya Gerwit. Lidia Jean Kott and Daphne Chen are our editors, and our executive producer is Jacob Goldstein. Mixing by Sarah Bruger.
Speaker 1 (42:57):
If you like the show, please rate and review us
so other people can find us too, But once again,
only if you like us. We don't want those bad
reviews out there. Thanks for tuning in.