
August 15, 2024 • 43 mins

Nate’s book “On the Edge” is out this week! (Get your copy here.) Maria interviews Nate about the origins of the book, what extreme risk takers get right and wrong, and the year Nate bet a million dollars on sports.

Check out the Risky Business YouTube channel here: https://www.youtube.com/@RiskyBusinessShow

For more from Nate and Maria, subscribe to their newsletters:

The Leap from Maria Konnikova

Silver Bulletin from Nate Silver 

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:15):
Pushkin. Welcome back to Risky Business, a show about making
better decisions. I'm Maria Konnikova.

Speaker 2 (00:30):
And I'm Nate Silver.

Speaker 1 (00:32):
So today Nate and I are going to be doing
something slightly different. Today is Tuesday, as always, which is
when we tape: August thirteenth, which is a really important
date because it's the day that Nate's book comes out.
So today, instead of being co-hosts, I am going
to get in the interviewer's seat and put Nate in
the hot seat, where I get to ask him

(00:52):
some really hard questions about his book. How does that sound?

Speaker 3 (00:55):
I'm happy to be on your podcast, Maria. I'm
glad that Nate guy is off this week.

Speaker 2 (01:00):
He's a fucking jerk.

Speaker 1 (01:05):
I really enjoyed reading On the Edge, which is out today.
So by the time that everyone is listening, I hope
you've already bought a copy, but if not, go ahead
and buy it right now. So first of all, Nate, congratulations.
I know that this was a long time coming labor
of many many years.

Speaker 2 (01:25):
Yeah.

Speaker 3 (01:25):
No, I spent basically three years of my life where
I'd say this was my main professional focus. I talked to
two hundred people, and I would recommend it: email really
smart people and ask to have a conversation with them
and pretend it's for a book, and then don't write the
book. That's probably still worth it in some sense. But no,
I actually went ahead and wrote the.

Speaker 1 (01:45):
Book, and we're very glad you did. So I'm going
to start with this: Nate, in the book, you make this
distinction between two communities. You have the village and you
have the river. Talk me through what exactly those mean.

Speaker 3 (02:02):
Let's start with the village, even though it's not the
focus of the book, just because it's a little bit
easier to define and a term that's more conventional. The village
is basically the kind of liberal, progressive East Coast establishment.
So it's the New York Times, it's Harvard University, it's
think tanks and nonprofits, it's, you know, government when you
have a Democrat in office, at least. It's kind of

(02:23):
the expert class, the professional managerial class, that whole cluster
of attributes.

Speaker 2 (02:29):
By contrast, the river is kind.

Speaker 3 (02:31):
Of the world that I suppose I inhabit, or I imagine
myself inhabiting. It's the world of calculated risk-taking, where
people are some combination of being very analytical and
wildly competitive. So poker is an archetypal, maybe the
archetypal, river activity. But as I explored this world for
the book, you know, I found that the venture capitalists

(02:53):
and the effective altruists, and the hedge fund people and
the sports bettors, even the crypto people, they all have
a common vocabulary. They all talk about things in roughly
the same ways. And so it's not a place,
it's a discrete community. I mean, in Las Vegas and
Silicon Valley you might get closer to an actual network,

(03:14):
but like they're just these people that I think are
often poorly understood by the mainstream media and yet are
incredibly wealthy and powerful. Many of the richest people in
the world have some of these river tendencies, and like,
surprisingly to me, would take my emails and my phone
calls because they kind of recognize like a fellow traveler

(03:36):
a little bit. And then what I discovered is that
the kind of conceit of the proposal, that actually there are
commonalities in how people think, was more literally true
than I had thought, and that there's more literal crossover:
poker players who become effective altruists, or hedge funders who
host poker games, or venture capitalists who do sports betting.
It's not just a metaphor for some of these things. So it's

(03:58):
a real place and a real group of people that,
I think, Maria, are in our circles, maybe not the
only circles we run in, but certainly one common type.
And I felt like I had the opportunity to tell a
really good story about these people.

Speaker 2 (04:11):
I did not. I think I got lucky.

Speaker 3 (04:13):
I think I'm running in my, like, ninety-eighth percentile
run-good for how some of these topics really blew up.
I mean AI in particular, although that was a fairly
incidental thing in the book. Effective altruism, and particularly Sam
Bankman-Fried, who I talked to first before the collapse
and then got to talk to afterwards as well.

Speaker 1 (04:34):
Yeah, so you draw some really interesting parallels between worlds
that I didn't necessarily connect, like obviously poker, sports betting.
Like yes, you know, I get why those things are related.
Then you go into finance, you go into venture capital

(04:54):
and the hedge fund world and Silicon Valley, you go
into AI and people who are kind of the AI researchers.
And I actually love your little aside where you have
the physical risk takers, yes, where you have the astronauts,
and I love that you have a female astronaut. By
the way, wait, did you do that on purpose, like did you
try to have two women or did that just so happen?

(05:15):
You know, because as you know, like it's something that
like is interesting to me. And when you're talking about
risk takers, people often assume men, and so I was
very curious if that was a deliberate choice or if
it just so happens that these most interesting stories happened
to be female.

Speaker 3 (05:31):
No, there was a deliberate choice made. I mean, look,
it's a careful balance to strike, because it is a
world that's very male. Poker's
a little bit more diverse, but the rest of the
river is quite white and/or Asian,
so there was a deliberate choice to not sugarcoat

(05:53):
this and pretend it's all fifty-fifty. But
also the people I talked to, I call them characters,
like Kathryn Sullivan. This astronaut is amazing, and she
has amazing stories. She was studying oceanography
in Nova Scotia and beat out like ten thousand other people
to become an astronaut. I mean, it's an amazing story. Katalin Karikó,

(06:14):
I had to learn how all the names are pronounced correctly for
the audiobook, but now I've forgotten them. She was this
woman who smuggled money in a teddy bear to
move out of Hungary and then was just shat upon
by the establishment while she was working on mRNA
vaccines, and eventually joined BioNTech and won
the Nobel Prize for her work on mRNA

(06:35):
vaccines. I mean, they have amazing stories. Or
someone like our mutual friend, one of the best
poker players in the world, I think, Maria Ho, who
has great stories to tell too. Or Vanessa
Selbst is another person that we know who is
just an amazing person. So yeah, I mean when you
find people like that and you want to tell their stories.
And there are also sections that tackle this directly.

(06:57):
There's one section in the poker chapter about why there aren't
more women in poker, and then one section in
the VC chapter about why there are so few Black
and Hispanic and women founders. And so you have to
take it on directly without it being a woke book, because
I'm not super woke. But there are
real shortcomings here. I mean, the VCs in particular, I think,
claim to want people who

(07:18):
have grit, who have overcome odds, and who are high variance,
who are different than the consensus, and yet they have
this prototype of, like, the on-the-spectrum, you know,
white or Asian twentysomething nerd who dropped out of
Harvard or MIT as a sophomore, and Sam Bankman-Fried played
into that stereotype. And they're not actually, I think, trying
to diversify their portfolio, even just in a purely

(07:39):
not moral but financial sense of wanting
high-variance founders.

Speaker 1 (07:44):
Yeah. No, I think one of the things that I
enjoyed about the book is that it's about how to
take risk well. But you also talk about some of
the shortcomings of the people on the River
and how they think about risk. But first let's talk
about kind of the good elements. So what does the
River have in common when it comes to evaluating risk?

(08:06):
What can we take away from them when we're thinking
about Okay, I am you know, Maria everyday person? How
do I kind of think about risk in my life?
What do I take from these people who take risks
professionally so that I can just make better decisions and
better risk assessments myself.

Speaker 3 (08:27):
So let me take the two axioms that you hear
over and over and over again in Silicon Valley, which
is very successful financially despite having all these flaws.
I mean, some of these guys are difficult people. It
is mostly guys. It's not very diverse. There are some
big flaws, right, I think their political instincts are often poor.
But they do two things, one of which is they
understand expected value, which is kind of the theme of

(08:49):
this show. They understand that if you can make a
bet that has a ten percent chance of one hundred
x payout, that's an extraordinarily good bet. And if you
can make a lot of those bets by aggregating different
companies into a fund, then you're kind of guaranteed to
make money.
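
A rough sketch of the expected-value arithmetic described here, in Python; the fund size and stake are invented for illustration and are not figures from the book:

```python
import random

# Illustrative numbers only: each bet has a 10% chance of a 100x payout,
# and a fund aggregates 30 such bets (e.g., 30 portfolio companies).
P_WIN, PAYOUT_MULT, N_BETS, STAKE = 0.10, 100, 30, 1.0

# Expected value of a single bet: 0.10 * 100 = 10x the stake.
print("EV per bet:", P_WIN * PAYOUT_MULT * STAKE)

def fund_return():
    """Total payout of one simulated fund of N_BETS independent bets."""
    return sum(PAYOUT_MULT * STAKE if random.random() < P_WIN else 0.0
               for _ in range(N_BETS))

trials = 100_000
losing_funds = sum(fund_return() < N_BETS * STAKE for _ in range(trials))
print("Share of funds that lose money:", losing_funds / trials)  # about 4%
```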

Speaker 2 (09:04):
And maybe make a lot of money.

Speaker 3 (09:06):
And it also applies to things like evaluating nuclear war
risk or AI risk, things that might be, you know,
a two percent chance over some time frame or a
five percent chance, whatever else, but have very bad
to infinitely bad negative payouts. The second thing that Silicon
Valley knows is that having a longer time horizon is

(09:27):
really worth it. We're a country that likes to make money,
but we're a get rich quick country and not a
get rich slowly country. When you're investing in startups,
early-stage startups, they often have a long time horizon. You know,
SpaceX took thirteen years, I think, to make a first profit,
and nine years before it actually even launched a rocket
successfully, or something like that. So I think you almost always

(09:47):
benefit from having like a longer time horizon than other people.
People just discount the long term future way too much,
even in their personal lives, I think. And so those,
I think, are kind of the two foremost lessons, and why
the book is, you know, kind of ambivalent-to-positive on
that Silicon Valley ethos, even though the
individual characters there are complex.

Speaker 1 (10:09):
So one of the things that you stress over and
over is that people who evaluate risk correctly and take
risks that tend to pay out over the long term,
they not only can understand the upside, but they can
also protect against the risk of ruin, right? The risk

(10:29):
of not being able to take these risks going forward.
So first I'd love for you to talk a little
bit about this idea of risk of ruin because you
talk about people who do it correctly and then people
who might not, and also like, how do you balance
that and how might that be different from person to person.

Speaker 3 (10:49):
Yeah, I mean, we have this abstract concept in poker
of a bankroll, and that is: how
much money can you spend on poker before you're kind
of poker broke, or at least you're severely limited in
the size of the games you might play and you
might have to pass up positive EV opportunities. So to
some extent, this is related to the idea of opportunity cost.

(11:11):
There's a famous formula from sports betting called the Kelly criterion,
which tells you how much you can bet to maximize
your expected value without enduring a very high risk of ruin.
Now I can get into the technicalities of why that
formula might make you a little bit more aggressive than
you should be. In general, though, like ninety percent of

(11:31):
people in the world are, I think, too risk averse
about this kind of thing. This goes back to, as
you'll know, some of the Kahneman and Tversky work
on prospect theory. It may also have an evolutionary basis,
where we weren't living in the time of abundance that
we are now, and so, you know, you
only got one shot, so it made sense to protect

(11:51):
your household. It made sense to be very cautious, because
if you get an infection, you might die. And I
think people have not adapted very much to this world of
abundant but confusing opportunity. Annie Duke
wrote a book called Quit, and she's in the book too,
also a former poker player, of course. She cites studies,
like one by Steve Levitt of Freakonomics, where they have

(12:12):
people flip a coin to make a decision, and a
major decision, not like am I getting Thai food or
Indian for dinner tonight, which I do flip a coin
about sometimes, by the way, but things like should I
leave my partner, or should I quit my job, or
should I move across the country to a different part
of the world. People are happier on average when they
make a change, and so you know, the ninety percent
of people who are risk averse maybe could take some

(12:34):
messages from that book. But then you also meet the
SBFs and the Elon Musks of the world.
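
For reference, the Kelly criterion mentioned above has a simple closed form: for a bet that pays b-to-1 with win probability p, the suggested stake is the fraction f* = p - (1 - p) / b of your bankroll. A minimal sketch with hypothetical numbers:

```python
def kelly_fraction(p, b):
    """Fraction of bankroll to stake on a bet paying b-to-1 with win
    probability p, per the Kelly criterion: f* = p - (1 - p) / b.
    A negative result means the bet has no edge and shouldn't be made."""
    return p - (1 - p) / b

# Hypothetical example: a 55% chance to win an even-money (1-to-1) bet.
print(kelly_fraction(0.55, 1.0))      # 0.10 -> stake 10% of bankroll
# Full Kelly assumes your probability estimate is exact and still produces
# wild swings, which is one reason many bettors stake half Kelly or less.
print(kelly_fraction(0.55, 1.0) / 2)  # 0.05 -> a more conservative stake
```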

Speaker 1 (12:39):
And let's actually unwrap that last bit a little bit.
The SBFs and the Elon Musks, because you write about
the fact that, you know, obviously people
don't necessarily lump them together, right, because Elon Musk
is someone who is the richest man in the world, and

(13:00):
then SBF is going to jail. However, they do have
this one characteristic in common, so I would I would
love for you to expand on that alone.

Speaker 3 (13:08):
Yeah, and I should be clear, I mean, I am very, and the book is very,

Speaker 2 (13:13):
Unsympathetic to SBF.

Speaker 3 (13:16):
I'm not saying I'm sympathetic to Elon, but Elon has
accomplished a lot of things, and I think Elon's a
real person whose career wasn't based on fraud. I mean, I
don't think Sam's whole career was based on fraud either, but,
you know, Elon has not committed ten billion dollars worth
of stealing people's crypto deposits, basically. Yeah, they are the
two people that really, really, really go balls to the wall.

(13:38):
I think we can use that phrase in terms of risk.

Speaker 2 (13:40):
Absolutely.

Speaker 3 (13:41):
If you read the Walter Isaacson book on Elon Musk,
his poker strategy is literally going all in every hand
until he runs out of money or wins it all.

Speaker 1 (13:49):
I didn't know that. That was crazy to me.

Speaker 3 (13:52):
It makes sense. So I've played in the All-In
Podcast poker game a couple of times with some
of Elon's buddies, and that's a very aggressive game, shall
we say. I'm forbidden to reveal details, but there's a
lot of blind, naked aggression there. SBF is even
more unambiguously, I think, irrational in taking a very literal

(14:16):
approach to utilitarianism, where, you know, he has said repeatedly,
he told his co-founder Caroline Ellison this, he told
the economist Tyler Cowen this, that if he could flip
a coin to determine the fate of the world, where
half the time the world is destroyed, but the
other half the time the world is like two
point oh one x as good, then he's calculating the

(14:39):
expected value. He's so committed to that that, hey, you know,
one point five is greater than one. Therefore you take
the coin flip, even though you might destroy the world,
and in fact, you take that coin flip repeatedly, over
and over again. This is called the Saint Petersburg paradox,
where it's a bet that has infinite expected value, but
it is almost certain to leave you in ruins.

Speaker 2 (14:58):
So what should you do? Probably not take the bet.
It's not that hard a dilemma, actually, but to SBF
it was.
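
A small sketch of the coin-flip logic being described, using the two point oh one x figure from the conversation; the numbers are purely illustrative:

```python
# Each flip destroys everything with probability 0.5, or multiplies value
# by 2.01 with probability 0.5 (figures as quoted in the conversation).
P_DOOM, UPSIDE = 0.5, 2.01

# A single flip has positive expected value: 0.5 * 0 + 0.5 * 2.01 = 1.005x.
ev_per_flip = (1 - P_DOOM) * UPSIDE
print("EV of one flip:", ev_per_flip)

# But surviving n flips requires winning every one of them, so taking the
# bet "repeatedly, over and over again" ends in ruin almost surely, even
# though the expected value keeps growing.
for n in (1, 10, 50):
    print(f"{n} flips: EV {ev_per_flip ** n:.3f}, "
          f"survival probability {(1 - P_DOOM) ** n:.2e}")
```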

Speaker 3 (15:04):
And in my interviews with him, you kind of find
a pattern of him saying this. I mean, the first interview
that I did with him, when he's riding high, January
twenty twenty two, Bitcoin still near its all-time peaks,
he's like, if

Speaker 2 (15:15):
You're not really literally willing to ruin.

Speaker 3 (15:18):
Yourself, not the fake ruined, Like if Elon Musk has
to sell Twitter and he's only worth one hundred and
seventy billion dollars instead of two hundred and twenty billion dollars,
that's not what I would describe as being ruined. And
Sam was very clear that I don't think you should
protect your downside. You have to be so risk loving
that you should literally be willing to like destroy your

(15:38):
life and your reputation if it's positive expected value, which
I think is insane. I think it reflects, I mean, not
a clinical diagnosis, but I think
he's a somewhat defective person emotionally in other ways. And
you know, the fact that he was enabled by so
many supposedly smart actors is a core question that's asked

(16:02):
by the book. I mean, you know, the New York
Times review understood this, which is that it's like the
book is actually pretty unsparing to some of these characters
in the River, even though it's not a book
where I'm gonna be like, Peter Thiel is evil and
Elon Musk is evil and a fascist. That's not
my style of writing. And I think I think they're
interesting people who deserve to have their stories told by

(16:24):
someone who kind of understands the personality type. But SBF
was a really dangerous figure and on top of everything else,
like also a fairly bad calculator of risk. His decision
to go take the witness stand at his trial after
the government's testimony was just incredibly devastating, and they had
him dead to rights and contradicting everything he told people publicly,

(16:46):
everything he told me in interviews that were on the record,
although with an embargo, I mean, they couldn't be used
until the book was released. You know, the willingness to lie,
plus the constant miscalculations he made where he probably could
have avoided jail time by conceding defeat and not taking
the witness stand. Not avoided all the jail time, but
ten years or five years and not twenty. You know,

(17:07):
why did so many people vouch for him?

Speaker 2 (17:10):
That's a question that I think has to be pondered.

Speaker 1 (17:14):
We'll be back after a quick break. You actually touched
on something there which I think is important to unwrap
a little bit more, which is that if you're taking
these huge gambles, you better be pretty damn sure your

(17:36):
calculations are accurate, that your percentages are accurate. But we're
living in a world where you can't do that, and
so I'm curious. You know, you build models, right that
try to kind of model things that are uncertain, that
are unknowable, that have this margin of error, And so
I think you understand more than most people that you know,

(17:57):
if you say sixty percent certain, or like a sixty percent
chance of this, you're not actually like that. There
is a margin of error there, right? Like if you're
fifty-one percent likely that you're going to have a
plus EV and forty-nine percent not, what if you're wrong, right? Like,
what if it's forty-nine that you're actually
negative EV and fifty-one that you're going to destroy civilization?

(18:20):
So I'd love you to walk us through that thinking
and through kind of the hubris that you both need
to succeed in some ways in this area, but that also
might make you overconfident about these sorts of
percentages where you really, really can't afford to be.
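
A rough illustration of the point Maria is raising, with hypothetical numbers: near the break-even point, a small error in the estimated probability flips the sign of the expected value.

```python
def expected_value(p_good, payoff_good, payoff_bad):
    """EV of a gamble where the good outcome happens with probability p_good."""
    return p_good * payoff_good + (1 - p_good) * payoff_bad

# A near-coin-flip gamble: win 1 unit or lose 1 unit.
print(expected_value(0.51, +1, -1))     # +0.02: barely worth taking
print(expected_value(0.49, +1, -1))     # -0.02: a 2-point estimation error flips the sign
# With a catastrophic downside, a thin edge is nowhere near enough.
print(expected_value(0.51, +1, -1000))  # about -489.5
```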

Speaker 3 (18:34):
Yeah, look, we've seen bad election models. We saw models
that gave Trump a one
percent chance or zero point one percent chance of winning
in twenty sixteen. We saw models earlier this year that
gave Biden, even after he had tanked in the debate,
a fifty to fifty chance of winning, which didn't make
any sense. Even building models for these kinds of closed

(18:55):
problems like elections or sports forecasting is pretty tough. You
have a lot of choices to make as a modeler,
your biases might creep in and other things. So to
then kind of take these very back of the envelope
informal models and try to apply them to every problem

(19:16):
in the world is you know, obviously prone to going
wrong to some extent, and like in effective altruism, they're
very literal about this. I talked to Will MacAskill,
who was kind of one of the founders, or at
least the brand name, of EA, and wrote a book
called What We Owe the Future. He calls himself a
longtermist, meaning he's concerned about the very, very, very
far future. The first time I talked to him, he

(19:38):
was like, okay, so how do we weigh like a
human's life against an animal's life? And he's like, well,
it should be based on the number of neurons in
the animal's brain, or maybe the number of neurons to
the one third power or something like that. It's a
good approximation, but that leads to weird things, like an
elephant.

Speaker 2 (19:56):
Elephants are actually.

Speaker 3 (19:57):
With more than human beings by that calculation, it turns out,
but you also face some uncomfortable decisions. The book uses
a story of a time when a dog named I
don't I.

Speaker 1 (20:07):
Didn't know, Yeah I did it realize that the poodle
like I didn't know this story, so please, yeah, please.

Speaker 3 (20:13):
There's a poodle named Dakota that gets loose
in Prospect Park or some park in North Brooklyn from
its owner and dashes into the subway at the first
stop inbound on the F train from Manhattan to Brooklyn.
And this is happening at like three thirty on
a weekday, so you're starting to get the rush hour commute.
And the question is should we shut down the entire

(20:35):
F line in order to search for and rescue Dakota,
which they did, and Dakota popped up at some other
subway station an hour or so later. But that's a
case where, and I put this in the book, if
you had, like, a human baby
who had fallen onto the tracks, like, no one would
dispute that we should halt every train till that baby

(20:58):
is found. If it was a squirrel, then we would
just run over the squirrel, right, no questions asked.

Speaker 1 (21:03):
We would we would? I mean, I hate squirrels.

Speaker 3 (21:06):
And the dog is intuitively kind of a close decision,
and so. But the fact is
that you have to make a decision, and we had
to make these decisions in COVID, where you know, what's
the cost of shutting down schools versus the cost
of preventing some degree of sickness and death, right, I
mean that's a real that's a real trade off.

Speaker 2 (21:26):
Or things that are more intangible.

Speaker 3 (21:28):
What's the cost of undermining people's happiness because they
can't, or at least they're not supposed to, see their friends
in person during COVID, or they can't go to a
baseball game, or go to a restaurant, or see their
dying relative in a hospital, or go to church,
all these other things? Like, what's the cost of that?
And we make these decisions with, you know, different

(21:49):
heuristics and different rules of thumb that are often very
clumsy and maybe orders of magnitude wrong. I think
the decision to keep schools shut down after vaccines were
available is like wrong by like ten x or one
hundred x, like unambiguously a terrible decision that some Blue
states made even after vaccine availability. So you kind of
get in this dilemma where you

(22:12):
have to make decisions, you have to make some calculation.
But I think people underestimate how back of the envelope
these are. And when you start to put numbers to things,
people sometimes stop asking questions about models where they
should ask more questions.

Speaker 1 (22:27):
Is there something that you would recommend, because this is
something that I've come across, you know, many times,
a wall I hit in a lot of my research.
Is there something that you have seen in people who
are successful where they are able to modulate or at
least like self check their overconfidence and their hubris, because

(22:47):
it seems like most of the people on the river
don't do that, right, Like Elon Musk, absolutely not, Like
Peter Thiel, absolutely not. Like all of these men who
you talk about, they don't seem to have a switch
where they can be like, Okay, you know what, let
me take a step back and self evaluate. Did you
find anyone who was able to do that? And if so,
how like how do you do that?

Speaker 3 (23:10):
I mean, in some ways, some of the poker players
I talked to seem like they're more at ease with
themselves and can modulate it somehow.

Speaker 1 (23:18):
You know.

Speaker 3 (23:18):
Jason Koon, for example, is an interesting poker player who
has some very you know, had a rough upbringing, has
some very aggressive tendencies, but I think has some type
of perspective of there's this very competitive part of him
that he understands can be unleashed, but has to be
you know, meted out in doses somehow. Look, there
is some bias in the book in the following sense,

(23:39):
which is that the people who are quote unquote crazy
or the people who really you know, like to talk
about themselves and take crazy risks, make for better stories.
So yeah, I mean I'm thinking about like, you know,
Michael Moritz is a Sequoia Capital venture capitalist who's been
very successful, an early investor in Google and many other companies.
He's a guy who I think is probably pretty measured actually.

(24:01):
A former journalist. Vinod Khosla is another
one, who invests in kind of alternative things like alternative
energy or, you know, what's it called, artificial meat,
things like that. I mean, he seems to be
reasonably well balanced. Patrick Collison, the founder of Stripe. But
and these people are all in the book. But because

(24:22):
they're less bombastic, they maybe get a little
bit less of a write-up from me, and then
there's kind of a lost-in-translation problem too.
But yeah, look, overconfidence is a really big problem for
people in the river because by definition, these are people
who probably won their first couple of big bets, sometimes

(24:43):
by skill, sometimes by luck. I'd say often more luck
than skill, but usually some of both. And when you're
on a winning streak, then you can think that you're
God's gift to whatever venture that you happen to be
involved in, and you can be overconfident. And because there
is, like, fruitful territory. I mean, the
village leaves a lot to be desired for how it
assesses risk and does other things, and so you know,

(25:05):
in some ways the critiques that these two
communities have of one another are both pretty smart. I mean,
I think the River's critique of the village, that these
guys are too risk averse, that they're partisan, that they're
too concerned with social perception, I think that's basically right.
But also some of the Silicon Valley founders and VCs
are total arrogant pricks.

Speaker 1 (25:28):
Yeah, that's one way of putting it. And I think
that was something that you just hinted at,
which is, I think, going to necessarily be a
problem when you're trying to figure out who to interview
for the book. In this industry, well,
in the world in general, we have this survivorship bias, right,
where we see the people who are successful. How are

(25:51):
you going to go out and interview all the people
who took these risks where it didn't pan out? One
of the things that really struck me as I was
reading your book was when you actually did some of
these simulations, right, and you actually wrote out what the
numbers looked like if you had different strategies, and some
of them I was like, holy shit, right, like this
crazy risky strategy, Like, yes, you die ninety five percent

(26:14):
of the time, but like, look at that five percent.
But that five percent is mostly who you end up interviewing, right,
because they become the elon Musks of the world. So
what does that do to kind of our perceptions of
some of these traits? You know?

Speaker 3 (26:26):
Yeah, and so look, this became a little bit easier.
I'm just speaking kind of behind the scenes as a writer.
When the SBF thing blew up, then that became easier.
There was originally going to be like a chapter on failures,
but that meant more people that, like, you know, won
a poker tournament and then developed some type of
problem afterward or whatever else. But like SBF kind of

(26:48):
served enough of that role where, great, narrative-wise, I

Speaker 2 (26:51):
Thought it was okay.

Speaker 3 (26:51):
I mean, there is one story about a person named
run bow Lee who was a smart guy.

Speaker 2 (26:57):
You never see a trun that was.

Speaker 1 (26:59):
That was kind of a heartbreaking story. Yeah, go ahead.

Speaker 3 (27:01):
And he came to me and he said, you're writing
this book about these successful people. He just kind of cold
emailed me. And he's like, I'm somebody who got really
burned by day trading, by day trading options in
particular, getting into all the WallStreetBets kind of communities.
And he had lost a million dollars on options trading,
and he wanted to tell his story, and I'm like,
this story has to go in the book. There's also,

(27:22):
you know, the chapter on Las Vegas, which is chapter three,
is about how casinos, particularly through slot machines, kind of
manipulate their patrons into spending more. I mean, the stories
I heard from Natasha Schüll, who is kind of
an amazing woman. She's an NYU anthropologist who had
never been off the East Coast and she's like,
Las Vegas is an exotic place. I want to study

(27:44):
Las Vegas as an anthropologist. Did her dissertation about it,
then wrote a book about it, and that's a story
about problem gambling. And although I didn't talk to that
many problem gamblers, really you know, it's very dark about
slot machine addiction.

Speaker 1 (27:59):
Yeah. Well, one of the things actually that really was
interesting to me there, because I hadn't read Schüll's book,
was this insight that you go into, which
is that a lot of these problem gamblers, these people
who play slot machines, they know they're losing, and they
don't care. They just, like, enjoy, like they

(28:20):
enjoy that experience even though they lose. And to me,
like that was actually an aha moment, because I always
thought that at least they thought they could win, right?
But, like, one of the other things was that, like, you
might be a winner, but you even said that some
of them don't like winning because it brings them out
of that flow state.

Speaker 3 (28:38):
Yeah, you're in the machine zone. Natasha Schüll calls it
the machine zone where it's just you and you're pressing
the reels and you know, I don't have much of
a compulsion to play slots, but I've tossed in a
few bucks.

Speaker 2 (28:50):
Here and there, when you're frustrated in a poker

Speaker 3 (28:51):
Tournament, and some of the games are very fun and compelling,
right? You've got like gorillas, or what's the big one?
Not gorillas, but, uh, Buffalo. Buffalo.

Speaker 1 (29:00):
I've never, I've never played slots in my life, ever.

Speaker 2 (29:04):
I'm going to force you to.

Speaker 3 (29:05):
I'm going to give you money to play slots, I think, Maria,
so you can no longer make that brag, which I
find nitty, by the way. But yeah, they want
to shut out all the other distractions of the world
and just channel it into, like, one thing, which is
part of problem gambling, which is very different than, like,

Speaker 2 (29:24):
The table games players.

Speaker 3 (29:26):
The blackjack or craps player more often fits the stereotype of
somebody who is bored because they don't have enough risk
tolerance in their life, enough risk in their life, and
so they use this as a way, and some of this
is gendered too, if you want to go there,
but, like, you know, men feeling the need. If you've
seen enough craps tables, craps I do find fun, I don't

(29:47):
play it much. If you've seen enough groups of guys
at a craps table where they'll encourage one
another to make bets, you know, every craps bet is
negative EV, but also to play outside their means and
gamble bigger, so they'll have stories to tell. I mean
that's you know, that's a certain type of young man
is more likely to do that kind of thing, and
it might substitute for other types of ways to prove

(30:08):
their bravery. But, like, the slot machine players, and then
the skill game players, the poker players, the sports bettors,
the handful of advantage players, which means people who try
to count cards in blackjack, there are conditions.

Speaker 2 (30:22):
Please do not try this yourself.

Speaker 3 (30:25):
There are conditions under which playing slot machines can be
plus EV, because of contingencies in the machines, or bonus payouts
or jackpots that become sufficiently large. But those
are one percent of the gambling world. I mean, sometimes
I feel like you and I are used. You know,
some of the prestige resorts, generally speaking, the prestige resorts

(30:47):
in Las Vegas and elsewhere have the best poker rooms,
the Wynn, the Aria, et cetera, Resorts World, the Bellagio.
I sometimes wonder if we're used a little bit
to whitewash the fact that ninety-nine percent of
people in that casino are doing negative expected value gambling.

Speaker 1 (31:06):
Now, I mean, I think that's true, and I
do overlook that on purpose because I
don't like paying attention to that, right, But yeah, I know,
I think you're right, And I don't want to judge
anyone for you know, playing slots or doing whatever they're doing.
But I do find it a little bit depressing when
I see people who don't have an edge and who

(31:27):
are just kind of pressing buttons.

Speaker 3 (31:29):
Yeah. Look, I comfort myself by saying that poker, at
least I think is plus EV in any utilitarian calculus for society,
just because I think it is such a unique teacher
of certain critical life skills in an environment where you
have something real at risk. I
just think that playing poker

(31:50):
makes you more adept at handling real life risky situations
a lot of the time.

Speaker 2 (31:56):
You know.

Speaker 3 (31:57):
Apart from that, I mean, I think there's a lot
in the book about sports betting. I think there's more
papers coming out saying that sports betting increases bankruptcies among
exactly the classes of people that you would expect, Like
you see an uptick in bankruptcies in males aged twenty
to thirty-nine, and what else could have caused
that, when sports betting went online in X and Y state?

(32:19):
You know, look, I think the slot machine stuff is
probably quite bad from a utilitarian standpoint. I think the
state lottery is actually maybe the worst of all on
a kind of a per capita basis, where the odds
are very bad for the player, and the socioeconomics
of who plays the state lottery are, it's basically a tax
on lower-class people for the most part, so
there's a lot of hypocrisy. There are also lots of

(32:41):
arguments about the arms race that hey, if we don't
legalize this game, then maybe somebody else will, right? I mean,
if I had my, you know, if I could perfectly
calibrate it, then maybe I'd have something where you have
a casino that offers poker and table games but not
slot machines. And maybe you have sports betting, which can
be done in person at a facility but not online

(33:04):
where the addiction seems to be higher, and they have
fair rules where you don't limit winning players, et cetera,
and you have limits on how much you can take from whales.
Like, there's a world which I would be, I
think, more content with. And again, I'm not a moralistic person.
The book is not going to be scoldy, and I
believe in showing and not telling. But, like, it's not
like the gambling industry comes out looking great. It

(33:27):
comes out looking smart, it comes out looking like it
understands how to model its customers. And one
thing I would say to counter this is
that if you go to the mid to high range
resorts in Las Vegas, my subjective view is that people
seem to be having a pretty good time, and Las
Vegas gets a lot of repeat business, so there's some
type of win win in that deal. But I might,

(33:48):
I might draw the line a little bit higher than
like a total free-for-all, which is kind of
where we're headed now.

Speaker 2 (33:54):
We will be back in just a minute.

Speaker 1 (34:12):
I have a few questions, but first I'm just curious
are you still doing sports betting? So I loved that
you picked it up in a serious way for the book,
although holy shit, I had no idea that you'd wagered
over a million dollars for your research, Nate, I didn't know,
but yeah, tell us about that and tell us what

(34:33):
you're doing right now.

Speaker 3 (34:35):
So I wanted to have skin in the game, to
borrow another author's phrase and see what it's actually like
when you're going through the grind of sports betting an
NBA season. So I bet almost the entirety of the
twenty twenty two to twenty three NBA season, all the
regular season, in about half the playoffs, and then five
thirty eight blew up, and so I had other things
to do, and I didn't bet the end of the playoffs,

(34:56):
and I learned, I mean, it's probably what I
should have expected, but I learned that it's pretty hard.
I went on a huge heater at the start of
the NBA season where I was up like seventy thousand bucks.
I'm like, man, I'm really fucking good at the sports
betting stuff.

Speaker 2 (35:11):
But then but then things change.

Speaker 3 (35:12):
One thing that changes is you start to get limited
by some of the sites, especially if you're trying to
do things like, oh, I just, look, because I'm a
Twitter addict, I follow a lot of NBA writers: oh,
Damian Lillard of the Portland Trail Blazers is injured today, but
the line at sportsbook X does not yet account for that,
so let me go and bet three thousand dollars on
this bet, which is now hugely positive EV. You can

(35:34):
do it the first time, maybe the second time. The
third time, your hands are cut off basically kind of literally.
So you aren't able to take advantage of things
that, you know, in principle, I think you should be able to.
I mean, if they're posting a line, they're taking action
on it. And the fact is, like, in the NBA
regular season, it's all about: is this player injured? Who
is trying to actually win and who's not? The models

(35:56):
that you have at the start of the season don't
work as well when you're kind of in this like
triage phase of the trenches of the NBA
regular season. And, you know, you talk to other
bettors, like, yeah, I have three guys whose entire job,
Spanky here, this is a guy who we interview
in the book, I have three guys who just track
injury data for me. I mean, while I'm trying to,
like, do some Twitter search for, is this player in

(36:16):
the warm up? You know, you're competing essentially against the
best players, the best bettors in the world. So I
wagered one point eight million dollars in bets, and
believe me, this is just, like, on average the bet size
is like twelve hundred bucks; you're making fifteen hundred
twelve-hundred-dollar bets or whatever. Over the course of the season,

(36:37):
I made about five thousand bucks, probably spent about five
thousand hours on this, so I made less than minimum
wage actually with my sports betting side hustle.
I love it.
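
For the record, the back-of-the-envelope arithmetic behind the figures Nate just gave works out roughly like this (his numbers as stated, rounded):

```python
# Rough arithmetic behind the season figures stated above; these are the
# quoted numbers, not new data.
total_wagered = 1_800_000   # dollars in bets over the season
avg_bet_size  = 1_200       # typical bet, in dollars
profit        = 5_000       # approximate season profit, in dollars
hours_spent   = 5_000       # approximate hours spent

print("Number of bets:", total_wagered / avg_bet_size)     # ~1,500
print("Return on money wagered:", profit / total_wagered)  # ~0.3%
print("Effective hourly rate ($):", profit / hours_spent)  # ~$1/hour
```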

Speaker 2 (36:49):
I love it.

Speaker 1 (36:50):
So I want to ask two more things before we
wrap up. The first is kind of a theme that
I and I think I picked up on this because
you know, it's something that I've obviously thought about a lot,
a lot, And it's the strand that is woven throughout
the book, which is that even though the book is

(37:11):
about skill, about finding edges, about how to make plus
ev gambles, there is this thread throughout it of luck,
right and of the fact that you also do need
to get lucky, and you actually point to
a lot of these moments where things could have turned
out so differently, like, you know, to pick a random example,
the car crash that Peter Thiel and Elon Musk were in, right,

(37:35):
Like what happens if the car lands differently. And I
don't know if you want to talk about that specific example,
but this is something that you're clearly very aware of,
and I just want to raise it and talk about
it a little bit.

Speaker 3 (37:49):
Yeah, there is a survivorship bias problem when you're writing
a book about successful people, and literally kind of in
the case of Elon Musk and Peter Thiel: Elon had
sold his first company, bought like a million-dollar McLaren
F1 or something, a sports car, basically almost like a,
you know, Formula One-caliber car, and they're driving

(38:09):
on, I think, Sand Hill Road or somewhere in
Palo Alto, and Elon floors the accelerator while trying
to change lanes, and it kind of spirals up into
doing a little helicopter motion and miraculously lands on
its wheels, and Elon and Peter are okay, and they
actually hitchhike to their VC meeting, and I guess keep PayPal

(38:31):
alive with that hitchhike. So I asked Peter
Thiel, you know, I asked him in a way that
was nerdier. I asked him, if you ran the world
a thousand times, had a thousand simulations, then how often
would you wind up in this position? And he objected
to this question and went on a whole thing about
determinism versus probabilism, which is interesting, but we don't have

(38:51):
time for today, I don't think. But yeah, life is
very contingent. We've talked on the show about the assassination
attempt against President Trump where that was a very close call,
or things like the butterfly ballot in the two thousand election,
where on one ballot in one county in Florida
in two thousand, the ballot had a funky design where
some people who would have wanted to vote for Al

(39:12):
Gore wound up voting for Pat Buchanan instead, and that
was enough to cost Gore the election. All of
the recount stuff in Florida wouldn't have mattered if this
ballot design had been better.

Speaker 2 (39:22):
So life is.

Speaker 3 (39:23):
Whether it's literally, metaphysically random or not is a philosophical question,
but it's random relative to our ability to determine it in
a lot of ways. And it's just it's just very
hard to acknowledge the role that luck plays in your life.
It's very hard even for me or for you. Maybe
you're the exception, Maria, it's very hard not to tell

(39:44):
yourself a clever narrative in which in which you know,
even in writing the book I have. I think I've
gotten very lucky with a lot of things to kind
of have, like this SBF story, like fall into into
My Lap, for example, was I think pretty lucky. And
having a book that covered territory like AI that kind
of became a much bigger deal, and sports betting really

(40:05):
blew up. I mean, you know, so I think I've
gotten very lucky in a lot of things recently, in the
way that, like, you know, I didn't know that
having this independent newsletter is actually much better economically
than working for a big company. I didn't know that,
and I could easily have taken a deal with another
big company and wound up worse off for it in
a lot of respects. And so, you know, I know
the role that luck has played in my life, including

(40:25):
just things like being basically healthy and having a supportive
partner and supportive friends, and being born into a country
where it's okay to be a gay man and a
country where you have some downside protection. I mean,
those are all things that are very lucky. So
you know, the next time that I take a cooler
in poker or lose set over set or something,

(40:46):
then I can't complain too much.

Speaker 1 (40:49):
Well, I had more questions, but that seems like such
a beautiful place to end that I think I have
to not ask my remaining question. But yeah, I mean,
obviously this is something I've thought about a lot and
was kind of the big theme of the Biggest Bluff.
But it's nice to see that, you know, this is

(41:09):
something that you're thinking about as well, because I do
think that it's easy for us to get overconfident.
And I think that as we, you know, do Risky
Business and all of these endeavors, we do want
to maximize our skill, but we do need to always
remember that we are insanely lucky to even be
in this position. So Nate, I hope that you

(41:30):
continue to get lucky, and that this book sells an
insane number of copies and that it does incredibly well,
and that you and I can continue making this show
together because we're both continuing to be lucky and yet
take plus-EV edges whenever we can.

Speaker 3 (41:49):
Thank you so much, Maria, and you know, maybe,
if you want, we'll have you on the show again sometime.

Speaker 2 (41:53):
You know, I had a lot of fun. We have
good chemistry.

Speaker 1 (41:56):
I think, I agree, I agree. So yeah, you're you're
welcome back. How's next week looking for you?

Speaker 2 (42:01):
Actually really fucking busy, but we'll make it work, all right.

Speaker 1 (42:04):
Sounds good, sounds good, Thanks so much, Nate. Oh, by
the way, before you go, last week, Nate and I
recorded a special video only podcast where we talked about
Kamala's pick of Walz as her VP. So you can
find that on our YouTube channel. We hope you enjoy it.

Speaker 3 (42:35):
Risky Business is hosted by me Nate Silver and me
Maria Konnikova. The show is a co-production of Pushkin
Industries and iHeartMedia. This episode was produced by Isabel Carterer.
Our associate producer is Gabriel Hunter-Chen. Our engineer is
Sarah Bruger. Our executive producer is Jacob Goldstein.

Speaker 1 (42:53):
If you want to listen to an ad-free version,
sign up for Pushkin Plus for six ninety-nine a month.
You get access to ad-free listening. Thanks for tuning
in.
