Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:15):
Pushkin. Welcome back to Risky Business, a show about making
better decisions. I'm Maria Konnikova and I'm Nate Silver.
Speaker 2 (00:30):
We're going to kick the show off with a lighthearted topic:
what is the chance that artificial intelligence will destroy every
last human being and all animal species on Earth?
Speaker 1 (00:38):
Yeah, Nate, that is a very very lighthearted area of discussion.
I'm looking forward to it. Then we're going to move
on to the presidential debates and how important they are
and what they mean for the elections.
Speaker 2 (00:49):
And then for our serious segment, the World Series of Poker
begins next week in Las Vegas. We're both going to
be there. I already got my wire transfer in to whatever
new payment provider the World Series uses, so we'll do
a preview of the series on today's show.
Speaker 1 (01:05):
All right, Nate. So you know, I did not think
that twenty twenty-four was going to be the year that
I would be talking about Scarlett Johansson in reference to
artificial intelligence and OpenAI. But here we are, because
OpenAI, which is one of the biggest companies in
AI research and the firm behind ChatGPT, just unveiled
(01:26):
the new voice of their newest AI assistant, and that
voice sounds eerily like Scarlett Johansson, which I guess is
not surprising, since there was this movie Her where Scarlett
played this AI assistant who becomes very, very human-like
and engenders very human-like emotions in her co-stars. Right,
(01:48):
and then we see this announcement from Sam Altman, who
is the founder of OpenAI, about their new voice
called Sky, and I want to go back to the
name of the voice in a second when we talk
about the more substantive issues at OpenAI, but he
references Her when he tweets about it, and the voice
(02:10):
sounds eerily like Scarlett, and she issues a statement that says, hey, guys,
you know, that is not me. I did not agree
to this, and we're getting some lawyers involved. And then
the voice is very quickly taken down.
Speaker 2 (02:22):
Yeah, because he had tried to make a proper deal
with her. She had said nah, cold feet. And then
they come out anyway with this product that sounds a
lot like her. So, I mean, look, this comes
up in my book a little bit, which I'm always pitching,
I guess, on the show. But I think Silicon Valley
does not understand politics very well. Right, you have this
(02:42):
very popular actress; people left, right, and center, literally in my
Twitter feed, were pretty upset about how brazen Sam Altman
and OpenAI looked about this, and that matters. I mean,
Congress has a lot of power to regulate AI. There
is actually broad bipartisan support for more AI regulation.
They think AI might be dangerous in different ways, maybe
(03:03):
not quite in the way that the people we'll talk
about in a moment, the doomers, think about it, but
people think it will take their jobs, and people think
it will make misinformation worse, all types of problems, algorithmic bias.
So people have to be careful.
I mean, look, I kind of hate politics too, but
unfortunately they have a lot of regulatory powers. You have
to put up with some of that
(03:24):
stupidity and not egg them on by doing something twice as
stupid yourself.
Speaker 1 (03:29):
That's absolutely right, and Nate, I am going to show
the world how much you're already rubbing off on me.
In terms of using a sports analogy, this is akin
to what we would call an unforced error, right, like
this did not have to happen. And this is in
a week where OpenAI desperately, desperately needs some positive publicity,
because what we really want to be talking about isn't
(03:51):
Scarlett Johansson, as gorgeous as she is and as amazing
as her voice is and as much as we love her.
But the other really, really big news from OpenAI
this week, which is that one of their co-founders
and chief scientist, Ilya Sutskever, has resigned. And he is
not the only one.
Speaker 2 (04:10):
Yeah, I'm not going to pretend I know all these names.
It's not like they're NBA players, or I know how to pronounce them,
but yeah, you have a Leopold Aschenbrenner, a Pavel Izmailov, all
these people that, again, I don't know personally, but that
credible people think are safety-oriented OpenAI researchers, and
a half dozen of them have left in the past
(04:31):
half year or so. It's a pretty bad sign for
a company that's marketing itself as once having been a
nonprofit that was concerned with doing AI the right way
that people in that coalition are now leaving
in some significant number.
Speaker 1 (04:45):
So the reason that people, including the two of us,
are so incredibly worried about this or thinking about this
and trying to figure out should we be worried about
this is because this was the Superalignment team. This
was the team responsible for making sure that AI doesn't
destroy the world. And by the way, I'm just going
to harken back to our Scarlett Johansson discussion for one second.
(05:08):
They called it Sky, and like, were they trying to
refer to Skynet as well? Like, were they just trying
to put a Terminator reference right in there,
saying like, hey guys, we're unleashing something that might destroy
the world.
Speaker 2 (05:21):
So I think there's a detached irony. Like, look, I
think Sam Altman's Twitter feed is kind of funny, right,
and he's combative and he battles.
He's a poster, which, if you're like me, who the
fuck cares? If you're running OpenAI, then maybe you
have to be a little bit more buttoned down. And
they're kind of aloof, right. San Francisco and Silicon
(05:44):
Valley have always been kind of detached from the rest
of the country. They're all very smart, all very rich.
They're also very you know, male for the most part,
They're probably in a bubble in certain ways. And I
don't know, like I said before, I might think Congress
is stupid. I kind of sympathize more with the Silicon
Valley side of the argument in many ways than the
(06:05):
DC side. I think they're more innovative. I think technology
has been good for society, for all of us. But I
think there is a lot of arrogance there that might
not end well. You look at what Congress did to Napster,
for example, or other businesses that were seen as having
gotten out of line. And there is a pretty broad
bipartisan sense of wariness about AI, for different reasons. Democrats
(06:26):
might be concerned about algorithmic bias. Both parties might be
concerned about jobs being taken over. But the political climate
on this could shift quite quickly, and that would be
very bad for Sam Altman's business.
Speaker 1 (06:37):
So let's talk about this concept known as p(doom).
So do you want to just first say, like,
what is p(doom)?
Speaker 2 (06:45):
I pronounce it more like "pay doom" because, look, to me,
"pee doom" sounds like a urinary tract problem or something.
Speaker 1 (06:52):
All right, p(doom).
Speaker 2 (06:54):
So p(doom) is a shorthand for probability of doom,
which is a shorthand for: what's the chance that AI
destroys civilization or substantially impairs civilization, which is actually
a pretty big distinction. People have different definitions. Eliezer Yudkowsky,
one of the most prominent doomers, meaning people who are
(07:15):
very worried about AI risk, defines it literally as all
human beings and animal species perishing, right, every single living being.
He said that in a Time magazine article last year,
I think, so he's being very literal. Someone like Ajeya Cotra,
who is an AI researcher, uses a more inclusive definition,
meaning things where humanity loses agency: if, for example,
(07:39):
we are no longer really in charge, maybe our lives
are okay, or some very rich people in Silicon Valley
have some agency. But if we kind of hand over
the keys to machines and it becomes kind of dystopian,
and that could qualify as p(doom) as well. That's
a big distinction. And people also have very wide ranges
(08:00):
when they cite this chance. You can get people
who think it's, you know, ninety-nine percent, and this is
coming from Eliezer Yudkowsky. The consensus, again if you
kind of put aside the fact that there are different definitions,
is probably in the range of five to
ten percent. If you survey AI experts, AI researchers, give
(08:21):
them different definitions, they kind of gravitate toward that five
to ten percent range. Some people in the safety community
are higher, maybe more in the range of ten or twenty percent.
And by the way, let's not be too abstract here.
If it's true that there is like a ten percent
chance or even frankly a two percent chance of humanity
going extinct from AI in the next one hundred years,
this is a tremendously important problem. Yeah, so figuring out
(08:44):
how reliable those estimates are is an important challenge that
the show will tackle from time to time.
Speaker 1 (08:50):
Yeah, and just to ground it even more, let
me kind of talk you through a problem that has
nothing to do with AI that I had to deal
with quite recently, where you kind of start realizing what
it means, what five percent means, right, what ten percent
actually means. So I had a leak in the shower, right,
you know, like that just completely really annoying, like drip
drip drip from the shower head, and so you know,
(09:10):
you look it up online, you try to figure out,
like what's going on? How do we fix this? And
turns out that you know, this is a gasket problem,
and according to YouTube, it's super easy to do it yourself, right,
you just you know, you take it off, you replace it,
put it back on, and voila, everything's done. So this
happened in a big apartment building and I had to
(09:34):
reach out to the management and be like, hey, how
do I turn off the water in my apartment? Because
that's step one, right, you turn off the water flow.
And so they ended up sending someone and she was like,
are you insane? You do realize that you could end
up flooding the apartment below and if you do that,
you're going to be liable for all of those repairs,
which could go into the tens of thousands of dollars. Now,
(09:57):
of course, like normal people walking around aren't going to
give me a percentage and be like, you know, Maria,
your p flood is approximately fifteen percent here, But she
made it seem like it was certainly great than five percent,
and maybe you know, greater than ten percent, especially because
the building has very hard water. And so there's isn't.
Speaker 2 (10:17):
Is that a real thing? That sounds like some bullshit.
Speaker 1 (10:19):
It is a real thing, Nate. It's a very real thing.
As a female who washes my hair all the time,
let me tell you, and as someone who doesn't have nearly
as much hair as I do... sorry, Nate, but that's
just, them's the facts. God, hard water is a
very, very real thing and has a very different effect
on your face, on your hair, on everything, and apparently
(10:40):
on the level of corrosion when fixing a shower, on
your p(flood). And so here I am trying to
figure out, okay, well what do I do, like, how
do I actually decide this? I ended up asking my dad,
because you know, daddy knows best, and he had a
very good way of thinking about it. He said, do
it if there's no visible sign of corrosion and it's
(11:02):
super easy to take it out, don't do it otherwise.
And so I, you know, opened it and there was
tons of corrosion. It was very clear that I could
not take it out easily. Hired a plumber, got it done.
But it made me realize that those tiny chances, when
there's a huge risk involved that can actually translate to
a lot of money, you have to take that seriously.
(11:22):
And obviously this is so ridiculously silly, right, me destroying
someone's bathroom below mine versus humanity going extinct. But it
just illustrates that even if you said, you know, the
p(doom) is not five to ten percent
but just five percent, or even two percent, that
is absolutely astronomical when the actual cost is the end
(11:44):
of the world.
Speaker 2 (11:45):
Yeah. In fact, I'm not even sure that it makes
sense to think of it in expected value terms, because
expected value is the average over some number of outcomes.
Speaker 1 (11:54):
Right.
Speaker 2 (11:54):
I made a bet on a soccer match the other
day that, you know, I had to win one out
of six times to be profitable. It did not win,
but in the long run you would make money from
that bet, right. If you destroy civilization, however, you don't
have a long run anymore. You don't have a do-over.
Speaker 1 (12:12):
This is a very very important point. Yes, if I
destroyed the downstairs apartment, I will be able to pay
for it. So this is a risk that I can
think about. If I destroy the world, I'm not going
to be able to make it all better with a
band aid.
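[A minimal sketch of the expected-value logic behind Nate's soccer bet, with hypothetical numbers; the actual stake and odds weren't stated. It also shows why the hosts say the logic breaks for extinction: the math only pays off across repeated trials.]

    # A bet that must win one time in six breaks even at 5-to-1 odds;
    # anything better is profitable in the long run. Numbers are made up.
    stake = 100.0
    odds = 5.5                    # hypothetical payout of 5.5-to-1
    p_win = 1 / 6
    ev = p_win * stake * odds - (1 - p_win) * stake
    print(round(ev, 2))           # 8.33: positive expected value per bet
    # Caveat from the episode: there is no "long run" if the bad outcome
    # destroys civilization, so averaging over repetitions doesn't apply.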
Speaker 2 (12:23):
And what's very weird about the politics of all of
this is that the people who are most worried about
p(doom) tend to be the people who are building
the AI systems. Which, to me... people try to, like,
overcomplicate this and say, well, maybe they have some
marketing angle, and I go, like, that's fucking stupid. They're worried about
it because, A, it's made progress much faster than people anticipated,
(12:45):
at least recently. I mean, I think very few researchers
would have thought that large language models like ChatGPT could
accomplish as much as they do right now. They can,
you know, answer any question with a degree of quality ranging from
somewhat competent bullshitter to actually very good on many questions.
They can like solve some mathematical equations, they could do
(13:07):
some programming, they can draw shitty art, they can make
mediocre music, right. But like, these are all things that
would have surprised most experts if you'd asked them five
years ago. The other thing that worries experts is that
we don't really know how all of this works. You
kind of feed it... you populate a giant matrix
of what are called tokens, which are words which are
(13:29):
kind of compressed into vector space. I'm deliberately using some terminology,
but like the inner workings of these transformer models are
not entirely understood. To be fair, the inner workings of
the human brain are not very well understood either. But
to have a very powerful technology that purports to do
superhuman things within a span of five or ten or
(13:49):
twenty years, and which we don't understand very well, seems
kind of self-evidently like a risk we ought to
be concerned about. That's kind of the short steelman
version of why it's not ridiculous to be worried
about p(doom).
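[For the curious, a toy Python illustration of the terminology Nate is using: words become integer tokens, and tokens index into a matrix of vectors. This is a deliberate simplification, not OpenAI's actual pipeline; real models use vocabularies of tens of thousands of tokens and vectors with thousands of dimensions.]

    import numpy as np

    vocab = {"the": 0, "cat": 1, "sat": 2}       # hypothetical tiny vocabulary
    embeddings = np.random.randn(len(vocab), 4)  # one 4-dimensional vector per token

    tokens = [vocab[w] for w in "the cat sat".split()]  # words -> integer tokens
    vectors = embeddings[tokens]                        # tokens -> "vector space"
    print(vectors.shape)                                # (3, 4)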
Speaker 1 (14:03):
Well, and I think that an even shorter version is
something that you said right at the beginning, which is
that the people who are designing it, that's the same
subset as the people who are most worried about it,
and whose p(doom)s tend to be very high. That
tells us something, Right, if everyone who was actually working
in AI said nah, this is fine, like it's all good,
(14:24):
and we were hearing this from pundits and from politicians
and from you know, people on the other end of it,
then we'd say, oh, you know, maybe we should discredit
some of this. But it's coming from the people who
are actually doing the designing, who are actually doing the building,
And so that makes me, as someone who doesn't know
anything about these black-box workings, worried, especially because we
(14:45):
know that other processes that have had black box components
have gone off the rails and have gone wrong and
have glitched in the past. I mean, we've had this
with trading algorithms, right, Like this shit happens all the time.
Speaker 2 (14:57):
Yeah. Look, there are an abundance of scenarios, including that
people are overconfident about AI capabilities and therefore hand
control of key systems over to AIs, and they fuck up. Right,
I don't see the concerns that kind of DC has
about AI and the ones Silicon Valley does as necessarily
mutually exclusive. They could both be right. You know, there are various
(15:18):
categories of mild, medium, and terrible things, but there's also
like a lot of upside in general. I don't know
if there's some way to quantify this, but the vast
majority of technologies move humanity forward in different ways. Like,
you know, the human condition is much, much, much, much
better than it was centuries ago because of modern technology.
(15:38):
People who don't know that are basically totally misinformed. And
like, I'd bet that half of people... I mean, our listeners might know, but
the average voter probably doesn't know very much about economic history
and things like that. Technology has been a force for
good, mostly, but we haven't dealt with that many technologies
that were predicted to have the power to, like, destroy civilization.
And by the way, nuclear weapons: you can say, well,
(16:00):
you know, nuclear weapons have gone okay so far. Well,
only since nineteen forty-five. If there's like a one
percent risk every year of nuclear weapons being used
in combat, which, by the way, happened in nineteen forty-five
at Nagasaki and Hiroshima, you know, seventy, eighty years of
data are not enough to nullify a one-in-one-
hundred chance.
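[The arithmetic behind that last point: if the annual risk really were one percent, a streak of seventy-nine quiet years would still be unsurprising.]

    # Chance of zero nuclear uses over 1945-2024 if the true risk is 1% per year.
    annual_risk = 0.01
    years = 79
    p_no_event = (1 - annual_risk) ** years
    print(round(p_no_event, 2))   # 0.45: quiet streaks this long happen ~45% of
                                  # the time, so they can't rule out a 1%/year risk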
Speaker 1 (16:18):
So, Nate, I remember, when I was kind of
reading about all of the OpenAI stuff, something that
really jumped out at me was a CEO who described
Sam Altman's approach to AI regulation and OpenAI
in general as "ready, fire, aim." And that to me
is a little bit backwards. It doesn't mean we can't fire,
(16:40):
but we should probably be aiming. And the people who left,
the people who've resigned, the most high-profile departures, those
were our "aim" people. So, Nate, I think we should
be revisiting our p(doom) estimates in the coming weeks
and the coming months to see what happens. I think
I'm a little bit more pessimistic than you are, and
I think probably these departures have made both of us
(17:01):
slightly more pessimistic than we were before. And I would
just absolutely love to be proven wrong here. This is
one area where I hope that our p(doom)s are all
way too high. And you know, we can revisit this
in fifty years and say, ha haha, remember when we
were all so scared.
Speaker 1 (17:34):
So, Nate, as you know, we are in an election year,
and there was actually some big news on this front early on,
and it had to do with the debates, the presidential debates.
This has been something on everyone's minds, and it seems
like Biden and Trump have mostly agreed on the terms
for their first debate, which has now been pushed
(17:55):
up from the fall to June twenty seventh on CNN.
So that is like that's in a month, right, that
is coming up right around the corner, and then there's
going to be another debate in September. And this is
big news for a few reasons. So I'd love you
to talk us through why this is important and why
we should you know, actually care about this news, about
(18:18):
the presidential debates.
Speaker 2 (18:20):
So the reason you should care is that we are,
how to put this nicely...
Speaker 1 (18:25):
You don't have to put it nicely, Nate. I know we're
among friends here.
Speaker 2 (18:28):
I think this maneuver by Biden, which I'll
explain in a moment, benefits from people understanding incentives
and strategy and kind of revealed preference versus stated preference,
all those fancy terminologies, fancy terms I should say. So
last week, Biden does a taped segment on Morning Joe.
By the way, he can't even, like, actually go live
(18:51):
on the show. You know, if you think you could
debate anywhere, anytime, you could do a live interview. But
leave that alone for a second. He does a taped little
segment saying, I challenge you, Donald, to not one but
two debates. Half an hour later, his campaign puts out
a memo which does confirm that he has challenged Trump
to two debates, and also says in the third paragraph,
by the way, we are actually pulling out of the
(19:11):
Commission on Presidential Debates the three debates that they had
planned for later this year. And look, the CPD has problems.
I mean, Trump ragged on them all the time in twenty
twenty. He also didn't do any GOP primary debates.
But basically Biden traded three debates after Labor Day for
one debate after Labor Day and then one on CNN,
a cable network that will happen in the middle of
(19:33):
June that probably everyone will forget about by the time
the conventions happened in July and August, and certainly by
the time we get to the stretch run of the campaign.
So what Biden is actually doing here is reducing his
exposure to debates, which is interesting for various reasons. And by
the way, if you have any doubt about that: Biden says two,
Trump agrees to two, they say how about two more,
(19:54):
and then Biden's campaign chairwoman says no, no, no, no, no,
that would create chaos, when they had just created chaos
that morning by blowing up the schedule. It's an interesting
kind of game theory problem, because on the one hand,
the debates are zero-sum; one candidate gains more than the
other almost by definition. On the other hand, if you
(20:15):
duck the debates, that might be seen as worse than
even a bad outcome of the debate. So it's kind
of like the opposite of the prisoner's dilemma, where you have
an incentive not to cooperate. Here, you have an incentive
to reach an agreement. You don't want to look too difficult
or diffident. But usually when you're the candidate who's behind
in the race, as Biden is in the swing states
in the large majority of polls, you want more uncertainty
(20:38):
and more variance, meaning that you want to take more risks.
You should want more debates because they add volatility to the election,
and instead Biden wants fewer. That to me is a
really really, really really bad sign for his campaign. It
indicates his campaign either is delusional and doesn't believe the polls,
and there's a lot of reporting that they don't, or
(20:58):
that they think their candidate is unlikely to acquit himself
well in spontaneous public interactions, because he's eighty-one years old;
he has always stammered a lot, but, you know, seems
audibly and visibly worse now. That is a bearish
sign that you're constraining your strategic options, your chances to
come back because you're worried about how well your guy
(21:18):
will perform. That's a pretty fucking bad sign. And I
don't know. You know, you want me to
make the subtext text, Maria? Sure, let's do it.
Democrats might be better off replacing Joe Biden on the ticket.
I say might, because that's a very very very very
very very very risky strategy. You would have to draft
(21:40):
a candidate who was not planning on running for president.
There is no one Democrat that unifies the field. Kamala
Harris is roughly as unpopular as Biden. If she were
the nominee, you would have a huge fight at the
nomination process. You could have someone who's quite far left
win the nomination. In general, centrist candidates fare better, but that's
for another segment. But like, if you were really as
(22:01):
concerned about defeating Donald Trump as Democrats claim they are,
then you should be doing everything in your power to
maximize the chance that you win.
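[A toy payoff matrix for the game-theory point Nate makes above; the numbers are illustrative only, not from the episode. Unlike the prisoner's dilemma, where defecting dominates, here agreeing to debate dominates, because ducking looks worse than even a bad debate night.]

    # (row move, column move) -> (row payoff, column payoff)
    payoffs = {
        ("debate", "debate"): (0, 0),    # zero-sum on the night itself
        ("debate", "duck"):   (2, -2),   # the candidate who ducks takes the bigger hit
        ("duck",   "debate"): (-2, 2),
        ("duck",   "duck"):   (-1, -1),  # both look diffident
    }
    # Whatever the opponent does, "debate" beats "duck" for you,
    # hence the shared incentive to reach an agreement.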
Speaker 1 (22:09):
All Right, Nate, I'm gonna stop you there for a second.
Did you just say that the Democrats should consider replacing
Biden as the candidate?
Speaker 2 (22:17):
They should certainly consider it.
Speaker 1 (22:20):
And wait, let's be very precise about this. How seriously
should they consider it? And what is the percentage chance
that you are giving to these different outcomes? Like why
are we considering this?
Speaker 2 (22:31):
Because that is.
Speaker 1 (22:32):
I mean, that is just like a very, very bold assertion.
Less than a year before the election.
Speaker 2 (22:38):
Biden is trailing in the polls right now. His big
problems don't seem to be soluble, one of which is
the fact that he is eighty one years old and
often looks and acts like it. His approval rating is
thirty eight thirty nine percent, generally well below the number
where you get reelected. I mean, the fact that Trump
is his opponent, and Trump is also very unpopular, is what
keeps Biden competitive in the race. People thought, well, the
(23:00):
economy will improve and then Biden's numbers will pick up.
They haven't, really. People thought, well, Trump will go on trial,
the numbers will improve. Well, they haven't, really. Well, the
kind of narrative will improve, probably, less focus on his age.
Well, actually, it doesn't matter kind of what the New
York Times says about Biden's age; most voters aren't reading
the New York Times. So like, let's say we
get to August, and let's say Biden has a mediocre debate.
(23:21):
By the way, what if he has a terrible, terrible debate?
One reason I think this play is smart is because,
if Biden totally shits the bed in June, totally seems incompetent,
then you can kind of pull this emergency lever.
Speaker 1 (23:35):
Let me just ask you a technical question, right, like,
let's assume that, Okay, Biden performs poorly in the debates,
the debates matter enough that this really swings public opinion,
It swings the opinion of the parties, and they do
end up wanting to replace him as the candidate. Can they
do this in August at the DNC?
Speaker 2 (23:56):
Yeah, I've been being a little bit sloppy with my language.
"They" means that they would have to persuade Biden to
do it, because Biden right now has a majority, a large,
overwhelming majority of pledged delegates. If Biden says I'm not
a candidate for the nomination, those pledged delegates become free agents. They
can vote for whoever they want, pretty much. I think they
would be loyal to Biden, by the way, because he
did decide who those people are, so he would have
(24:19):
a role in dictating maybe who the future nominee,
the replacement, would be. But when I say they,
I mean Biden, and the people
who would be pressuring Biden publicly and privately to say, Joe,
you're gonna elect Trump unless you stand down and let
us take a puncher's chance with somebody else. And again,
I'm not sure it's the best play. I'm just saying
you have to have that mentality, keeping that option in mind
going forward.
Speaker 1 (24:40):
Okay, but when we're talking about a decision like this,
which is so monumental, you do need to weigh the
risks on every side. Right? So we can say, okay,
Biden does have these issues as a candidate. I think
everyone acknowledges that. You know, I'm voting for Biden. I
acknowledge he has issues. Obviously, so does Trump. Obviously. I
(25:00):
don't think we should be having an election where we're
deciding between these two candidates, but here we are.
So I think, though, you need to weigh the risk
of that versus pulling him out and introducing an entirely
new candidate in, say, August or September, when the election
is in November. So how do we actually weigh that?
(25:25):
How do you actually try to create that game tree
where you figure out which of these you know, if
I'm playing poker, which of these is the least bad option?
Because to me, it is not clear that yanking Biden
out is going to be the least bad option. It
might actually end up being much worse. But we don't
know right these are things that are in the future.
We don't know who the replacement candidate would be. There
(25:47):
are so many unknowns in this particular equation, and that
makes it incredibly incredibly difficult to calculate any sort of
probability with any sort of confidence.
Speaker 2 (25:58):
So one thing I'd say, we will have more information
in August when the Democratic National Convention is held. We
may know the outcome of the first of Trump's criminal trials,
We'll have more information about Biden's first debate, about the economy.
It is a very, very big risk. I don't think
they should do anything now. I think they should play
their hand with a mind toward having this option down
(26:18):
the road in August. If the debate goes badly, if
things that are supposed to be good for Biden, like
the economy improves, and the numbers are still stagnant or
they get worse, then I think you have to, like,
have that option available. Now, what I would do is
backchannel, talk to Gretchen Whitmer and Raphael Warnock and people
like that, Gavin Newsom, who I'm not a huge fan of,
but who certainly seems ready to fill the position
(26:40):
if he were asked to. You know, I might back
channel very, very, very, very, very carefully that, like, we
need a couple of backup options in case something, you know,
some bullshit pretense, in case there is a health emergency
or something, right. Maybe just in case, right, maybe get
your kind of nomination speech ready.
Speaker 1 (26:58):
So from that perspective, it seems like it's actually a
very strategically sound decision for Biden to move the first
debate to June.
Speaker 2 (27:06):
Let me see. So, look, I think they tactically played
it very well, in part because they actually reduced the
number of their debates, which they apparently wanted to do,
and kind of got credited for being the protagonist and
pushing forward the debates. Right, that was, like, very,
very clever tactically. If you have a shitty hand to play, right,
they pulled off a good bluff with, like, seven-deuce
(27:26):
offsuit. Maybe that's a little unfair; maybe they're playing
like jack-seven offsuit or something like that. Right,
not your favorite hand. You've got a little bit of
equity, jack-six offsuit, somewhere in there, nine-four
suited. You know, you've seen worse hands. But like,
they pulled off a very good tactical bluff in reducing
their exposure. The fact that they want to reduce their
exposure when they're losing, when their candidate won the debates
(27:49):
four years ago, gives credence to the argument, along with
the lack of media availabilities that Biden does. Where you
might have a hostile setting or a spontaneous setting that
they know their candidate is too old. They know it
does not come across well to voters in those settings.
They want to minimize that risk. And when a campaign,
through turnout and through ads and everything else, that may
(28:10):
be playing their hand the best they can, but maybe
you have to fold and play a better hand.
Speaker 1 (28:30):
So, Nate, this is a really really important week for
both of us because it is our last week of
sanity for the rest of the summer. Because do you
know what starts next week on May twenty-eighth?
Speaker 2 (28:42):
Uh, meteorological summer? That's not until June. But close.
It's the de facto summer, summer vacation,
summer camp, the World Series of Poker.
Speaker 1 (28:52):
Summer camp, yep, summer camp for poker players. The World
Series of Poker, which is like, I mean, it's called
the World Series of Poker for a reason, but it's
a big deal, Like this is the single biggest poker
event of the year. I mean, it's something that people
plan for, prepare for forever, and for some people it's
their life's dream to come to Vegas and play in
(29:15):
the World Series of Poker.
Speaker 2 (29:17):
Yeah, there's nowhere else where, for seven weeks if you
want it, there is such an abundance of poker, not
just tournaments at the World Series, but there are eight
different poker rooms around town that host their own tournament series.
There are cash games, there are friends to hang out with,
degenerate bets to make to your heart's content.
Speaker 1 (29:35):
Yes, this is very true, And the two of us
are both heading out there next week on Monday, so
that we can both play the same first event, which
is the opening event of the World Series of Poker,
the five thousand dollars Players Championship.
Speaker 2 (29:49):
That's new, right? They used to have some stupid little
event, and now you get real money for the first event.
I like that, absolutely.
Speaker 1 (29:55):
And it's a freezeout. So for people who don't
play poker: the World Series is a series of poker
tournaments that probably has right now the greatest proportion of
freezeout events, which means you only get to enter
one time, and if you bust, you're done. That's it.
And that's becoming increasingly rare, and I think that that
(30:17):
makes it really.
Speaker 2 (30:17):
Special, absolutely, and that changes the whole strategy, both in
terms of your mindset and your opponent's mindset. So one
thing I think people don't realize... I actually have a,
uh, blog post up at my newsletter, Silver Bulletin, with
twenty-one tips for playing the World Series of Poker.
Speaker 1 (30:34):
Ooh.
Speaker 2 (30:34):
One of the most important tips is don't be afraid
to bluff, which sounds obvious, right? If you're like a
professional poker player, you don't need that tip. But
if you're, you know, some amateur and you're like, well,
it's kind of the World Series and, like, it's a
big deal for me, and also all these players are bad,
so, like, I can win without bluffing? No, dude, you have
to bluff, in part because people are terrified and people will fold,
(30:56):
and bluffing is an essential part of poker, but especially
in those, like, freezeout tournaments. If you are what
we would call a weekend warrior, who a few times
a year gets to play in a local card room
in Philly or something and then makes your big week-long
trip out to the World Series and plays two
or three or four freeze out events. You are absolutely
terrified to lose, right, and you'll show it with every
(31:19):
action that you take. I mean, people are utterly transparent
and predictable about kind of letting you know how much
they care about maximizing the amount of money they win
versus maximizing how long they get to play for.
Speaker 1 (31:32):
Yeah, I think that's absolutely true. And also, thank you
so much for that great advertisement for my last book,
The Biggest Bluff. So bluff, bluff, bluff, bluff, bluff a lot.
I think that's great advice. And I will also say
for those weekend warriors and professional players alike, the one big
tip I would have is: have fun, right? Like, this
(31:52):
is summer camp. It's a really unique experience. I remember
my first time walking into the World Series of Poker
and it was just it took my breath away, the
just sheer energy and how exciting it was and how
democratic it was, because anyone can enter, right? If you
have the money, like, you can pay that buy-in and
(32:12):
you can play and you have a chance to win.
And that's not true of any other sport. So take
advantage of that.
Speaker 2 (32:18):
Two years ago I played against Neymar, who is the
extraordinary Brazilian soccer player. I played against the boxer Ryan
Garcia in the main event last year. I seem to
get Phil Hellmuth at my tables a fair bit, right,
and who knows, you might even get like a Maria
or a Nate. Remember, we are...
Speaker 1 (32:37):
Never bluffing. No, never, never, always folding. Not in our vocabulary.
Speaker 2 (32:40):
Yes, we would never bluff a risky business listener.
Speaker 1 (32:44):
This is true. This is absolutely true. So yes, we'll
be sharing tips like that one, and we'll be sharing
our journey along the way. And I want to propose
something to get us in the spirit. I think we
should have a bit.
Speaker 2 (32:59):
Can I eat one hundred Big Macs in twenty-four hours?
Speaker 1 (33:02):
No, well, I don't think I can... I don't think
I can do that bet. I might as well just
pay you out now. Let's see which one of us
gets unluckier during the World Series.
Speaker 2 (33:13):
So you want to be unluckier?
Speaker 1 (33:16):
Yeah, in this case, if you want to win the bet,
you want to be unluckier. You want to be the
person who runs worse, who is on the wrong side
of variance the most.
Speaker 2 (33:27):
I love that. I'd love to actually quantify that. So
we have, I think, discussed privately a method for
doing this, right? Because there are, by the way, many
forms of luck in poker that are hard to quantify.
If you get coolered, meaning I get pocket kings and
Maria gets aces, if she wins, she had the better hand;
it's not a bad beat per se, but like it's
(33:47):
pretty unlucky for me. But one way where you can objectively,
unquestionably quantify luck is when you go all in and there
are more cards left to come, might be before the flop,
might be before the river, et cetera. And you see
if you get a runout that makes you the
best hand or not. Right? If I am all in
with ace-king against pocket queens, I have, whatever, a
forty-six percent chance of winning, depending on the suits of the cards.
(34:10):
You can track that. We can kind of compare our
realized value against our expected value, and whoever does worse
gets paid off by the other person. Is that
how it works?
Speaker 1 (34:21):
So basically, what we're trying to bet on is who
runs better and who runs worse in all-in situations,
where there are no more decisions, where all your money,
you know, all your chips are in the middle. What
we need to figure out is: do only all-ins
before the flop count? Does it count if we're all
in on the flop, right? Does it count if we're
all in on the turn?
Speaker 2 (34:41):
Any all-in before the river.
Speaker 1 (34:43):
Right?
Speaker 2 (34:43):
Yeah. And by the way, if you're not a poker person:
you know, if you go all in on the river, all
the cards are revealed, there's no luck. There may be
luck in that you happen to run into a good
opponent's hand, right, but there's no more turn
of the deck to come, all right.
Speaker 1 (34:56):
So every single time the two of us are all in,
we are going to record the exact hands that we
have that our opponents have, if it's against one or
if it's against two, and then we're going to run
all of those equities. So we'll be able to calculate
exactly what the percentages are, and then we'll see
if we won, we'll see if we lost, and we'll
be able to have an objective measure of which one
(35:18):
of us was luckier and which one of us was unluckier.
So in this particular case, the person who wins the
bet and wins the money is going to be the
person who got unluckier. Right, I think that's fair.
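[A minimal sketch of the bookkeeping for this prop bet, with made-up hands and equities; the equity numbers would come from any standard equity calculator. "Luck" is realized results minus expected results, and the more negative total wins the bet.]

    # Each entry: pot equity when the money went in, and whether the pot was won.
    allins = [
        {"equity": 0.46, "won": 1},   # e.g. ace-king holding up against queens
        {"equity": 0.80, "won": 0},   # e.g. a big favorite getting outdrawn
    ]
    expected = sum(h["equity"] for h in allins)   # 1.26 pots "deserved"
    realized = sum(h["won"] for h in allins)      # 1 pot actually won
    print(round(realized - expected, 2))          # -0.26: running below expectation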
Speaker 2 (35:31):
It's kind of like insurance, right? We're hedging. How much
do you want to bet?
Speaker 1 (35:35):
Well, Nate, you suggested a very devilish sum when we
talked about this before the show, six hundred and sixty
six dollars and.
Speaker 2 (35:43):
Sixty-six cents. Maria, don't forget those sixty-six cents.
Speaker 1 (35:46):
Absolutely, six hundred and sixty six dollars and sixty six cents.
Let's do it. All right, it's a bet, and one,
I might add, that I hope you win.
Speaker 2 (35:56):
No, I insist, I hope you win, because almost certainly
the person who wins the six sixty-six point sixty-six
will have lost far more than that in expected-value
terms, right? I'd much rather... I mean, it might
be literally worth tens of thousands of dollars, Maria, in
terms of... well, maybe probably more than that, frankly, right?
I mean, you're gonna be playing the whole series. I'm not.
I'm coming and going. It's a fucking election year, right,
(36:20):
I'll play as much as I can, probably three weeks total,
but you're playing the whole thing. You're gonna have, like, literally...
Speaker 1 (36:25):
Mostly, mostly. I'm taking a few weeks off, but yes,
but I'll be there for a lot of the time.
But I think that this is going to be fun
for people to pay attention to and to follow along.
But I actually also think that this is an important
prop bet for a different reason, because I think
it's a really, really important calibration of your decision-making process.
(36:45):
So what I always like to tell people, whether they're
poker players or not, is that, you know, poker is
a skill game, right, but there's an element of luck.
And I think that's true of almost everything in life.
There are a lot of situations where you need decision-making
skill, but then you have to get lucky as well.
And so what this is going to help us do
is actually put something precise on the luck element. Right?
(37:10):
How often did we make the right decision? Because at
that moment of the all-in, were we ahead, were
we behind? How solid is our decision-making process versus
the cards, what we don't control, right, the outcome, the
deck and the order of the deck? That's the luck factor.
And we're actually going to be trying to put a
number on that, and I think that that's such an
(37:32):
important exercise in decision making in general, to try to
evaluate whether you're making good decisions and running badly or
whether you're just making shitty decisions.
Speaker 2 (37:42):
Because it's very possible to feel as though you're getting
beat up by the luck of the cards, right, because
you tend to remember those times. So one thing, by
the way, is that, by definition, when you lose a
poker tournament, it almost always happens in some frustrating way.
Either you take a bad beat, or you take a
cooler or you make what you thought was a plus
(38:03):
EV bluff and it doesn't work out, or you call
off and the opponent wasn't bluffing. So because tournaments
almost always end in failure, except for the one-in-whatever-
five-hundred time that you win a five-hundred-person
tournament, it's easy to, like, not account for the
good luck that you had too, and that can impair
your self-awareness about how you might be playing, frankly.
Speaker 1 (38:22):
Absolutely. One of the most edifying things that I've
ever done, and I do this during multiple events, is
write down every single hand of a tournament. And I
did this for the tournament that I ended up winning
that kind of launched my poker career in a way,
and that was the PCA National Championship. I wrote down
(38:44):
all of the hands and it was really really important
to me to look back and see how many times
I just got so lucky, because it's so easy to
forget that. You know, the moment where I got pocket
sevens in against aces, right? I should have been out
of the tournament, and instead I ended up making a
straight and I ended up doubling. So there are moments
like that. And I think that in life in general,
(39:07):
it's so easy to just forget those moments, but
it's so important to remember them.
Speaker 2 (39:12):
By the way, we talked last week about focus during
poker tournaments; this is a little hack. If you feel
like there is a day when you would like more focus,
then forcing yourself to write down every hand you play
will require you to concentrate more. You can't do it
every time. It does take a lot of work. But
like, if you're a little bit sleepy, tired, jet-lagged, hungover,
(39:33):
et cetera, then that's a tip that I use, that I
pull out now and then.
Speaker 1 (39:37):
Yeah, that's a great tip, Nate, and it will
be helpful to you in becoming
a better poker player. I encourage people to do this
even when they're not playing poker, when they're making decisions
in an unrelated area. When people are like, oh, how do
I improve my decision game? Like, you should actually write
down your entire thought process during the decisions, what you're weighing,
you know, how you're doing it, how you're thinking about it,
(39:58):
so that you have an objective record going back, because
otherwise self-serving bias, you know, is very strong.
Speaker 2 (40:03):
I am considering what type of pizza to get for lunch, Maria.
I'm going to write down the thought process. Right, will
it be like a grandma square or like pepperoni or a supreme?
I don't know.
Speaker 1 (40:13):
I mean, Nate, you're going to go for Detroit style, right,
given your...
Speaker 2 (40:16):
Mm, look, Detroit style is kind of fake. I mean,
it does exist in Detroit, but the real pizza of
Michigan is Domino's and Little Caesars, of which I prefer
the latter.
Speaker 1 (40:26):
All right, And on that note, Nate, I wish you
luck next week, and I wish me luck as well,
and I hope we both do really well and for
people who want to follow along, are you
going to be posting updates? Are you going to be
posting any action? Can people buy a stake in what
you're doing?
Speaker 2 (40:44):
For now... I haven't historically sold action. Maybe I will,
if I can figure it out, just for fun.
Speaker 1 (40:50):
Yeah, yeah, so I actually decided, partly for the Risky Business audience,
but in general, to post action for a few events
on the two biggest poker staking sites, which are
PokerStake dot com and StakeKings dot com. And most
of those events are listed at no markup, which means
you get it at face value. They're small pieces, you know,
but I think it's a fun way for people to
(41:11):
be able to follow along, and Nate, if you do
that too, maybe we'll do some sort of Risky Business
action package.
Speaker 2 (41:17):
Just to be clear what you mean by this, Maria,
this means that people are basically buying a piece of
equity in you. If they buy one percent of your
action and you cash the tournament for one hundred thousand dollars,
they therefore get one thousand dollars, for example.
Speaker 1 (41:31):
This is correct. This is correct, And I'm not listing
every single event that I'm playing, but I have a
few I think five or six on there, and then
you are invested in my wins, and unfortunately you're also
invested in my losses, but your downside is capped. Unlike mine,
I've got unlimited downside. One last thing, Nate, before we
(41:54):
say goodbye to our listeners, I would just like to
say thank you to the Las Vegas Convention and Visitors
Authority for listening, for tuning in to last week's episode
of Risky Business and taking the gender pay gap in
professional basketball seriously, because it turns out that the other
day they announced a deal to provide one hundred thousand
(42:18):
dollars in annual sponsorship to every single Aces player for
this season and for twenty twenty five. Now, of course
they're now going to be investigated by the WNBA, but
let's just say kudos for a very innovative way and a
very Las Vegas way of handling the gender pay gap.
Speaker 2 (42:36):
In Las Vegas, everything has a price, including gender equity.
Speaker 1 (42:42):
Risky Business is hosted by me, Maria Konnikova, and me,
Nate Silver. The show is a co-production of Pushkin
Industries and iHeartMedia. This episode was produced by Isabelle Carter.
Our associate producer is Gabriel Hunter Chang. Our executive producer
is Jacob Goldstein.
Speaker 2 (42:59):
If you like the show, please rate and review us
so other people can find us more easily, and if
you want to listen to an ad-free version, sign up
for Pushkin Plus for six ninety-nine a month to
get access to ad-free listening. Thanks for tuning in.