
October 1, 2025 52 mins

In this episode of The Evolving Leader, co-hosts Jean Gomes and Scott Allender talk to Sir David Spiegelhalter, one of the world's foremost authorities on risk and probability. David explains why embracing uncertainty, rather than trying to eliminate it, is essential for leaders who want to build resilience and make better judgments in complex times.

Drawing from his latest book “The Art of Uncertainty”, David shares stories that bring statistics to life, from forecasting failures to the importance of imagination and red-team thinking. This conversation offers both a challenge and a toolkit: how to hold authority while admitting doubt, how to communicate risk with clarity, and how to lead teams toward what he calls ‘safe uncertainty’.

 

Further materials from David Spiegelhalter:

Five Rules For Evidence Communication

The art and science of uncertainty - with David Spiegelhalter (recorded at the Royal Institution, 30 January 2025)


 Other reading from Jean Gomes and Scott Allender:


 Leading In A Non-Linear World (J Gomes, 2023)

The Enneagram of Emotional Intelligence (S Allender, 2023)


Social:

Instagram           @evolvingleader

LinkedIn             The Evolving Leader Podcast

Twitter               @Evolving_Leader

Bluesky            @evolvingleader.bsky.social

YouTube           @evolvingleader

 

The Evolving Leader is researched, written and presented by Jean Gomes and Scott Allender with production by Phil Kerby. It is an Outside production.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Jean Gomes (00:02):
Constant uncertainty is a given for every human being. So why are we so obsessed with it right now? Well, obviously to many of us, the world feels a little crazy and unstable, and this is deeply troubling because our brains crave the opposite. We build environments, from houses to organisations and cities, that seek to make the world feel more

(00:25):
predictable, more certain. So we live in this paradox between reality and desire. In this show, we talk to one of the world's leading uncertainty experts who, although an incredible statistician, recognises that thinking about this most slippery of subjects is equal parts science and art. So tune in to a brilliant mind-expanding conversation with

(00:49):
Professor Sir David Spiegelhalter about his amazing new book, The Art of Uncertainty.

Scott Allender (00:55):
Hi friends. Welcome to The Evolving Leader, the show born from the belief that we need deeper, more accountable and more human leadership to confront the world's biggest challenges. I'm Scott Allender

Jean Gomes (01:05):
and I'm Jean Gomes.

Scott Allender (01:07):
How are you feeling today, Mr. Gomes?

Jean Gomes (01:08):
I'm feeling really uncertain. No, I'm not. I'm feeling really excited about this conversation, because the topic that we're talking about seems to be pervasive at the moment, and there's a good reason. I'm very excited. How are you

(01:30):
feeling, Scott?

Scott Allender (01:32):
I'm feeling very thankful that it's Friday. I'm feeling that it's been a full week, a good week, but definitely lots of competing priorities for my attention. So I've really been looking forward to this conversation. What a wonderful way to cap the week with the conversation that we're about to have, because today we're joined by David

(01:54):
Spiegelhalter. He's emeritus professor of statistics at the University of Cambridge, and his best-selling book, The Art of Statistics, has been published in 11 languages. He was knighted in 2014 for services to medical statistics, was president of the Royal Statistical Society, and became a non-executive director

(02:15):
of the UK Statistics Authority in 2020. His latest and brilliant book, The Art of Uncertainty: How to Navigate Chance, Ignorance, Risk and Luck, is incredibly useful as a set of tools to help us get a hold on how to think in a more uncertain world. David, welcome to The Evolving Leader.

David Spiegelhalter (02:36):
Oh, great pleasure to be here.

Jean Gomes (02:37):
David, how are you feeling today?

David Spiegelhalter (02:40):
It just keeps erupting at moments.

(03:05):
What am I doing here? Someone's going to find out.

Jean Gomes (03:09):
So uncertainty is understandably everywhere

Scott Allender (03:26):
So I have so many questions, but let's start from my perspective. You know, I'm thinking about Trump and Brexit and Ukraine and AI. How does uncertainty affect us at the most human level, do you think? And I'm thinking about leaders that have been sort of trained to eliminate uncertainty from

(03:57):
their thinking through exhaustive planning and control, and now are feeling particularly out of their depth as they get

(04:19):
sort of rocked by a series of systemic disruptions. So what's the starting point for embracing uncertainty? How do we need to

(04:42):
be thinking about that?

David Spiegelhalter (04:47):
Yeah, the first thing is, you can't eliminate it. A lot of periods think they live in an age of uncertainty, but I think we can fairly say we're in the middle of one at the moment. One of the first things we have to acknowledge is that we can't even think of all the things that can happen. I like to think of possible futures. And when

(05:08):
people are planning, they like to construct scenarios: a best case, a worst case, perhaps a central scenario. And that in no way exhausts the possibilities, so we have to realise that. And even in those scenarios, the one thing we know is that none of them are actually going to happen as we have planned them. So we have to be ready for that, and we do it in our lives. We can't think of

(05:30):
everything we're going to be doing in five or ten years' time. We can just have a broad idea of the possibilities of where we might be and try to protect ourselves against the worst things that can happen. And I suppose we're going to come back to this again and again: we're talking about resilience, not robustness. Robustness is the idea that we've got a long list of

(05:51):
possibilities, and we've hedged ourselves, and we've optimised, and things like that, so we're robust to a wide range of circumstances that we've thought of. Now, that's not enough anymore, because things will happen that we haven't thought of. And so we're talking about non-optimisation: not trying to get everything perfected. We can't assume there's a mathematical

(06:16):
solution to dealing with this, of the sort that we've been taught, and that I've been taught. I taught decision theory for years and all this stuff, so I know all about it, but it's just not true in the real world as we operate. And so we have to have this extra idea of being prepared for the things we haven't thought of.

Jean Gomes (06:37):
And in the last few years, given all the things that have been happening, how has that thinking evolved in your work and its application? What's the most exciting or interesting thing that you've learned through that?

David Spiegelhalter (06:51):
I don't know. I think one of the first ones that really brought this home to me was 2010, when an Icelandic volcano erupted, which I'm not going to try to pronounce, and because of the direction of the wind, the ash came over Northern Europe and closed down European airspace for ten days. I mean, a complete shambles. No planning was ready for that at all. And I was actually brought

(07:12):
in on the committee that was hastily constructed in the UK: what are you going to do about it? They had all sorts of people, and it crucially came down to whether the airline manufacturers would allow their planes to fly in a slightly less precautionary way. And you know,

(07:34):
the UK had a plane flying up there in the ash cloud taking measurements and showing, yeah, you could do it. But there was no planning for it, and so it wasn't on the UK risk register. And you might think, well, that was a black swan, nobody could have thought of that. But somebody had thought about it; this was known to be a possibility, but it hadn't made the risk register,

(07:55):
and so there was no thinking about it. So although I said we often can't think of the possible futures, it doesn't mean that nobody has thought of those possible futures. And what that's really brought home to me, in all sorts of other circumstances where surprises have happened and afterwards you realise, well, somebody had thought about that, is this idea of imagination.

(08:19):
It's like futurology, I suppose, but in a much more specific sense: rather than long-term trends, we're thinking of actual specific events. And I always say "imagineering" should be a new sort of discipline, like engineering, where you just have to think of things that can happen. And some of the people who have really thought about this, I think, are the UK Ministry of Defence, in two ways. First of all, they've

(08:43):
employed science fiction writers and published the stories. They're rather good short stories, because none of them are going to happen, but the idea is, in a way, to broaden our thinking. There's a wonderful one about drones all over London, running amok and that sort of stuff, and it's all to do with governance of drone traffic in a dense area. And you realise, yeah, this is something that has to be

(09:04):
really thought about carefully, and people are thinking about it. The other thing the Ministry of Defence really recommend is red teams. They've got a whole guidebook on the red team mindset. The classic idea of a red team is a separate group of people who are there to be critical of assumptions, to challenge groupthink,

(09:26):
to think of the stuff that other people hadn't thought of, basically to be real pains in the arse. In other words, they're thinking of things that could go wrong. And the Ministry of Defence encourage a red team mindset, so that even as an individual, you try to do it yourself. It's self-criticality: trying to challenge your assumptions, trying to be aware of your

(09:49):
biases and things like that, which I find very difficult indeed. I've got biases up to my eyeballs, and so I've got some awareness of them, but it's very difficult to do anything about them. So it's better if you've got a range of teams. And that, again, brings up some of the sort of stories that I'm interested in

(10:11):
and talk about in the book, and that's the area of superforecasting. Now, the superforecasting competitions that have been run are a bit different, because they are about specified futures. They're not about general imagining of all the things that could happen. There's a list of futures that people have to put probabilities on, and when they occur in six months' time,

(10:31):
or something like that, or don't occur, then the best people, the ones with the closest probabilities, measured using a proper scoring rule, win the competitions. And they've shown that near-amateurs who have a particular mindset can beat some of the best professional analysts. And that

(10:53):
mindset is almost a red team mindset. It's one that is very open-minded, not stuck on one way of seeing the world. They're self-critical, they take advice. They're prepared to change their minds very quickly; they're very agile. They look for every source of information, including,

(11:14):
and particularly, those that might disprove what they originally thought. And it's been shown in all the work that's been done on these competitions that this particular sort of personality or attitude is the most effective; those are the best characteristics to have.

Scott Allender (11:32):
What are you seeing in terms of the trend of that? Because it feels like, and this is just my feeling, I didn't study this, but it feels like people are becoming in many ways less open, right?

David Spiegelhalter (11:44):
That's a real problem. Do you know this phrase, the hedgehogs and the foxes? A lot of people have used it. Just very briefly, there's an old story: the hedgehog knows one big thing, but the fox knows many things. And it's foxes, with that agile mind that isn't committed to a particular worldview, who do best in forecasting. And hedgehogs, who are part of a group of people

(12:07):
who think the same way and try to fit the world into their particular worldview, are not good at making these judgments. And I think you're right. It's an increasingly polarised world. People are becoming hedgehogs of one sort or another, and that sort of fluid central ground, where you really are adapting to what you hear, and adapting very quickly, is lost if you have

(12:27):
taken up a committed, polarised position. It's a really depressing thought, but it could be true. I'm not sure about the empirical evidence on that, but it certainly seems plausible.

Jean Gomes (12:44):
You've also noted that people's tolerance for uncertainty varies enormously: some team members thrive on unpredictability, while others are gripped by anxiety. If you're a leader looking after a team, or somebody who's trying to inspire people, how do you manage those differences, so that both the cautious and the adventurous thinkers are heard and the team can still make decisions?

David Spiegelhalter (13:06):
Oh, that's, I think, slightly outside my expertise as a common-or-garden statistician, but it is important, because people vary enormously, even within themselves, in their attitudes to uncertainty. I've known people who took what I thought were extraordinary physical risks and yet were very cautious about their money. People will take,

(13:29):
you know, very big social risks, and yet be incredibly cautious about their health. So there's all sorts of variation between us and within us. And yeah, we have to take account of those different views. And actually, of course, a diversity of views is a really good thing to have. If

(13:49):
everybody was really gung-ho and risk-taking, well, that could be extremely dangerous. And similarly, if everyone was very risk-averse and incredibly cautious, that'd be pretty hopeless too, in a team. So it's that range of viewpoints that is such a valuable thing to have. And of course, you've got to have that discussion. An example

(14:10):
I use was back in 2011, when Osama bin Laden was thought to be in this compound in Abbottabad in Pakistan, but it wasn't certain at all. And President Obama did a brilliant thing, which is he set multiple teams on the job, multiple independent teams. And I don't know if he deliberately

(14:31):
set it up so that some were rather red-teamish, very critical, and some were more gung-ho and optimistic, but they came up with a very wide range of assessments. He said afterwards that some thought there was only a 30 to 40% probability that he was there, and others were at 80 to 90%; they were really confident. So some were being really cautious, and he had to then make the decision, but he

(14:53):
knew the range of opinion. And he said he thought it was about 50-50, and he went for it, and it was right: he was there. The right decision, well, unless you were Osama bin Laden, of course. The point being, and it's been argued afterwards, in an academic paper, that maybe these teams should have got together beforehand and come up with a

(15:15):
single view to give the decision maker. And I think that's completely wrong. It's absolutely right that the person who really, in the end, has to carry the can knows that there's a diversity of opinion from different perspectives, and has to make that judgement themselves. And they have to take responsibility for it.

Jean Gomes (15:40):
One of the things that keeps coming up in these conversations is people's ability to make the distinction between risk and uncertainty, and how that often just gets mushed up with emotions and so on. How do you help people?

David Spiegelhalter (15:54):
I don't actually like that distinction at all, because I just see it all as a continuum. I mean, risk typically covers situations where we kind of know the probabilities, the gambling areas, the stuff where we feel we've got everything in control, those sorts of situations where you can optimise and do it mathematically; and uncertainty those situations where, as I said, you might

(16:17):
not even be able to list the possibilities, you can't assess the probabilities, the evidence isn't very good, it's all a bit of a mess. And I just see that as a complete continuum, because even in risk areas there are always judgments, there are always assumptions. I always carry, I've got it somewhere, a two-headed coin. I flip a coin, and

(16:38):
before I show it, I ask, what's the probability this coin will come up heads? They all say, oh, 50-50, and I say, you're wrong. You made an assumption. You trusted me. You must be mad. So even in those apparent risk situations, you're full of judgments, full of assumptions that may be wrong. So I don't particularly like that distinction, and in particular I don't like the distinction, made 100 years ago by Keynes and

(17:00):
Frank Knight and others, that in one of them you could calculate the probabilities, and in the other you couldn't. Well, this was before the age of subjective probability and superforecasting, where we know that subjective probabilities can be extremely valuable, can be very good, from the right assessors and so on. And so you

(17:23):
don't need to be able to calculate probabilities based on data in order to make good judgments. So I don't really like that distinction, but there absolutely is a continuum: from nicely controlled problems, where we might be able to assess some reasonable numbers, and even the values of the outcomes, and optimise and do our proper risk analysis, down to the shambles in the corner, what people

(17:44):
call radical, or, the term I like, deep uncertainty, where you can't even list the outcomes, and you certainly can't think of the probabilities. There are some interesting situations in between, a bit odd, where you can't think of everything that can happen, but you can think of probabilities. And you may think, what? How can you do that? The Bank of England do that.

(18:07):
They model the central 90% of their belief about the future, inflation and growth and so on. But the outside 10%, the two tails, they don't make any attempt to model. They just say it's somewhere out there, could be anywhere out there. They're not putting a probability distribution on it; anything could happen, essentially. In other words, they're giving a probability to

(18:29):
"none of the above", something else. And I think that's really nice; it's sort of giving a probability to an unstable situation. So when, in the first quarter of Covid, GDP dropped 25%, the classic 25-standard-deviation drop, it really happened, they could say, well, sorry, that's just part of the 5% tail; you've got to be ready for

(18:50):
this, you've got to be resilient to these things, because they can happen. And I really like that idea of having, honestly, a probability given to everything else that we haven't listed above. And it's an old idea in Bayesian statistics, what's called Cromwell's law: you should always have a certain amount of probability

(19:11):
unassigned, sort of free-floating, for something we've never thought of, because then it mathematically enables you to adjust incredibly quickly to new circumstances. A good fox-like brain, a Bayesian brain, will operate like that, because it will never think it's actually got the problem solved.
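David's point about Cromwell's law, reserving some probability for "none of the above", can be sketched with a tiny Bayesian update. This is an illustrative Python sketch, not anything from the book; the scenario names, priors and likelihoods are all invented for the example:

```python
# Cromwell's law: never assign a hypothesis probability exactly 0 (or 1).
# A hypothesis given prior 0 stays at 0 no matter how strong the evidence;
# a small reserved prior lets belief shift to "something else" very quickly.

def bayes_update(priors, likelihoods):
    """Posterior is proportional to prior times likelihood, renormalised."""
    unnorm = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnorm.values())
    return {h: p / total for h, p in unnorm.items()}

# A dogmatic forecaster is certain the future is one of two named scenarios.
dogmatic = {"scenario_a": 0.7, "scenario_b": 0.3, "none_of_the_above": 0.0}
# A Cromwellian forecaster holds back 5% for "something we never thought of".
cromwell = {"scenario_a": 0.665, "scenario_b": 0.285, "none_of_the_above": 0.05}

# Surprise data: very unlikely under either named scenario.
likelihoods = {"scenario_a": 0.001, "scenario_b": 0.001, "none_of_the_above": 0.5}

print(bayes_update(dogmatic, likelihoods)["none_of_the_above"])  # 0.0, stuck forever
print(bayes_update(cromwell, likelihoods)["none_of_the_above"])  # ~0.96, rapid adjustment
```

This mirrors the Bank of England example: the modelled scenarios carry most of the prior, but the unmodelled tail keeps a foothold, so a 25-standard-deviation surprise does not break the forecaster's ability to update.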

Jean Gomes (19:34):
It's very helpful.
Thank you.

Scott Allender (19:36):
Yeah, super helpful. I like that thinking of risk and uncertainty on a continuum. That feels true.

David Spiegelhalter (19:41):
Yeah, I think a continuum, because there's a lot of uncertainty in any risk assessment; there are always judgments and assumptions.

Scott Allender (19:48):
So you've shared some really interesting examples. And another story you recount in your book is the Bay of Pigs story, where a 30% success probability was watered down to being described as a "fair chance of success", which contributed, of course, to a disastrous decision. I'm curious, do you see similar sorts of fuzzy language in businesses,

(20:10):
and how can leaders ensure that they and their teams are communicating risks with clarity and precision?

David Spiegelhalter (20:18):
Yeah. You know, I'm not sitting there where these important decisions are being made, but I always sort of grip the table slightly when people, talking about important things, just use words like "could happen", "might happen", "could possibly". And of

(20:38):
course, we do use this in our language all the time; we can't put numbers on all the stuff we think could happen. But, as I say, I've just talked about intelligence agencies that have gone beyond this. I've got on my desk, right in front of me, a mug from MI5, because I gave a talk at MI5, and all I got was this lousy mug. But it

(21:03):
says on it that if I think something has a 25 to 35% probability, like the Joint Chiefs of Staff thought for the success of the Bay of Pigs, I have to report that as "unlikely", not "fair chance" like they did, which was just hopeless. And if I use the word "likely" in an

(21:25):
intelligence report, and I can use it, I can use words, I have to mean between 55 and 75% probability. They're quite broad bands, but they at least get people onto a similar wavelength in their use of words. The same thing is done in climate change: the word "likely" is used in lots of different ways, but there it tends to mean around 55 to 80%;

(21:46):
that's the sort of range that is allowed, quite a broad range, for "likely". And I think this is incredibly valuable, because we know that people, with their different attitudes to risk, will interpret words in many different ways.
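The bands David quotes can be written down as a simple lookup, in the spirit of the intelligence-style "probability yardstick" he describes. Only the 25-35% "unlikely" and 55-75% "likely" bands are quoted in the conversation; the other bands here, and the deliberate gaps between them, are illustrative assumptions:

```python
# A sketch of a probability-to-words mapping for risk communication.
# Each band is (low, high, agreed word); gaps between bands force the
# assessor to report a number rather than reach for a vague word.
BANDS = [
    (0.00, 0.05, "remote chance"),
    (0.10, 0.20, "highly unlikely"),
    (0.25, 0.35, "unlikely"),
    (0.40, 0.50, "realistic possibility"),
    (0.55, 0.75, "likely"),
    (0.80, 0.90, "highly likely"),
    (0.95, 1.00, "almost certain"),
]

def describe(p: float) -> str:
    """Return the agreed word for probability p, or flag it for discussion."""
    for lo, hi, word in BANDS:
        if lo <= p <= hi:
            return word
    return f"{p:.0%} (falls between bands; report the number)"

print(describe(0.30))  # unlikely  (the Bay of Pigs assessment)
print(describe(0.65))  # likely
```

The value of such a scheme is not the exact band edges but that everyone in the room has agreed, in advance, what "likely" commits the speaker to.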

Jean Gomes (22:05):
One of the things that I really enjoyed was the quiz that you did at the Royal Institution talk that I saw a few months ago. Can you talk us through that, in terms of helping us quantify our ignorance?

David Spiegelhalter (22:20):
The main thing is, what really annoys me when people talk about predicting the future is that they try to predict the future and say what's going to happen. I don't want to know what you think is going to happen; I want to know your probabilities for what's going to happen. And so in forecasting competitions, and in good intelligence analysis, people only use

(22:42):
probabilities. They don't say, oh, I think this will happen. No, I don't care what you think; I want to know your probabilities. I also want good evidence that you are a good probability assessor, and you can train people up in that using the sort of quiz that I do with audiences, which I've got in the book, and which essentially just asks them almanac questions: which is further north, Brussels or Kyiv, or

(23:04):
something like that. There's one that's a really nasty question, in fact a really horrible one, but an easy one: which has got a bigger population, Cambridge, England, or Cambridge, Massachusetts? So I ask these questions, which I hope people don't know automatically, and I say, I don't want to know which one you think is right. I want to know your probability for

(23:26):
which is right. So: which one are you most confident in, A or B? And then, what's your confidence? It's very simple, on a score of 5, 6, 7, 8, 9, 10. Ten out of 10, you're absolutely sure; five out of 10, you've got no idea. And the point is, I can do this with school kids, young school kids, and they get it immediately. Well, they certainly do after the first question, when they make a mess of it. So it's a really

(23:48):
straightforward thing to do. We're assessing subjective probabilities for facts. This is really quite a sophisticated idea, and yet it is very natural to people; people can do it. And then, of course, you tell them the answer and you score them, using a particular scoring rule, what's called the Brier scoring rule, which was developed in weather forecasting in the 1950s. It's the one used in all the superforecasting competitions, and

(24:10):
I've tweaked it a bit to make it, I think, easier to use. Essentially, if you say 10 out of 10 and you're right, you get 25 points. If you say 10 out of 10 and you're wrong, you lose 75. So it's really vicious, very highly asymmetric, and that's deliberately designed. And if you say nine

(24:32):
out of 10 and you're right, you get 24, and if you're wrong, you lose 81. I can do that in my head, you know, because of a clever trick. And so you get your points. Some people, the people who know a lot or are a bit lucky, get quite a lot of points, and people who are quite

(24:53):
cautious, the risk-averse people who know what they don't know, do fives, sixes and sevens and stay around zero. And then, of course, you get people with big negative scores, the gung-ho people who say, oh yes, I think Brussels is further north than Kyiv, or, I think Barnstable, Massachusetts has got more people. Oh, the other one I do is, who's older,

(25:14):
President Trump or Charles the Third? And maybe that's not too difficult, but who's older, the Prince of Wales or the Princess of Wales? All these sorts of things. And then, when the gung-ho people get it wrong, they lose 75, and then they try to catch up, and they go for it even more, and they end up with massive negative scores. It's particularly associated, I feel, with young males. But never mind. And I would say these are the

(25:38):
sort of people you do not want as your financial advisors, because they don't know what they don't know. And the other thing I say is, a chimpanzee can get zero just by saying five every time; you should not be getting negative points. So I think it's a really good training exercise, to train people in subjective

(26:01):
probability assessment, to stop them being overconfident, to stop them saying, oh, I'm 99% certain, when actually they're not. They're not calibrated, in other words, because somebody is well calibrated when, if they say "I'm 90% sure", they're right 90% of the time, roughly.
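A formula consistent with the numbers David quotes (+25 for a correct 10/10, -75 for a wrong 10/10, +24 for a correct 9/10, and a "no idea" 5/10 always scoring zero) is score = 25 - 100*(1 - p)^2, where p is the probability given to the correct answer; the 81 he mentions is then the penalty term for a wrong 9/10. This is an inferred sketch of his tweaked Brier score, not necessarily the book's exact formulation:

```python
# Inferred sketch of the tweaked Brier score described above: you pick answer
# A or B with a confidence from 5 ("no idea") to 10 ("certain") out of 10,
# and the rule rewards calibration by punishing overconfidence asymmetrically.

def quiz_score(confidence_out_of_10: int, correct: bool) -> int:
    """Score one question; p is the probability given to the correct answer."""
    p = confidence_out_of_10 / 10 if correct else 1 - confidence_out_of_10 / 10
    return round(25 - 100 * (1 - p) ** 2)

print(quiz_score(10, True))   # 25
print(quiz_score(10, False))  # -75
print(quiz_score(9, True))    # 24
print(quiz_score(9, False))   # -56 (the 81-point penalty taken off the 25 baseline)
print(quiz_score(5, True))    # 0, the chimpanzee strategy never loses points
```

Being "well calibrated" under this rule means that, over many questions, answers given 9/10 confidence turn out right about 90% of the time.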

Scott Allender (26:21):
Do you know if that leads to sustainable change for people? Like, when that overconfident young man gets such a negative score, does he actually shift?

David Spiegelhalter (26:29):
They can improve over a competition. They really can, because they learn so quickly: oh my god, I've got to be careful. Apart from the ones who say, I've got to catch up, and get even worse. That's the gambler who puts even more on after they're losing; there's always those. But I say, don't gamble, whatever you do. People pick up on it very quickly and realise, whoa,

(26:52):
I've got to be careful. I don't want to lose all my points. I'm going to stick to what I genuinely feel, rather than exaggerating my confidence.

Jean Gomes (27:01):
That kind of metacognition, the ability to actually understand your confidence around what you do and don't know in a situation of uncertainty, is incredibly valuable, and it could be applied in lots of different circumstances.

David Spiegelhalter (27:18):
Oh, it just applies in your normal life.

Jean Gomes (27:20):
What else could we do to increase that, do you think?

David Spiegelhalter (27:25):
I think a quiz like this could usefully be built into the school curriculum. They have used this sort of probabilistic, confidence-based marking for questions, rather than multiple choice: you have to say how confident you are in your answer. I think it's a very good way of marking people's knowledge, because if you know

(27:46):
you don't know, you don't get marked down as much as if you just stab at an answer.

Jean Gomes (27:54):
Where in your personal life do you find you might not necessarily always use that yourself?

David Spiegelhalter (28:00):
Yeah, I don't like it in my personal life, because just saying all this stuff doesn't mean I'm not hopelessly optimistic. That's the problem, and I know it; at least I do have that insight. And so I'm a positive liability when it comes to making decisions in the face of uncertainty. I'm countered by my partner, who's very good, much more down to earth: oh no, we will take more time than that to do something.

(28:24):
And she pulls me back from my gross optimism about everything we can manage to do in the next three hours, or something like that.

Jean Gomes (28:30):
Well, it's heartening that you're not perfect.

David Spiegelhalter (28:35):
No, exactly. But on the other side, what it shows is that in some ways people are different about different things. When we go on holiday... When you go on holiday, do you plan everything? Do you know where you're going to stay each day?

Jean Gomes (28:47):
And what I'm going to do? No, my wife wants to plan everything. I don't.

David Spiegelhalter (28:52):
Okay. I was different. I used to plan every
holiday, and, you know,meticulously and where we're
going to go and everything I wason guidebooks and everything we
know we have to see this. Thisis where we're going to stay.
Anything for that? My partner,would not, didn't want to think
about anything until she was onthe plane, if then, and we
reached a compromise. And now wego sort of backpacking in Asia,
and I'm allowed to book thefirst two nights, and that's it.

(29:18):
And then we make it up as we go along. It's chaos, and I've learned to deal with it; I think it's all right. At first it seemed absolutely horrific, but I think it's because I'm with her — she's a really good, solid travelling companion — that I did it. So I've managed to change and become less of a planner and more open

(29:39):
to the uncertainty, the possibilities, and it's great, it's wonderful — but it did take a while.

Scott Allender (29:45):
Let's build on that. You said your relationship with travel has changed. Is that the primary way, or what else have you learned about coming to terms with doubt and uncertainty? You wrote about it in a recent piece — I just wanted to flag that.

David Spiegelhalter (29:59):
Yeah, yes, exactly. I felt this quite strongly because, like my father, I used to suffer from what he called travel fever — apparently there's a term for this in other languages — which is when your anxiety about what could go wrong, what might happen, your uncertainty, becomes so much that it paralyses you. He had to stop travelling, which was such a shame, because he was a good traveller, but he just

(30:21):
got so, so anxious about it. And this is, you know, a standard thing: there are standard intolerance-of-uncertainty scales, and if it gets high enough, people have a clinical condition — they just can't cope without being absolutely sure of what's going to happen at every stage. And I started feeling I was suffering from this a lot. I couldn't sleep for days before

(30:43):
going on a trip. I couldn't do anything; I was in a terrible state. And I went to a psychotherapist, and it was really valuable. I've since found out that what she did with me is an absolutely standard bit of cognitive behavioural therapy, but I thought it was great. She just said: okay, don't try to deny the feelings. Feel the feelings, but then just say to yourself, these feelings are

(31:04):
identical to what I would be feeling just from the excitement of a really exciting, fun thing about to happen — you know, the butterflies. And so you're just trying to recode in your brain that sort of anxiety into excitement. I now know it's completely standard, but I think it's worked to some extent. I give myself a little bit of a talking-to, and I think it's

(31:26):
worked to some extent. So what that shows is: the situation is just as uncertain — I haven't changed what I do at all, not in the least. The actual chances of stuff happening haven't changed one bit. It's purely the way I respond to it that one can change. So there's

Jean Gomes (31:47):
a good reason why your book is called The Art of Uncertainty: it's not just a book about statistics. It's also about recognising, as you say, that the map is not the territory. Can we talk about that a bit?

David Spiegelhalter (32:02):
Yeah, it's so important, because I'm a mathematical statistician. In my life I've done huge amounts of data analysis and modelling — using packages, writing packages, doing all this stuff — where you are trying to tame uncertainty, to get it into the form of an uncertainty interval that you can calculate about a quantity

(32:23):
or about the future or something like that. And I'd done that and taught it all, and it's great — but it's all wrong. Well, it's not really fair to say it's wrong, because that sounds like you can throw it out. No: it's deeply limited. And that's really why my book is called The Art of Uncertainty — it's not a science of uncertainty. It is

(32:43):
not an algorithmic process to go through, although I've taught it as that: work out what the decision options are, what the possibilities are, put your utilities on them, set the probabilities, calculate the subjective expected utility, and there's the right decision to take. I've taught that for years. That's nonsense — well, it's not nonsense, but it's a sort of idealised

(33:04):
package that really is not going to work in most circumstances. So a lot of what I now teach and talk about is trying to quantify as much as you can, while being fully aware that you can never fully do it. I really love statistical analysis; I like good models and things like that — but none of them are right. You know, it's an old saying: all models

(33:24):
are wrong, but some are useful. And you really have to get that into your mentality. So every time I see a calculated interval, a p-value, all these things, I just know they're all wrong — because they're all assuming the truth of the model; every assumption in the model is taken as true, and, you know, this is definitely not the case. Now, they may not be importantly wrong, but they're just not
(33:46):
right. They're not a fact about the world. And this is, I think, quite humbling and very valuable, because it gives you a degree of humility about any analysis you might do. That's why, as I said, there's the simplest thing of flipping a coin: you ask people what the probability of a head is, and they say 50–50, and I just say, sorry, you're wrong — because you

(34:07):
trusted me, and I've got a two-headed coin. So you can be caught out just by not understanding something about the problem, and that can mess up everything you've done, even though you've confidently claimed this analysis. And one way to contribute to that humility, I think, is something I now do and recommend, and many people do,

(34:30):
which is, as well as doing a quantitative analysis and calculating the estimates and the intervals — all the stuff that comes out of the packages — actually to look inside yourself and say: well, how confident am I in this analysis? Do I actually believe this stuff? How good is the evidence? Is the data just ropey? Am I suspicious about where it came from, or the

(34:53):
design of the study, etc., or about my understanding of what goes on in the world? To build a model there are all the distributional assumptions you have to make, the independence assumptions — just think what all the things are that could go wrong. In some situations you're very confident; you really feel you've nailed it, you understand it. Think of car insurance or something: you've got

(35:14):
vast amounts of data, you've got a lot of information, nothing can go that wrong, and so you can have high confidence in your analysis. But if you're doing insurance on, say, spacecraft — becoming more common now, but where really little has happened before — you've got to admit that you haven't got much confidence in all your calculations. And similarly, I

(35:34):
mean, during covid, for example, when the scientific committees were giving advice to the government, they qualified all their judgments by how confident they were in their analysis: low, low to moderate, moderate, moderate to high, and high confidence that the data we've got can answer the question we've been asked. And I now do that absolutely routinely, and recommend that

(35:57):
everyone does a sort of meta-view of their quantitative analysis — what you can also call indirect uncertainty. Not the direct uncertainty about the quantity or what's going to happen, but your indirect uncertainty: all the other stuff, which you might normally just put as little caveats at the bottom of some report. But that's completely

(36:18):
inadequate. You should have that upfront: actually, we don't really know what's going on — or, we really do know what's going on.
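The textbook recipe David describes — list the options, put utilities on the possible outcomes, set subjective probabilities, and pick the option with the highest expected utility — can be sketched in a few lines. The options, outcomes and numbers below are invented purely for illustration; as he stresses, in practice every one of these inputs is a judgment, not a fact about the world:

```python
# Subjective expected utility, the "idealised package" described above:
# for each option, sum utility x probability over the possible outcomes,
# then choose the option with the highest score. All values here are
# made-up illustrations -- each one is itself an uncertain judgment.

options = {
    "take umbrella": {"rain": 0.7, "dry": 0.4},   # utility of each outcome
    "leave umbrella": {"rain": 0.0, "dry": 1.0},
}
p_outcome = {"rain": 0.3, "dry": 0.7}             # subjective probabilities

def expected_utility(utilities: dict) -> float:
    """Sum of P(outcome) * utility(outcome) for one option."""
    return sum(p_outcome[o] * u for o, u in utilities.items())

scores = {opt: expected_utility(u) for opt, u in options.items()}
best = max(scores, key=scores.get)
print(scores)  # {'take umbrella': 0.49, 'leave umbrella': 0.7}
print(best)    # leave umbrella
```

The arithmetic is the easy part; Spiegelhalter's point is that the real (indirect) uncertainty sits in whether the listed options, outcomes, utilities and probabilities capture the situation at all.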

Sara Deschamps (36:27):
Welcome back to The Evolving Leader podcast. As always, if you enjoy what you hear, then please share the podcast across your network and also leave us a rating and a review. Now let's get back to the conversation.

Scott Allender (36:41):
I'd love to get your thoughts on what might need to change in organisational cultures to allow leaders to articulate the level of uncertainty they might be feeling. Because I'm thinking, as you're talking, about the pressure that's on people to give the appearance that they know everything, right? Stakeholders and investors and your boss — they want to know that you know. So there's almost

(37:03):
this externalised pressure to over-articulate your confidence in a decision, where actually the real strength might be in being able to articulate clearly the level of uncertainty that's a part of the decision. What are some thoughts you might have for leaders listening right now?

David Spiegelhalter (37:21):
Yeah, no.
It's very common, because I talk to people in business and people in government, and they all say: yeah, but my boss, or my minister, just wants to know what's going to happen. He doesn't want all this hedging, or options laid out in a series of tightly defined, calculated scenarios; essentially he wants the uncertainty to be taken away out of

(37:42):
the problem — that bigger uncertainty taken out of it. And I just say: well, what are you going to tell them? It's partly because they don't want to have to admit it, and they almost don't know what language to use when explaining it to others. Politicians are hopeless: they cannot admit uncertainty. They have to be absolutely confident about

(38:02):
everything they do, about every decision. And it paints them into corners they can't get out of. With covid, you know, they couldn't admit that they didn't know what was going on. So they made all these recommendations that they then couldn't go back on, even when they were found wrong quite quickly — wiping surfaces and all this stuff. We knew within a couple of months that this was all a waste of time, but no, they couldn't say that —

(38:24):
people were still wiping surfaces a year later. They couldn't go back on it, because they had been so confident. They had no idea of provisionality — that what we're saying may be fair now, but we're going to change our minds. Some people who get to a certain level find that completely impossible to do. It does require confidence, and it does help if

(38:45):
you're already a trusted person — you can then admit uncertainty more. Because the research we've done, and not just us but others too, has shown that if you do it in a sense of confident uncertainty — if you bring people into your confidence and say: well, look, we know a lot, but we don't know everything, and there are these things we don't actually know yet; we're doing our best to find out, but we don't know yet; in the meantime, you

(39:07):
could do this, but our advice will change — that is what I would call trustworthy communication of uncertainty, and it's very difficult to find among senior figures. You could see it during covid — some senior scientists could do that — but among politicians it's extremely difficult. They don't seem to have a language for it. Partly, I think, they have a myth.

(39:29):
There's this myth that, oh, if we admit uncertainty, nobody will ever trust us any more; the public will just reject us. It's completely wrong. The evidence we've got — not just us, but from others too — is that if you do admit uncertainty and give a balanced opinion — well, there are benefits and harms to this policy, but on balance we decide this is the best thing to do, though, you know, this

(39:50):
isn't going to suit absolutely everybody, and there's some uncertainty — if you give that balanced opinion, and the evidence here is very good, then people who are really sceptical of you trust you more. And that's reasonable, because you're addressing their concerns: you're saying what they were concerned about, rather than just saying, no, what you're saying is nonsense, vaccines are safe and

(40:11):
effective. If you say instead: all vaccines are safe enough and effective enough to use on some people in some circumstances, but not everybody all the time, and there are still some uncertainties — if you say that, the sceptics trust you more. Which means — and this is what's interesting — that when governments do use what you might call the deliberate, persuasive message —

(40:33):
'vaccines are safe and effective and you should take them' — they are actively decreasing trust in the group they're trying to reach. They're making it worse: the sceptical people trust them even less. And they don't, on the whole, have a language for doing otherwise. That's why I push all the time for trustworthy communication — a language for expressing

(40:54):
uncertainty. And it doesn't mean saying, well, we don't know anything, or your guess is as good as mine — no, you can't do that. It is terribly important, I think, for trustworthiness in a society which is increasingly becoming polarised, with untrustworthy statements coming from both sides — from all sides, to be honest — to hold that middle ground and actually try to realise that

(41:16):
there's a balance, there are winners and losers; the issue is not absolutely black and white. To take that position is becoming very difficult, and yet I think it's even more crucial to be trustworthy.

Jean Gomes (41:29):
We've kind of talked about this, but not using this word: luck. Where does that fit into your thinking?

David Spiegelhalter (41:36):
Oh, I like luck. I think it's great. I don't think it exists out there — it's not some external force; the goddess Fortuna is not watching over us, either rewarding us or punishing us. I don't believe in that. But I think it's a good term for describing, retrospectively, things that happen to us that are unexpected, that are out of

(41:57):
our control, and yet have an impact, either good or bad: we're on the wrong plane at the wrong time, or we happen to win on a lottery ticket, or something like that. So I like it as a concept, because it has those characteristics. And it's one of the aspects of living with uncertainty: things will happen to us, because we're not

(42:19):
in control of our lives most of the time — only to a limited extent. So things are going to happen to us that we didn't expect, and I think it's valuable to be able to identify that, to acknowledge the lack of control we do have. Many people have said this before: we've got an exaggerated sense of the amount that we control our lives and are personally responsible

(42:42):
for the situations we're in. Philosophers have really identified this with the idea of constitutive luck, and I love this. It's the luck of just who you were born as: your ethnicity, your family, your time in history, the place, your siblings, the situation and community you're born into, your genes — things over which you had

(43:05):
zero control. Your parents — you don't choose your parents, and, you know, they rarely choose you. You've struck a nerve there, actually. Well, in the book — for the book — I researched my own

(43:27):
conception, which I think stillfeel slightly uncomfortable.

Jean Gomes (43:32):
You weren't going to make this personal!

David Spiegelhalter (43:35):
I know, I know. So it really changed my perspective on my life, realising the bizarre chain of circumstances that led to me being here, and how easily I might not be here. It's a bit humbling to realise how amazingly easily you might not have been here. So that's constitutive luck, and that's

(43:55):
hugely important: who I was born as, where and when — it's affected my life all the way through. And then they've identified circumstantial luck, which is being in the right place at the right time, or the wrong place at the wrong time — being on the wrong plane at the wrong time, or something like that. And then outcome luck: how things happen just at that moment, how it happened to work out for you, good or bad. And, yeah, I think

(44:18):
it's a good way of taking apart our experience.

Jean Gomes (44:20):
Well, when people look at their decisions and do the post-mortem on them, how often do they conflate luck, good or bad, with the decision itself?

David Spiegelhalter (44:32):
Yeah. I mean, I think it is quite reasonable to say that, well, it ended up badly, but that doesn't mean it was a bad decision. And this is particularly the case in health decision-making. I've got prostate cancer, so, you know, I've been in that situation, trying to make decisions about my treatment, and the anxiety is that, whatever the

(44:53):
decision, I can't control what's going to happen — otherwise it'd be quite clear what to do — and you can't, because there's so much uncertainty. But research has shown, and I feel it myself, that a big fear is the fear of regret: oh my God, if only I'd done that, this would have been all right. In other words, if only I'd known

(45:15):
better at the time, if only I'd thought of that, if only I'd really considered the decision more. Because the evidence, I think, is that even if you make a decision and it happens to go badly, that's bad luck — your tumour coming back is luck, really. But if you just didn't do everything — you didn't

(45:37):
take the steps, even the minimal steps, to improve your chances — then actually you could say: well, I can regret that; I could at least have made my chances better if I had done X, Y and Z. So I think the fear of regret is very strong in ourselves as individuals. And I think

(45:58):
what that illustrates is that you don't regret things so much if it just didn't work out — if it's just bad luck, well, I did the best I could. And that is, I think, deeply reassuring, because we can't control everything that happens to us; we can't make sure everything turns out well. We just don't want to be kicking ourselves.

Scott Allender (46:19):
So you've touched on a lot of this already, but as we near the end of our time: how can we better prepare ourselves as leaders for a more uncertain world?

David Spiegelhalter (46:29):
Yeah. I've learned this recently — it's not in the book — which is that people in therapy, family therapy, have developed this idea of safe uncertainty. What they found is that people used to go to therapists in a state of what they call unsafe uncertainty.

(46:51):
Oh my God, I don't understand what's going on; I don't have any control of my life; I'm in a state of anxiety, panic attacks and so on — unsafe uncertainty. And they would go to their therapist with the illusion that the therapist could cure them, could move them into a situation of safe certainty: everything's sorted, I have solved your problem, and

(47:14):
they could walk out all cured and solved. Now this, of course, is a delusion in therapy. And so there's this move, which I think is really clever: you shouldn't even pretend that you can put someone in a situation of safe certainty. What you want to do is try to move them into a situation of safe uncertainty. And that means we can't solve the problem

(47:36):
completely. There'll still be things you're not in control of; stuff will happen that you didn't expect. However, what we would hope is that through the therapy you have now built the resilience to be able to deal with that, in a way that it's not unsafe any more. And this is now also used in child protection and all sorts of places, as

(47:57):
a recommended idea. And I really like this. When I talk to analysts who are advising ministers, I say: don't ever let them think that you can put them into a situation of safe certainty. You are their therapist, really — they want help, and they expect to be told exactly what to do or

(48:18):
what's the case. Don't ever pretend that you can put them into that situation. What you want to try to do is move them into a situation of safe uncertainty. In other words: we've done our best; we've thought of the things that can happen; we've hedged the best we can. It's not completely safe, but we do now feel we're resilient to what the world might throw at us — up to, you

(48:40):
know, absurd levels. Maybe not an alien invasion, but come on, we've got to draw the line somewhere.

Jean Gomes (48:46):
I love that. At the end of your book you finish with a manifesto for uncertainty, which helps us recognise that it's our personal relationship to the world that defines our perception of uncertainty. Can you unpack that a little for us?

David Spiegelhalter (48:58):
Yeah, it's something I feel quite strongly, actually, from my background in Bayesian statistics, which is based on the idea that probability doesn't exist — it's not out there as a property of the world; it's just an expression of our personal relationship to the world. And this goes along with the idea of uncertainty being a conscious awareness of ignorance. It comes back to the self, to ourselves, as a relationship. And I think it's really helpful to realise that we have to

(49:21):
live with it. We can even learn to enjoy it in some circumstances — a bit of risk, a bit of fun, a bit of uncertainty; we like it. But we do have to have the humility to realise we cannot totally control everything; we can't optimise everything; we can't model everything. We can have a go.

(49:43):
That doesn't mean we shouldn't try, but we can never actually achieve it. And so it requires a sort of boldness — to really try to imagine all the things that might happen — but never to think you've actually done it, because you always have to have that built-in agility to switch when things you didn't expect

(50:05):
happen to you. And for me, writing this book changed my life completely. So I kind of feel that these lessons — I mean, it's not a self-help book, but I'm making it sound almost as if it is, because I actually think it is quite useful to have some of these ideas in your own personal life. But actually, you
(50:27):
know, I feel it's also quite appropriate to organisations as well.

Jean Gomes (50:31):
Yeah, no, I mean, it definitely is. And I think, you know, at the heart of every self-help book is changing the relationship itself. So you are doing that in a very different way from the typical self-help book, but it is very powerful from that perspective.

David Spiegelhalter (50:46):
Thank you. That's really kind, although I keep on having to tell my publicist it's not a self-help book. It doesn't matter much, because people have bought it, I think, thinking: oh, this will tell me how to live my life in the face of uncertainty — and then they see the binomial equation, and they go into a state, yeah.

Jean Gomes (51:04):
Well, Scott and I are avowedly not mathematical, and we had Marcus du Sautoy on the show, and he tested us — oh no — and we failed very badly.

David Spiegelhalter (51:14):
I wouldn't do that! Then I'm even more pleased that you actually enjoyed it.

Scott Allender (51:21):
Yeah, absolutely. Thank you, and thank you for joining us and for sharing some of the wisdom from that book. It's such a rich conversation. Really, really great. Thank you.

David Spiegelhalter (51:31):
It's huge fun. Thank you so much.

Scott Allender (51:33):
And folks, if you haven't gotten his book, The Art of Uncertainty, do that now. I can't guarantee that you will like it, but I'll give it a probability of nine out of ten that you will. So do it. And until next time, remember: the world is evolving. Are you?