
November 19, 2024 100 mins

On this episode of Revelizations I have the privilege of sitting down with two experts on decision making, Ken Smith and Tessa Mudge. We discuss metacognition, the importance of decision making, and the role cognitive biases play in helping our brains reach decisions. We go into depth on multiple cognitive biases, including anchoring bias and confirmation bias; the heuristics humans use to aid decision making; the pros and cons of cognitive biases in decision making; whether we should use cognitive biases to manipulate people if it is in their and society's best interest; and more. Grab your favorite snack, grab a seat, and enjoy today's episode of Revelizations with Ken Smith and Tessa Mudge. Thanks for listening everyone.

Learn more about Ken Smith and Tessa Mudge: https://goodbetterright.com.au/about-us/

Contact Ken Smith and Tessa Mudge: https://goodbetterright.com.au/contact-us/

Additional resources Ken Smith and Tessa Mudge offer, including some of the books they mention in this episode and a free downloadable PDF on how to identify 8 different cognitive biases and mitigate their influence on decision making: https://goodbetterright.com.au/resources/

Enjoying Revelizations and don't know what to do next? Let me offer a suggestion: Grab a device capable of playing a podcast along with some earbuds, turn on an episode of Revelizations, place them in the ears of your loved ones, and watch with joy as they thank you endlessly for introducing them to the Revelizations podcast. While you're at it feel free to leave a review on whatever platform you're listening on and follow/subscribe so you never miss an episode.

Not enjoying Revelizations and don't know what to do next? Let me offer a suggestion: Grab your loudest portable speaker capable of pairing with a device that can play a podcast, turn on an episode of Revelizations, go to a densely populated area with great acoustics, crank up the volume, and laugh maniacally as the unsuspecting population looks around in confusion at the situation they are in. While you're at it feel free to leave a review on whatever platform you're forcing everyone to listen to the Revelizations podcast on, and follow/subscribe so you don't miss these types of opportunities in the future.

 

Thanks to today's sponsor: Achoo-2U

Be sure to use code Revelizations at any and all checkouts. Probably nothing will happen, but it is worth a shot.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Today's episode of Revelizations is brought to you by ACHOO-2U.

(00:15):
Does this sound like you?
You hesitantly open up the medicine cabinet to only see that it remains full of medicine
that's unopened and unused.
You check the mail, but snubbed again.
No get well soon cards waiting for you.
To add insult to lack of injury, you're indignant that yet again you got an uninterrupted full

(00:39):
night's sleep.
Not once did you wake up in a coughing fit or get abruptly woken up not knowing if you'd
be able to make it to the toilet in time.
You laid your head on the pillow, closed your eyes, and next thing you know it's morning.
Monday morning even.
You work hard, not because you want to, but because the common cold is an uncommon

(01:00):
occurrence in your life.
Let ACHOO-2U remedy your lack of health problems.
At ACHOO-2U, we keep a healthy staffing of unhealthy staff at the ready to be deployed
at your beck and call.
With ACHOO-2U, we'll come straight to your door and just like the name assures, we deliver
a pathogen-stuffed human being straight to your residence.

(01:25):
ACHOO-2U specializes in sneezing, coughing, even drinking out of the same coffee mug with
the goal in mind to ensure that you get sick.
Within three to five business days, we guarantee you'll be so sick that you'll question why
you even used our services in the first place.
ACHOO-2U, never let your sick days go to waste again.

(01:50):
Hi everyone and welcome to another episode of Revelizations.
My name is Brian James and I am thrilled to present you with today's episode.
Before we get into it, I would like to do a little housekeeping.
First and most importantly, know that I am absolutely floored and honored that you are

(02:13):
listening to my podcast.
I hope you are enjoying the episodes as much as I am enjoying creating them.
If you find yourself thinking, Brian, my life has been changed dramatically for the
better ever since discovering your podcast.
How can I ever possibly repay you?
Know that the simple act of your oath of undying loyalty to listen to each and every episode

(02:37):
in this life or the next is enough for me.
I guess I am simple like that.
However, should you find yourself still wanting to do more, you can help me and my podcast
by telling people you know about the Revelizations podcast.
Word of mouth is a powerful tool and I would be forever grateful if you let absolutely

(02:58):
everyone you come in contact with know about my podcast.
Or perhaps just the people in your life that you think would enjoy it.
Also, a review on whatever podcast app you are listening to my podcast on is very helpful
as well.
Next, consistency is not always key.
In these first few interview-style podcast episodes that I am releasing, the sound quality

(03:21):
on my end is not where I want it to be.
Know that I am actively working on improving that.
The good news is that today I am interviewing two guests, so you'll spend more time hearing
them than me.
Today I am speaking with Ken Smith and Tessa Mudge.
They are both wonderfully intelligent, articulate, and kind people with a passion to help people

(03:44):
make better, more informed decisions.
Ken and Tessa are both qualified teachers and trained information analysts.
They host their own podcast called How to Choose, which has further enabled them to
make captivating episodes on various topics around the subject of making decisions.
Their podcast has provided a great opportunity for them to indulge their love of teaching

(04:08):
and learning as they continue to try to understand and explain how humans can better manage the
many and varied decisions we have to deal with each day.
You can find their podcast How to Choose on their website GoodBetterRight.com.au and of
course on podcast platforms like Spotify, Apple Podcasts, and pretty much wherever else

(04:31):
you go to download podcasts.
Their website GoodBetterRight.com.au is also a great resource to learn more about them
where you can go to contact them and where you can find additional information on the
wealth of topics they discuss on their podcast How to Choose.
Put on your podcast swimwear as we dive into the deep world of cognitive biases with Ken

(04:56):
Smith and Tessa Mudge.
Thanks for listening everyone.
Hi everybody and welcome to another episode of Revelizations.
I am your host Brian James and today I have the pleasure of talking with two guests from
literally the opposite side of the world, different hemispheres.
With me today I have Ken Smith and Tessa Mudge.

(05:18):
How are you guys doing today?
Yeah, very well.
Thank you and thank you so much for having us.
Yeah, we're excited to be here.
The pleasure is absolutely all mine.
So starting with ladies first, if you could tell me a little bit about yourself.
And I was talking with my wife and she's like open-ended questions make me really nervous.
I don't really like when people do it.

(05:39):
And I was telling her I kind of like it.
I don't like when people do it to me, but I like being the person saying it because
I like hearing what's important to you guys.
Like when you guys talk about yourself, I don't want to lead you guys at all.
But no extra pressure because now it's like, oh my God, now I have to say what's important
to me.
Just whatever comes to mind, whatever is the first thing that you guys want to talk about.
But I would just love to know you guys just a little bit more before we jump into the

(06:00):
episode.
Well, if we go sort of back to, you know, origin stories of Tessa, I would say that
I've always been a big nerd, even when I was, you know, in school.
And that thread has continued throughout my professional life.
So a love of learning, a love of ideas and curiosity about the world has been a really
common thread, you know, from when I was a little kid all the way through now.

(06:24):
And I think I found a common nerd in Ken, which I'm very grateful for.
If we sort of skip forward, I ended up doing journalism at the university level and really,
really enjoyed the ability to research and understand different ideas and sort of the

(06:44):
diversity of knowledge that you get access to in that profession.
And then I also went into teaching for a while, which Ken and I share.
And similarly, there was also a love of learning and exploring new ideas and the way people's
minds work is something that teaching actually really gets you a chance to explore.
But the first time that I really got into this field, I guess, of decision making and

(07:08):
metacognition, so thinking about the way we think, which is really what our podcast is
about, was when I became a civil servant and I started working as an analyst more than a
decade ago. And Ken and I actually worked together.
So Ken was actually my boss at the time, you know, amidst crises, because we were in a

(07:31):
pretty high tempo area where the world was falling apart and we were having to make pretty
big judgments about how things were going to turn out in war situations.
Basically, on the sidelines over cups of tea, we realized that we both had this really deep
enjoyment for thinking about thinking and just about how we can make great decisions, how we

(07:54):
can cut through bias, how we can make the best assessment or the best judgment about the
world or about ourselves possible.
And we've kind of come full circle now, you know, more than 10 years later: Ken and I are both the heads of our respective organizations' learning and development areas.

(08:15):
So it's really our jobs now to look at best practice and to teach other professionals about
analysis, writing, biases and all of these kind of skill sets that we inherently have just
enjoyed for such a long time.
Yeah, and I can jump in.
I'd like to think I was a benevolent leader, Tess, I don't know if that's how you remember.

(08:37):
Definitely, Ken.
But no, as Tess described it, we've been working together and been friends for quite a period of time. And look, like Tess,
I am a naturally very curious person and I don't know if those characteristics are innate,

(08:58):
inherited or whether there's a bit of an environmental aspect to it.
I think learning was really important in my family's environment as well.
You know, I am the youngest of four siblings, and my siblings all sort
of went on and did tertiary study of different kinds.
I had a father who's passed away now, but was a very curious thinker as well.

(09:21):
And in our family, particularly in later life, perhaps not so much when we were younger, but as he got a bit older, he modeled a willingness to change his mind.
And that, I think, was a very, very powerful role model for me.
And I think that links very much into the kind of things that Tessa and I talk about.

(09:42):
So I've done a lot of different university study.
I've got degrees in different disciplines, which probably reflects the fact that I study
something and then I become curious about something else and study that. But, you know, I think the thing, too, is I'm very interested to understand how people

(10:05):
think and make decisions.
And part of that comes, I think, from a concern when I see people making irrational judgments
and decisions and whether that's a demonstration of prejudice towards other people or
whether that's just uninformed judgments about the world around them that then leads

(10:28):
people to make poor choices.
Those are things that I'm very interested in, but very concerned about.
So I think we would see the podcast as not just an exploration of topics that we're
intellectually curious about, but, and this sounds a bit grandiose, almost as a bit of a mission, or at least a bit of public service, because I think there

(10:52):
are things that I wasn't aware of that I've become aware of through doing the podcast
and through the opportunities that I've had.
And I think it's really exciting when we're able to share those things.
You don't get a lot of feedback through podcasting.
And I've heard people who have very successful podcasts say the same thing.
But when you do bump into people who've had a chance to have a listen to the show and

(11:14):
who've said, yeah, that was really helpful, I had a little light bulb moment there, then
that, I think, is what keeps me going.
So, yeah.
So I guess that's probably a bit of insight into Ken.
Actually, there's so much there. It seems silly.
Did you call it metacognition, Tessa?
Yes. Yeah, that was a big word that you dropped first thing in the morning.

(11:34):
It was really impressive.
But yeah, it's such a good way to say it.
It's just you're thinking about thinking.
And we'll get into this in a little bit.
But you talked about how we have these biases and we haven't even thought about this decision-making process. We don't think about it; we just make decisions.
And you said something really insightful, Ken, in, I think, one of

(11:59):
the beginning episodes of the podcast.
You said strong decision making is critical to living a fulfilled and purposeful life.
And it sounds silly, but almost nothing happens to us that is not the result of some sort of decision.

(12:20):
And to be mindful about the decisions that you're making: why am I making this, what influences are impacting this decision? Did I get enough sleep? Am I hungry?
And there's so many studies out there about the impact

(12:41):
of decisions. I mean, even prison sentences are impacted by the human physiology of being
hungry or not. Like after lunch, I'm pretty sure judges are a little bit more lenient
with their sentencing. That's crazy.
Like, how have we not done more to just protect ourselves against

(13:03):
ourselves in that way?
So on judges, it's frightening.
I was just going to say that there have been many studies on judges when it comes to decision making, because they offer a really clear kind of black and white thing where you can actually look at the decisions. You can look at even things like whether
their football team has won over the weekend, the weather, whether it's kind of cold or

(13:26):
it's a nice sunny day, all of these things impact judges sentencing.
And it's not to say that judges are more fallible.
They're just more easily studyable. So I think it's one of those things: they reflect all of our fallibilities as humans.
Yeah, it's great to be able to put someone who literally makes decisions for a living under a microscope and say, hey, are you actually black and

(13:52):
white with your decision making?
If the exact same case is presented in front of you 10 out of 10 times, are you giving
the same ruling every time?
And you just said, no, if it's a cloudy day, that doesn't bode well for you.
Yeah, it's horrifying to think about, isn't it?
And I really enjoy watching true crime documentaries.

(14:14):
Obviously, there's so many of them now when you get onto your streaming services.
But my daughter particularly has kind of drawn me into these.
So we'll sit and watch these.
And I spent half the time just kind of talking.
I annoy my daughter because I'm always talking.
This is so unfair.
This is so wrong.
What's going on here?
And with some of these, obviously, we understand the deliberate corruption that sometimes

(14:38):
underpins these wrong judgments.
But some of it is, yeah, it's just those little things that people aren't even conscious
of and different ways that we can be manipulated.
And I think, you know, there's so many components to it.
I think that's what's drawn us to this as a topic.
It's just, you know, they estimate that we make 30,000 decisions a day or something

(15:00):
like that. Now, you could argue, are these really decisions that, you know, whether I
pick up my toothbrush with my right hand or my left hand, you know, it's all these kind
of things. Most of these are trivial.
But as you pointed out, Brian, there's a lot of these things that we do, you know,
without being conscious of it.
And I think about it with driving, because, you know, I commute to work and I

(15:25):
commute at a time when a lot of what we call in Australia 'tradies' (tradesmen, young apprentices, builders) are on the road.
Yeah, I can see you smiling.
There might be a similar phenomenon.
No, I just love that.
I've never heard 'tradies' before.
I love it.
So the tradies are not renowned for being the most careful, patient drivers.

(15:48):
And I'm sure there are some listening who will take offense at that.
But, you know, I see young drivers, let's put it this way, young drivers in their
late teens or early 20s.
And we know that physiologically, there's part of your brain that is not fully
developed at that point.
Your frontal lobe is still developing, which has a big impact on your ability to
make good judgments.

(16:09):
But, you know, I watch these drivers and I'm thinking and I have my own emotional
reaction to that.
And I'm constantly reminded of the fact that as soon as my emotions start kicking in and affecting my judgments, it becomes very hard or impossible
for me to think clearly and consciously.
So all of a sudden, you know, we see how things escalate in conflict situations

(16:32):
where people clash. And it might not be a physical conflict, it might be something as
simple as an argument about a point of view, a political issue.
All of a sudden, we're not thinking carefully anymore when we're reacting
emotionally. And that's where a lot of these different biases kick in.
You mentioned a really good point that people who are researching decision making,

(16:55):
they've estimated, I've seen, 30,000 to 35,000, and you mentioned a lot of them are trivial. I can't recall where I heard it, but because decision making isn't necessarily a tangible thing that we see, we forget there is a reservoir in our brain of how many good decisions we can

(17:17):
make. And each time we make a decision, we dip into that reservoir.
And so when you're at the end of the day, on your thirty-four-thousandth decision, even though so many of them have just been 'do I stay dressed or not? Yeah, I'll stay dressed for now,'

(17:38):
then a big decision comes along. You're like, man, I just don't have it in me.
And so you fall back on, I don't know if we've explicitly said it, but these
cognitive biases that will help us make decisions.
And that's where I would like to spend a lot of our time: talking about these heuristics, these decision-making shortcuts, where

(18:05):
we don't really always know why we made the decision, but we're able to make a
decision. So understanding cognitive biases will help us understand decision
making a little bit better and hopefully on the other side of it, make better
decisions. So you guys, in your podcast, I believe you covered eight different
cognitive biases. And so doing a little bit of research, I found that there's an

(18:30):
estimated 150 to 181 different cognitive biases out there. And I think a lot of that is people calling the same thing by different names
or something like that. But I just want to say how grateful I am that you guys are
willing to sit down and have a podcast conversation with me and not get up until
we have talked through every single one.
I'm just extremely appreciative and grateful.

(18:54):
We're going to have to take a few breaks.
We're very happy with that.
Yeah, well, why don't we talk about a couple of those, because there's some that
I think have really lodged in our minds as being particularly important.
And as Tessa mentioned, we're both in training roles now.
So I teach this and Tessa probably does as well.

(19:15):
We teach these to people.
So one of them that I find is particularly widespread and I'm very aware now of how
susceptible I am to that is confirmation bias.
So confirmation bias is that tendency to seek out examples and information that
supports what we already believe and then to ignore or disregard or downplay

(19:40):
information that contradicts our beliefs.
And so it's fairly easy to see the problem here, right?
This is the opposite of open mindedness.
It's really saying I've made my judgment and I'm not going to think anymore.
In fact, I'm just going to find bits of information that seem to support that so
that my position becomes impregnable.
You can't challenge me on this one.

(20:02):
It sounds arrogant, doesn't it?
But there's no conscious arrogance in it.
Yeah.
But when you sit and do Tessa's metacognition on it, you realize, oh,
yeah, this is really flawed thinking.
If I was listening to someone say, this is how I'm basing my decision, I've already
made my decision and now you keep giving me pieces of information or I keep finding

(20:22):
bits of information that prove my point.
No, I've made my decision.
I'm staying with it.
Yeah, yeah, that's exactly right.
And I mean, we talk about cherry picking, right?
There's that that expression that we use in the English language, cherry picking
bits of information or bits of data that will support what we believe and ignoring

(20:42):
the bits that don't.
And the thing that's really interesting about this, particularly in the age of computer algorithms, is that computer algorithms are designed to feed
confirmation bias, because what they identify is the things that we are most interested
in, the things that we click on or the things that we linger on when we're scrolling

(21:05):
through our feed on Instagram and the things that we watch for the longest.
Those are the things that we get more of.
So there is an inherent programming that's feeding our confirmation bias.
And I think it streams us towards the things that we already believe and the things that we already like, and away from the things that we don't want.

(21:27):
And we've seen this at different times.
There's been criticisms of the big social media players about this, about the
algorithms that are kind of feeding people, whether it's feeding misinformation,
disinformation.
So I think confirmation bias is particularly pernicious and it's something that I'm
conscious of. We're all susceptible to it.

(21:47):
Right. And I think the thing I love with this, when I start feeling particularly
hopeless and bad about how susceptible I am, is I remember hearing an interview
with the late Daniel Kahneman, who is considered one of the fathers of this
field of study of cognitive biases.

(22:08):
And someone said, oh, Daniel, you've been studying this for 50
years. You must be really good at being able to identify these biases.
And he said, I am just as susceptible as everyone else, which I thought was
both discouraging and at the same time quite affirming to say that these are very

(22:30):
deeply ingrained.
And I think, something that you mentioned perhaps, Brian, as we think through all of these topics, all we can do really is probably just be aware that we are quite flawed in our thinking.
But the other is that ability to pause and say, OK, what's going on here?

(22:53):
You know, that we mentioned a few times that term metacognition to think about our
thinking and say, OK, is there something that I'm falling susceptible to?
So confirmation bias is one of the big ones.
Availability bias is a really interesting one as well.
It's defined as the tendency to draw on the

(23:17):
most easily available memories to form our judgments.
So, you know, I do this as a question in our training. I ask Australians: what animals do you think kill the most people in Australia?
And Australians will, these are smart people, right, and they know that they're

(23:38):
getting tricked and tested.
So they put a little bit more time into it.
But people will think about, you know, oh, OK, what have I heard about?
Right. So this is where they draw on their memory, and they realize it's probably not snakes, even though, you know, if you're living overseas, you might think there are snakes killing people all over the place.

(23:58):
It's pretty rare in Australia.
Crocodiles even, you know. The dramatic things are not the common ones.
And it's really interesting.
And I think we were chatting about this earlier, Brian, that it's not what you
think it's going to be.
And there's something interesting behind it.
There's two interesting things.
One is that when we think about what we remember, we remember things that are

(24:20):
dramatic and the dramatic things by definition are not typical.
They're not representative of what commonly happens.
So if you turn on the news and you hear a story about someone being killed by an
animal, it's going to be something dramatic because otherwise it's not newsworthy.
Right. You're not going to hear news stories about someone dying because they've
had a heart attack, because that happens all the time.

(24:42):
Actually, the most common thing that happens in terms of animal deaths in Australia is, I think, horses, number one. It's not that we breed particularly savage horses.
I don't know. Yeah, it's a different world over there.
But people fall off horses, right.
And but that's not a really exciting story.

(25:03):
Someone fell off a horse and died.
That's not newsworthy.
So availability bias, we've got to be really conscious that our memories latch
on to the dramatic and they don't latch on to the commonplace.
And therefore, our memories don't often give us a good representation of what
typically happens.
I'm trying to figure out the format of how I want to do this because confirmation

(25:26):
bias, I believe there's so much there.
Let's go back to confirmation bias and sit with it for a second, and realize, like you said, how it's impacting our thinking. We're not really considering it, because you don't know about it until you know about it. But take search engines: you can

(25:49):
even put 'deadliest horse attacks' or something like that into a search engine.
And just be like, no, I'm never riding a horse, because of what Google is pulling up. Look how crazy these animals are. They're vicious.
And in reality, yeah, the non-domesticated ones probably don't want
you hanging around with them.
But the domesticated ones, the ones that humans have learned

(26:13):
to live with and train, they're docile and they want to be around us. I think maybe we could talk about how insidious confirmation bias is, especially
with politics, and how you can really entrench yourself into just not changing your mind, and how dangerous that can really be for civilization,

(26:38):
for just talking it out together.
Yeah, you touch on a couple of really interesting things, Brian and Tess, if you
don't mind, I'll just jump in.
One thing that comes to mind is there's a book that I really loved, and I know we
talk about it in the podcast, and it's called The Scout Mindset by Julia Galef.

(27:00):
And Julia Galef essentially proposes a paradigm that she suggests will help us
address some of these problems that we have with our thinking and our decision
making. And the two different ways of approaching, thinking and making
judgments, she describes as either a soldier or a scout.

(27:20):
So a soldier, essentially their role is to defend a position, you know, so they're
defending against attacks and they're ensuring that their position is held.
Whatever happens, they're holding their position.
The scout, on the other hand, their job is to go out and develop a map of the

(27:41):
terrain. So Galef says, you know, when a scout goes out with their draft map and
identifies something that is not on their map, they don't say, well, I'm just
going to pretend that thing isn't there or I'm going to destroy that thing so
that it doesn't wreck my map.
What they do is they adjust the map.
So the map, then, is a reflection as accurately as possible of the reality

(28:07):
that they see. And there's something, I think, very powerful about that.
But in the book, Galef says something else.
She says, if we want to address this confirmation bias, this problem that we
have with our thinking and our tendency to want to just defend all the time,
Galef says, that is really exacerbated if we identify with a set of beliefs.

(28:32):
So, for example, if I say I am a flat earther, right, that then becomes
something internalized.
That's part of my identity.
I'm not a flat earther, just to clarify.
You're in good company.
Maybe there are flat earthers listening.
I highly doubt it.
But, you know, you can substitute in a whole lot of things.

(28:54):
Like I'm a vegan or I am a, and you can fill in your political position there.
I am a Democrat or I am a Republican.
That is an expression of identity, right?
So as soon as we are saying that... it's the same as if I say I'm Australian and you say, well, look, I don't like this about Australians.

(29:16):
I can't change that, right?
It's my identity.
So I'm naturally going to defend that.
Whereas if it's a belief system, it doesn't have to
become part of our identity.
You know, I can say I vote this way or I believe this, but that's putting the belief apart from us; it becomes an objective thing that

(29:37):
we can examine together and look at without becoming naturally defensive.
And something that we teach to our analysts is the value of something called an
argument map.
And an argument map is a way of visualising something you believe and you
put your belief at the top and then you put your supporting reasons under it.
It becomes like a picture of what you believe.

(29:59):
And the reason we teach that, there's a couple of reasons we teach that to analysts.
One is that it clarifies your thinking.
So when your thinking is in your head, often it's not particularly clear and
it's hard for us to see the gaps in our logic and our thinking, but when we
visualise it, we can see those gaps more clearly, but probably even more
importantly, it becomes an objective thing.

(30:21):
It becomes externalised.
So once we put that up on a whiteboard or on a piece of paper, we can sit
together as colleagues and look at that and critique it because it's no longer
part of me personally, it's not an attack against me if you see flaws in my
thinking because I've externalised it.
So even just that physical act of externalising our thinking, I think is a

(30:45):
way that helps us to avoid confirmation bias.
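As a rough illustration of the argument map Ken describes (the technique is from the episode, but this particular representation and all the example content below are a hypothetical sketch, not theirs), you can picture it as a small tree: the belief at the top, supporting reasons indented underneath, so the whole thing becomes an external object you can critique together.

```python
# A minimal sketch of an argument map: one claim at the top,
# supporting reasons nested under it. The claim and reasons below
# are invented for illustration, not taken from the episode.

def print_argument_map(claim, reasons, indent=0):
    """Print a belief and its supporting reasons as an indented tree."""
    prefix = "CLAIM: " if indent == 0 else "- "
    print("  " * indent + prefix + claim)
    for reason, sub_reasons in reasons:
        print_argument_map(reason, sub_reasons, indent + 1)

# Once the belief is written down like this, colleagues can look for
# gaps in the reasons without it feeling like an attack on the person.
belief = "We should adopt the new exercise program"
support = [
    ("Two colleagues reported good results", []),
    ("An online review rated it highly",
        [("But the review site may be sponsored", [])]),  # a visible gap
]
print_argument_map(belief, support)
```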
I'm sure we'll talk more about confirmation bias, but it's so difficult
once we are wedded to a belief, once it becomes part of who we are.
And you know, if you jump onto your social media, you can see people will
list these things about themselves in terms of their identity.

(31:05):
Once we are identified with a set of beliefs, it becomes very, very difficult
to be objective and to be open and to be scout minded about those beliefs.
Confirmation bias really entrenches us.
It's like, I don't want to say that it's insidious, but that's kind of the word that I keep going back to, because it's such a dangerous

(31:28):
one. It's just a way that we make decisions, but it's such a bad way
to do it.
And like you were saying, you even had your dad as an example: oh, I can change my mind.
I can be humble and realize that I don't know everything.
And in today's world of social media, or in politics, there's

(31:48):
a great quote by John Maynard Keynes. He's, I believe, an English economist, and he was being chided for changing his mind. And he says, when the facts change, I change my mind.
What do you do, sir?
And so you see this twist of, oh, he's

(32:09):
weak-willed, he changed his mind.
But then when he presents to you the correct way of looking at it, it's like, oh, that's the way I should be. But we're so far from that.
And it's almost dangerous to be like that as far as social media goes; social media is just its own insulated world. But if you change your mind on there, people who were your friends,

(32:31):
they're now trying to find out where you live.
And so they can tell the world where you live, and it can get dangerous. It's just so silly; if you introduce a little humility to the conversation, it can go such a different way.
Unfortunately, there are studies that back up exactly what you've said there: that people see changing your mind as being weak-willed.
So we actually sort of respect leaders more who doggedly go after a certain

(32:56):
path, even when they're struggling, than we respect leaders who are changing their minds and updating their beliefs.
It is almost insidious, because confirmation bias is something
that we're actually almost rewarding in the way that we view people.
And you see it with politicians all the time, people who hold fast, very

(33:17):
strong, not open to other opinions are actually seen as great, strong
political leaders, whereas those who are more open and adaptable and flexible in their beliefs, and who update them, are not seen as capable.
So it's unfortunately something that we need to change culturally, I think,
in the way that we view leadership and that humility to change your mind and

(33:40):
being open to, you know, updating your belief system.
And I'd add to that too, Tess: developing a comfort with uncertainty, because uncertainty is deeply uncomfortable for us as humans. We can speculate about the evolutionary origins of this, and I'm sure there were benefits to certainty and simplifying

(34:04):
our reality, and there's no doubt advantages to that. But the
fact is the world we live in is so complex, much more complex than our
ancient ancestors had to deal with, right?
There's so much more data, there's so much uncertainty. And I think there's different ways we can put our heads in the sand and just say, well, I don't want to know about that. And whether it's, you know,

(34:27):
again, I'll mention a few controversial things, but climate change is an example,
right?
That's a new reality. When I was growing up, I can remember, as a teenager, there was no concept of it.
Climate change wasn't something people were thinking about at all.
It wasn't in the news.
So it's a reasonably recent change in the reality.

(34:47):
It's a new data point on our map.
If we're thinking about that idea of being a scout, you know, so what do we do
with that?
Well, changes in reality force us to make changes in our lives.
So how do we respond to that?
And so there again, we become rigid.
We don't want to have to accept the fact that, you know, maybe something is
different and I have to then do something different in response to that.

(35:10):
If you're trying to protect yourself from confirmation bias, how can you frame a
topic that you're thinking about that you've made a decision about?
How can you ask yourself, okay, am I falling into confirmation bias right now?
Yeah, well, I'll lead off and Tess, you can come and jump in as well.
I think there's a few things.

(35:31):
So one is to think about where are you getting your data from?
Because decisions are informed by data.
So if your decision is purely informed by your own introspection, then I think
inevitably, you're going to be reinforcing what you think already.
So it's very hard to think more broadly if you're not reaching out and looking

(35:54):
for different sources of data.
Now, by sources of data, that could be friends, you know, it could be
colleagues, but it can't just be people around you who will tell you, yeah, that's great. We're all familiar with the concept of yes men or yes women: dictators surround themselves with yes men, people who will just reaffirm that, yeah, that's a fantastic idea.

(36:18):
What a great concept.
I've just finished watching a very interesting show.
I think it was on Netflix.
Am I allowed to mention streaming services?
Yeah, we're sponsored by Netflix.
I'm sure we're going to be on Netflix pretty soon.
Yeah, this is segueing perfectly into it.
So there's a show called The Regime, and it's a real tongue-in-cheek look at a

(36:40):
dictatorship in Europe. Everyone around this unpredictable leader is terrified of them, and so they're just kind of agreeing and reinforcing and praising all these crazy decisions.
And meanwhile, the whole country is just going to ruin over the course of the year.

(37:02):
So, yeah, I mean, think about the people we have around us. Often the best person to talk to is the person who is naturally a challenger, and
those are the people that can be frustrating, right?
There's people who are contrary.
That's an old, old word.
We don't use it that often. But contrarians, people who, whatever you say, will say, oh yeah, but what about this?

(37:24):
Those people can be very helpful.
We're talking to one of those people right now.
And my wife would love nothing more for me not to be that person.
And I'm telling you, as I'm doing it, I would love nothing more than to
not be that person either.
But I just, my brain and mouth just want me in constant trouble.
Well, I think you're offering a service. Maybe you should set up a business: contrarian for hire.

(37:46):
So I think that's one of them.
Think about your data sources.
I think another thing, and we might've already talked about it when we were chatting previously, is to think about the kind of questions you
ask people and the kind of questions you type into your search engines.
So if, for example, I'm trying to make a decision about some sort of new
health program, let's say a new exercise regime.

(38:10):
And I want to try this out.
So if I type into a search engine, is this exercise program good?
Then that's a leading question, right?
It might not throw up negatives. Or let's say you ask, what are the
strengths of this exercise program?
That's going to take you down a certain path.
So instead you can ask a more open question.

(38:32):
What are the strengths and weaknesses of this particular program?
So when you ask a leading question, you get a certain kind of answer to that question. Then there's asking open or closed questions, right?
So do you think this is a good idea or do you support me if I make this decision?
They're closed questions, and they're actually quite confronting

(38:55):
questions to be asked as a friend or an advisor.
So instead, if you can ask your questions in an open way, I think it gets your head into a more open space as well, for dealing with both data that
supports your decision and data that maybe challenges your decision.
So that's a couple of things that you can think about.
And to go one step further, I would say that, you know, when it's a really

(39:18):
important decision, we don't just have to, you know, kind of find the balance.
We have to almost try and prove ourselves wrong.
So if it's something that you really do care about, you need to go out of your way to ask, is there evidence that I could be wrong? Actually interrogate the belief that you've formed, really pressure test it, and try to find out: is there any data out there, or are there any

(39:40):
examples, that would actually show me that the belief I've come to is wrong?
Yeah, that's a great one.
And that's a question, isn't it, Tess, that we would ask analysts.
So I know it's something that I say to our students: write this on a piece of paper and stick it somewhere on your desk where you will see it every day.
And that is what would convince me that I am wrong.

(40:02):
And that's a very interesting question, right?
Because if your instinctive answer is, well, I'm not wrong, there's nothing
that would convince me that I'm wrong.
Confirmation bias in play there.
Absolutely.
It makes it very clear that you are wedded to that position, that you will not be moved. And obviously there is a problem.

(40:25):
So try to think to yourself, you know, what would be those things?
And then once you identify those things, you can start looking for them.
You can say, well, okay, I would be convinced that I was wrong if I saw this.
Well, go out and look for that.
You know, don't hope that you will not find it, but go out looking for it.
It's that stress testing, in the same way that, you know, if someone's

(40:47):
building a skyscraper or building a bridge that's going to hold, you know,
millions of tons of traffic, I would hope that that engineer has done
some very careful stress testing.
I don't want them saying, yeah, yeah, it's fine.
It's fine.
Personally, I wouldn't take it, but yeah, it's fine.
Yeah, that's right.
That's really great. Because, like we were talking about with confirmation bias, you can be

(41:09):
susceptible, under its influence, and not realize it. But a simple question of what would I need to change my mind, what information would need to be presented to me, can completely derail that. Instead of staying focused on this narrow bit of information, now naturally you have to broaden out to what's out there.

(41:30):
Like that's absolutely brilliant.
I think you guys did it. I'm going to throw myself in there too: we solved confirmation bias.
Excellent.
Tick.
If only Daniel Kahneman was still alive, he would be so relieved.
He knew it.
He said, Ken and Tessa, I leave this to you guys.

(41:50):
I know you can keep doing my research and see it through. And he was right.
But yeah, that's great. The reason we spent so much time on confirmation bias is definitely intentional on my behalf, because that was the first cognitive bias that I learned about. That's where I learned that this is a thing, that we are not objective.

(42:12):
And you guys touched upon this when you opened the episode, and it is a fantastic question to ask yourself: on a scale of one to 10, how objectively do you think you make your decisions? One, you're just extremely emotional; 10, and you used Spock and Sheldon Cooper from The Big Bang Theory as, like, the robots, you're just all information in, information out.

(42:33):
There is no influence.
And when I first listened to it, I was like, I'm a nine.
And, you know, because you guys weren't actually recording me, I could give an honest answer of how arrogant I think I am, or I could be truly as arrogant as I think.
I don't know what I'm trying to say, but it was an arrogant response.

(42:55):
And by the end of your guys' episode, I was like, no, I'm still a nine. And then I listened to some episodes a few more times, and I was like, oh crap, I think I'm actually closer to a one.
I don't think I've made an objective decision my whole life.

(43:15):
And it's just really fascinating.
Just doing a little bit more research on how many cognitive biases there are. But confirmation bias was the one that opened my eyes, like, oh wow, we really are impacted by things that we don't even know about.

(43:36):
We think our brain is helping us, and to an extent it obviously is, but there are certain maybe old processing ways still in there that could really use a little bit of reprogramming.
So yeah, just thank you so much for spending so much time on confirmation
bias and just your insight.
I really, really enjoyed it.

(43:56):
And you've got another one that just blew my brain open, because one of the things that I asked you guys to talk about is how we can avoid these, or how we can avoid these being used against us.
And that brings us to anchoring bias.

(44:17):
And I am very much looking forward to hearing what you guys have to say about anchoring bias. So if you could please just give us a little bit about what anchoring bias actually is. For sure. And before I unpack it, Brian, I just want to thank you for your humility there, because there will be some listeners out there who see themselves as
a nine, but I just want to make it really clear that none of us are a nine.

(44:40):
You know, you can't take the bias out of you. And that's what Daniel Kahneman was saying: even for him, whose job and profession this was, you can't take the biases out.
All you can do, again, is that metacognition process: when it's an important decision, you have to externalize your thinking and you have to really unpack it and be deliberate.

(45:00):
You know, intuitively we can't ditch biases, no matter how much you learn about them.
The only way to avoid them or reduce their impact is by slowing down.
And we'll talk about this a bit more, but I just wanted to thank you for your humility and your reflection, because I think it will help your listeners who are going, I'm a nine, I'm a 10. If I could have hidden behind that facade, I would have.

(45:23):
And the absolute arrogance that I had after I listened to it: it's like, you know what, I'm still there.
I am so objective.
Look at me.
And then you guys did such a good job of explaining some of it.
I was like, Oh, holy crap.
This is embarrassing.
Good job.
Good thing I didn't record it. But let's get into anchoring bias, because this is the one that got me so good.

(45:47):
Yeah.
So basically, it occurs when we rely too heavily on preexisting information, or the first piece of information, which is the anchor, when we make a decision. And we can't help it. It's really one of the most robust effects in psychology. It's outrageous. And it's robust even when the anchor is obtained by something super random,

(46:07):
like rolling a dice.
So they've done this.
We've talked about judges already in our chat.
They can get a judge to roll a dice, and the dice is loaded so it will fall on either a low number or a high number. And that will affect the sentencing that the judge gives. Just think how outrageous that is.

(46:27):
These are people who are professional decision makers and they have
been told to roll the dice, which they're told is something completely random.
And it's still affecting the length of sentences they're giving to people.
So I think it's one of those things where, if you think you're a nine or a 10 when it comes to this stuff, you're not; you will be anchored. The important thing is to realise that we are all anchored by everything

(46:52):
that we see and hear around us, and we have to avoid it somehow.
Another good one you can do with your friends and family: think of a really big river, say, or a road nearby where you live, and just ask them, is this river or road longer or shorter than a low number, or

(47:13):
longer or shorter than a really high number? And then get them to actually judge the exact length, and they will be affected by that first number you give them. Whereas if you had just asked them, how long is this, they would have a totally different answer, because they've been anchored by that number you gave them initially. And you don't even realise it.
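As a hedged sketch of the demonstration Tessa describes (the river, the anchor values, and the wording are all invented for illustration), the protocol is simply: randomly plant a low or high anchor with the comparison question, collect a free estimate, then compare the two groups' averages.

```python
# A sketch of the anchoring demonstration: plant a random low or high
# anchor, then ask for a free estimate. All numbers are hypothetical;
# this records responses rather than simulating any bias.
import random

LOW_ANCHOR_KM = 50      # hypothetical low anchor
HIGH_ANCHOR_KM = 5000   # hypothetical high anchor

def run_trial():
    anchor = random.choice([LOW_ANCHOR_KM, HIGH_ANCHOR_KM])
    # Step 1: the comparison question plants the anchor.
    input(f"Is the river longer or shorter than {anchor} km? ")
    # Step 2: the free estimate is what the anchor pulls around.
    estimate = float(input("How long do you think it actually is, in km? "))
    return anchor, estimate

if __name__ == "__main__":
    results = [run_trial() for _ in range(4)]
    for group in (LOW_ANCHOR_KM, HIGH_ANCHOR_KM):
        estimates = [e for a, e in results if a == group]
        if estimates:
            print(f"Anchor {group} km -> mean estimate "
                  f"{sum(estimates) / len(estimates):.0f} km")
```

Anchoring predicts the high-anchor group's estimates come out systematically higher than the low-anchor group's, even when everyone knows the anchor was arbitrary.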

(47:34):
That's what's insane. You don't even have to go out of your way. You can just provide one small bit of information, and then, like you said, you're anchored to that bit of information.
And, because surely we know about it now, so we can

(47:55):
overcome it, can you talk about, once you've been given that anchor, how much we can move from that anchor, even when we're made aware of it?
So like the dice example, you will be anchored by whatever that arbitrary number or piece of information is. And it doesn't have to be something so black and white; it could be that you read an article

(48:17):
about an issue, and then you form a judgment based on that first article. You're less likely to move far from that existing point, because that's kind of your grounding place.
The way that you do move from your anchor is by being really deliberate. And again, similar to what we talked about with confirmation bias, it's really seeking out those broad views.

(48:39):
But the thing that actually is the most helpful is to avoid
the anchor in the first place.
So a good example is buying a car. Salespeople use anchors against us all the time. They'll say, it's this many dollars, but it's 20% off at the moment. And you go, oh, what a bargain, 20% off. You don't actually stop and think, was that initial price a good price, or was

(49:01):
that already too high, beyond what it's worth? You have been anchored.
And that's why, you know, there's no time to think about that.
I'm getting 20% off on a car.
Exactly.
Yeah.
We're having a fire sale, 20% off, but the price is inflated, probably 60% more than the car was worth.
You don't actually stop and make a judgment on that anchor. But that's how you avoid anchoring: really dissecting the anchor that you've been

(49:25):
given. So if you want to go out and buy a car, don't just go out and look at all the prices as your first thing, or go to your first dealership. Do your research first. So you go to, in the US I think it's the Blue Book, and those kinds of websites, where you ask, okay, cars of this make, model and mileage are

(49:45):
generally worth about how much? And then you're going into that dealership forearmed with what a reasonable price is.
And that way, if you see a car that's on sale but still $4,000 higher than what you've been told by the internet is a reasonable price for that make, model and mileage, you're actually much better informed not to make

(50:07):
an emotional response and think, Oh, what a bargain.
You can say, no, that was not good value at its original price, at its anchor price, and it's not a good-value purchase even with its discount. So that's one way you can do it, by doing that prior research.
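To make the arithmetic of that advice concrete (every number here is invented; the Blue-Book-style reference value is the one input you'd research yourself), a quick check compares the "sale" price against the independent reference rather than against the dealer's sticker anchor:

```python
# A toy worked example of judging a "discount" against an independent
# reference value instead of the dealer's sticker anchor.
# All numbers are invented for illustration.

sticker_price = 30_000                        # the dealer's anchor
discount = 0.20                               # "20% off!"
sale_price = sticker_price * (1 - discount)   # 24,000

reference_value = 21_500   # researched value for this make/model/mileage

print(f"Sale price:      ${sale_price:,.0f}")
print(f"Reference value: ${reference_value:,.0f}")

if sale_price > reference_value:
    # The discount was measured from the anchor, not from what the
    # car is worth, so this "deal" is still above fair value.
    print(f"Still ${sale_price - reference_value:,.0f} above reference; not a bargain.")
else:
    print("At or below the reference value; the discount may be real.")
```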
I was just going to jump in with perhaps a much more serious example

(50:27):
of where anchoring bias impacts us.
And let's think about people who get stuck in abusive relationships. How often do we hear that those people have grown up in
abusive relationships themselves?
And, you know, the tragedy of this is that we're seeing an anchoring bias

(50:48):
at play in that situation, because when they think about what normal is, what
does a normal relationship look like?
They're drawing on a very destructive anchor that's painted a picture for them of, well, this is kind of baseline normal.
And then what you hear is often this rationalization that will say, well,

(51:13):
yeah, sure, this person has been verbally abusive or physically abusive, but they're
not as bad as someone else. They'll compare it to something else, but they're comparing to a baseline that really is this horrendous anchor.
So I think, again, just being conscious that anchors pull us in

(51:35):
many different areas of life. Sometimes it's something trivial, like bargaining for a carpet when we're traveling to Egypt for a holiday. But even when it comes to the standards that we subconsciously set for
ourselves, when we're thinking about a prospective partner or a suitable life
partner, we're anchored by experiences and things that we've seen previously.

(51:58):
You don't even know that you're being anchored. Again, this is the one where I was just like, I cannot believe this. And you can see how confirmation and anchoring can almost go hand in hand: you get that anchor of that first bit of information, and everything after is just searching to confirm it.
So one of the things that made me realize I'm not a nine on how

(52:22):
objectively I make my decisions was actually the anchoring bias.
So when I was out looking for software to do a podcast and record digitally, the first software that I found was Riverside, the program that we're using right now. And so it was just the first piece of information, and everything was being measured against that.

(52:43):
And I didn't realize it until I listened to your guys' podcast. I was like, oh my gosh, I'm being anchored, and to the benefit of Riverside; they don't even know that it happened. I guess that's the importance of marketing. Yeah, we're still using it. So that's how deep it's in there.

(53:03):
It doesn't mean it was a bad decision, Brian, but as you said, it affected
all of your future decisions, didn't it?
Everything was literally weighed against the first information that I got on
Riverside. I was like, oh, well, Riverside can do this, Riverside can do this.
And again, when you stop and think, that's the internal processing that's happening,
but I'm not stopping and thinking about how am I making my decisions?

(53:26):
I'm just making my decisions, not impulsively, but just like, oh, Riverside can do this, so you need to do better, or can you even do better?
Yeah, we can also think of friends who've come and joined us, you know, whether
they've come to a new school or they've come from another country or they've come
from another corporation to come to work with us and have been that person who

(53:50):
always keeps comparing: oh, that's not how we do it there; oh, it was much better there. And it's again an example of being anchored to what they had previously experienced as being, well, the right way to do it.
You know, our website is called Good Better Right. But the...
Oh, right now? Okay. I thought we were doing something, but yeah, I'll go check it out.

(54:13):
But the reason that we called it that was that I think there can be a strong sense
that, you know, our way of doing things is the right way.
We do something for long enough, whether it's our loading of the dishwasher, or the route that we take when we're driving to work, or the software that we use when we're recording a podcast, and we quickly go from

(54:37):
thinking, well, this is a good way to do things, to thinking, well, this is actually a better way to do things, better than all the others.
And then it becomes almost like a moral judgment.
This is the right way.
And this, I think, is what got me hooked on the whole idea of decision making as a concept for a podcast: I think it's very easy for us
to then become very judgmental of others because we think, well, our way is the

(55:00):
right way. And we see it in, you know, relationships, whether it's, you know,
I've been married now for nearly 26 years.
I'm still stubborn about some things that, you know, I thought were the right way to
do things when I first got married, you know, because we get very set in our
ways. And I suppose, again, we can speculate that this is just the way our

(55:22):
brain evolved. We work out a good way to do things.
And I think a lot of scientists suggest that it's because of the fact that our
brain uses so much energy.
Right. So thinking and forming judgments is a very energy depleting process.
And for us, okay, we're all well nourished.
But our ancestors, who were probably depleted because they were

(55:46):
hunter-gatherers and didn't have huge amounts of calories to feed their big brains,
would conserve energy:
make a quick judgment, and then we can relax.
So I think just being aware of that: we want to make a judgment and stick
to it. We don't want to use the energy that's required to rethink our judgments.
So being conscious of that tendency perhaps can help us a little bit as well

(56:11):
to avoid these these biases.
Yeah, just being like, why am I making these decisions?
But there are other areas that you mentioned with anchoring bias,
and one of them is how you think you're getting a sale, but you're really not.
Amazon actually got in trouble for pretending that items

(56:34):
were more expensive than they really were, and then marking them down for a sale.
In reality, they had marked them up.
So you were paying more, thinking you were getting a sale.
And that's why I wanted to have this conversation.
Because there are bad actors out there.
Not everyone's a bad actor, but the brain has been figured out to a certain extent

(56:55):
by the people who are trying to figure it out.
And a lot of the time, the motivator to figure it out is money.
So how do we protect ourselves?
Just knowing, okay, I think I'm getting a deal, but am I really?
I see this. I'm getting excited.
I'm getting a hit of dopamine in my brain seeing that.

(57:16):
Oh, this is my lucky day.
I just want to get this item, and it's on sale.
Like, what a coincidence.
It would be foolish to not purchase it in this moment.
But in reality, it's a clever marketing ploy.
And this has big societal implications about just how we make our decisions,
because in your guys' anchoring episode, you talk about organ donation.

(57:41):
And I would love for you guys to touch upon that again.
Yeah, that's a really interesting one.
And I'm trying to think, I think it's Richard Thaler who wrote the book Nudge,
who's another Nobel Prize winner.
There's some very clever people who've looked into these biases.
Nudge is a fascinating book because it's really exploring these biases,

(58:04):
but then suggesting or opening up a conversation to say,
are we open to the fact that we could possibly be manipulating people for a good end?
Like, does the end justify the means of manipulation that we can use?
And an example is, as you alluded to, organ donation.

(58:24):
And I don't have the stats at my fingertips, so I'm making a guess.
It's pretty close.
But he compares Austria and Germany.
So two neighbouring countries, similar cultures,
they have a different system for determining
whether someone will become an organ donor after death.
And I believe in Austria that you are by default an organ donor

(58:49):
unless you make the effort to opt out.
Whereas in Germany, you are by default not an organ donor,
which I think is the same in Australia, perhaps in the US,
some states, I'm not sure, but certainly in Australia.
You have to opt in.
So you have to make the effort to choose to be an organ donor.
Now, when you look at organ donation, many countries in the world,

(59:10):
particularly Western countries, are struggling because of a shortage
of organ donors.
Many people are dying waiting for an organ to be donated.
It would seem like a good thing if we could increase organ donation,
depending on your religious beliefs or other views.
In Austria, where organ donation is the default,

(59:33):
they say organ donation is something like 98%.
So, in the high 90s, the vast majority of the population are organ donors.
In Germany, I think it's something like 18%.
So it's a crazy statistic, just because of the fact
that people are lazy, or whatever it is.
Maybe it's a combination of laziness, but also a sense of social proof.

(59:56):
It's normal to be an organ donor.
That's the default.
So we kind of conform with that herd instinct
to what everyone else is doing.
And who wants to be that person who stands out and says,
you're not having my organ?
So people would rather conform.
And so I think it's a really interesting conversation

(01:00:17):
that Thaler opens up.
What do we think about that socially?
Do we feel like that's unfair that we're being manipulated?
Or do we think, well, maybe that's not a bad thing socially.
It's perhaps for the greater good that we can exploit some of these tendencies.
There's another US version of this too, with retirement savings.
So, Brian, in the US it's very different to Australia.

(01:00:38):
Retirement savings aren't mandatory.
Whereas in Australia, companies have to pay a certain percentage
of your wage into your retirement.
But in the US, some companies are using...
Yeah, I mean, it's great because we get no control over it.
So you have all the savings at the end.
But in the US, obviously, it's very voluntary.

(01:00:59):
But some companies are experimenting in the same way as with organ donation,
with the opt-out.
So they're saying, okay, we're going to put a certain percentage of your wage
into your retirement account.
If you don't want this, you have to opt out and choose not to do it.
And they're having the same success rates where people are saving
so much more for their retirement.
Because again, they're anchored by that initial decision of their company.

(01:01:21):
And they're just choosing not to move very far.
Certain percentage will say, no, I don't want that.
I want that money in my pocket now.
But the vast majority won't move from that initial point.
It's helping people without them even realizing it.
I feel like... we are being manipulated, but I don't know,

(01:01:43):
that's benevolent manipulation.
Whether it's for organ donating or literally helping people save for their future,
all you're doing is tweaking a default that they've
actually never thought about.
I've never had to think about being an organ donor.
I am an organ donor.
But just speaking in general, you never had to think about it

(01:02:05):
because you were given this decision as a little box,
and if you don't select the box, the default stands.
And the same for retirement.
I imagine, for the country, it seems a smart thing to do.
There's a burden that's lifted because the country

(01:02:26):
intervened in a way.
And that seems like a win-win to me.
But yeah, there are still both sides.
That's why I wanted to talk about this: even though we are sitting
and spending a lot of time with two specific biases, they're not bad.
Confirmation bias isn't necessarily bad.
It can help give you resolve.
So when there are bad actors who do want to change your mind,

(01:02:50):
at least you have a little bit of safety built into it.
And maybe it's flawed because you're only looking for your own confirming information,
but it takes a little bit more information for you to change your mind.
So there's safety in that way.
It's good and bad.
With anchoring, it's good and bad.
You can use it for bad, but you can also use it for good.

(01:03:12):
And that's just what's really interesting about it.
It's just, OK, we make these decisions.
These are how we make them sometimes.
Well, how can we, on average, make better decisions
knowing these things about us as humans?
Yeah, I think that's a great point, Brian.
It is what it is, right?

(01:03:34):
We have this hardware that we're stuck with.
No doubt it evolved.
Yeah, exactly.
And no doubt it played a role in our survival.
I would have to think, you know,
I believe that our brains have evolved to serve a certain purpose.
So two others that your listeners might be interested in too,
that are Australian specials,

(01:03:55):
when it comes to government intervention.
So one is in relation to cigarettes.
So some years ago, the Australian government decided that
the way to reduce both the number of people
succumbing to lung cancer
and the burden on the health system
was to introduce legislation around cigarette packaging.
So companies couldn't put any branding on the packaging.

(01:04:17):
They had to come in neutral packets.
So on every cigarette packet,
there's nothing that indicates the company.
And the same goes for the external packaging
that goes around it.
So when you go into the store,
you can't see the flashy sort of colors
or all the things that we know manipulate us.
The other thing, too, is that on the back of cigarette packets,

(01:04:39):
they are obliged to put pictures, and they're gruesome pictures.
I've seen these, they are absolutely gruesome.
They're quite horrifying.
Yeah. Now, I don't know if my sister is listening.
I've got a sister who is a smoker,
and I know she's trying to give up. I don't smoke myself,
but when I go to her place, I see these packets and I'm thinking,

(01:04:59):
how can you look at these pictures of, you know, these
emaciated patients, or pictures of organs
that have been destroyed by lung cancer, by smoking?
So anyway, that's an interesting one.
The other one is gambling.
And I know it's something of high interest,

(01:05:21):
that we have a lot of gambling sponsorship
for sports shows on streaming networks.
But after every one of those ads,
they're now obliged to have a little disclaimer,
which is the last thing you hear.
And it's a little slogan that is repeated.
There's two or three that they're cycling through.
And one is, I should get this right:

(01:05:44):
you lose more than you win.
And then there's another one that says: gambling, what are you really playing with?
There's a couple of different ones.
So again, it's little messaging that's designed,
a government decision to say, well, let's use some of these
tricks of the trade that advertisers use to manipulate us

(01:06:05):
to try and manipulate people toward what we think is a positive outcome.
Now, people may disagree and say, well, we don't think that's a good outcome.
We don't like the fact the government's manipulating us in that way.
But that's something else,
another example of where we're seeing it.
But at least you'd have the information.
And what's really interesting, as you're talking about your sister,
and just everyone in general, to keep it human:

(01:06:28):
when we do act in ways that are more risk-heavy,
where we know that a bad outcome is possible,
there's something called the optimism bias,
where you think that things will be just all right for you.
Things are going to turn out a-okay.

(01:06:50):
There is a cognitive bias playing a role in our decision-making,
and we don't even know it.
It's almost like a protection.
Like, no, no matter what you do, you're going to be safe.
Yeah, optimism bias is one of those.
Yeah. I mean, you see it affecting all sorts of people, you know,
and it does affect your decision-making
because you think the bad things aren't going to happen to you.

(01:07:11):
You think you're not the one who's going to be in a car accident,
or that your smoking will lead to your early death.
You think you're going to be the exception.
And so it actually can help us make bad decisions
because we do think we're exceptional.
But even on mundane things too,
like if you're planning a kitchen renovation

(01:07:31):
or your rollout of your podcast, Brian,
and how much time and effort it's going to take,
you're going to underestimate.
You're going to be optimistic that you're going to be more efficient,
that you'll be above average,
and that things will be easier because we do.
We are optimistic and it helps us take risks.
So it's useful in some ways, but at the same time,

(01:07:53):
it's really bad for our planning
and that realistic side of things as well.
Yeah, I think there's a great quote from A Prairie Home Companion,
for those who do like listening to that,
where Garrison Keillor has a real tongue-in-cheek statement
about Lake Wobegon, where he says,
I think, that all the women are strong
and all the men are good-looking,
or something humorous like that.

(01:08:15):
And then he says, and where all the children are above average.
And I just love that because it's a dig at our optimism bias.
You know, we all think we're exceptional.
We all think we're above average.
And we know, obviously, anyone who has taken basic statistics
knows that we can't all be above average, by definition.
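[Editor's note: a minimal sketch in Python of that "by definition" point, with hypothetical numbers chosen just for illustration. If every value were strictly greater than the mean, summing those inequalities would give a total greater than itself, a contradiction, so at least one value always sits at or below the average.]

    # "We can't all be above average": check with any list of numbers.
    values = [70, 85, 90, 95, 110, 130]         # hypothetical scores
    mean = sum(values) / len(values)            # about 96.67 here
    above = [v for v in values if v > mean]
    # 'above' can never contain every element; at least one value must
    # sit at or below the mean for the mean to be the mean.
    print(f"{len(above)} of {len(values)} values are above the average of {mean:.2f}")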

(01:08:35):
And that's why it's not good or bad.
But if you look at the statistics,
at how often businesses fail,
and you just happen to be opening the same type of business
in the building where the previous business failed,
well, if you don't have some of that optimism bias,

(01:08:57):
you're just going to walk away.
And maybe that's for the best.
But maybe the best thing that ever happens to you
is that you opened up that business
and it's just massively successful.
But yeah, you don't try without that optimism.
We don't try to host a podcast
if we think we're going to be the only ones listening to it.
If we were literally just talking to ourselves,
we could just write a diary

(01:09:18):
and that would have the same effect.
There is some sort of optimism
that this will at least reach and impact one other person,
or however we set our sights, whatever our goal lines are.
But yeah, without that...
Yeah, and I'd just say you're spot on.
I think it's a balance, isn't it?
Optimism helps us to generate momentum and overcome inertia.

(01:09:42):
But the interesting thing too is that when we read these stories,
these success stories of people who've been optimistic,
and even though everyone around them said,
you're never going to make it, they believed and they tried;
it's a mythos that I think is really important
in your culture and in our culture.
We love those stories,
the success-against-all-the-odds stories.

(01:10:03):
The problem with that is that we're only hearing
about the success stories.
That's not a representation of the broader reality.
We don't hear about the 999,999 who tried,
even though wise people were saying,
that's a stupid idea.
Don't spend all your money on that.
Don't invest all your time in that.

(01:10:24):
Do something practical.
We don't hear those stories
because they're not inspiring stories.
So somehow I think there's a middle ground, right?
That we, yes, let's be inspired by stories of greatness
and people overcoming the odds,
but don't dismiss the odds as though they're not important
to help inform our decision-making
because I think there is still a place for that,

(01:10:45):
and I'm sure there are many, many untold stories
that are never going to make it into Hollywood,
of people who look back and think,
wow, I wish I'd listened to the person who said to me,
that's a bad idea, do something different.
But my confirmation bias won't let me hear anything
that I don't want to hear, so it's okay.
But that statistic goes great with base rate neglect.

(01:11:10):
These two kind of go hand in hand, the optimism bias
and base rate neglect.
So can you tell us a little bit more:
what is base rate neglect?
Yeah, are you happy for me to lead on that one, Tess, if I talk a little bit?
Yeah, go for it.
Yeah, so base rate neglect is fascinating
and this is one that surprised me,
but studies show that our brains are typically not good

(01:11:31):
at managing statistics.
And again, we tend to rely more on stereotypes
and simplification.
So let's use an example.
So Tom loves music and the arts.
So imagine Tom, I want you to picture Tom.
He likes to go to concerts.
He wears suits when he's going out for dinner.

(01:11:53):
Is Tom more likely to be a plumber
or a trumpet player with a major symphony orchestra?
Now, this is a really interesting question, right?
Because different things pop into our minds.
I've painted a picture for you of Tom.
I won't get you to answer this question
because you're a nine out of 10
and you won't be manipulated easily.

(01:12:15):
Not this guy, never.
But what we don't tend to think about is the statistics.
So here's some stats.
There are 481,000 plumbers in the United States.
At a generous estimate, there are perhaps 5,000 trumpet players

(01:12:36):
in major symphony orchestras.
So for every orchestral trumpet player, there are nearly 100 plumbers.
So statistically, even though we've heard a lot of things
about Tom that might make us think,
well, that doesn't sound like any plumber that I've met
wearing suits, listening to classical music,

(01:12:56):
loving, you know, the arts.
Yet statistically, it's much more likely
that this person is going to be a plumber.
Now, I think what we struggle with is that,
even when we hear the statistics,
we tend to still go, well, yeah, but that doesn't sound like a plumber.
It was a really nice suit.

(01:13:16):
So I'm pretty sure.
I'm pretty sure.
Pretty sure, pretty sure.
So, you know, I think what we tend to do with base rate neglect
is we evaluate likelihood with no reference
to the actual probability of an event occurring.
So, you know, if I describe that person to you
and I asked you what kind of job they're more likely to do,

(01:13:38):
you focus on the description.
The description becomes a distraction.
Rather than if I said to you, you know,
what's the likelihood that a person, no description,
what's the likelihood that a person is a plumber
compared to the likelihood that they play a trumpet
for a major symphony orchestra?
I think most of us at that point start to think statistically

(01:13:59):
because the language pushes us in a different direction, doesn't it?
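[Editor's note: a minimal back-of-the-envelope sketch of the arithmetic behind the Tom example, in Python. The plumber and trumpet-player counts are the figures quoted above; the two "fit" percentages are hypothetical, deliberately stacked 20-to-1 in the trumpet player's favor, to show that the base rates still dominate.]

    # Base-rate sketch for the Tom example (counts quoted in the episode,
    # fit rates are assumptions for illustration only).
    n_plumbers = 481_000          # plumbers in the US
    n_trumpeters = 5_000          # orchestral trumpet players (generous estimate)

    fit_plumber = 0.01            # assumed: 1% of plumbers match Tom's description
    fit_trumpeter = 0.20          # assumed: 20% of trumpet players match it

    # Expected number of people matching the description in each group
    matching_plumbers = n_plumbers * fit_plumber        # 4,810 people
    matching_trumpeters = n_trumpeters * fit_trumpeter  # 1,000 people

    total = matching_plumbers + matching_trumpeters
    print(f"P(plumber | description):   {matching_plumbers / total:.0%}")    # ~83%
    print(f"P(trumpeter | description): {matching_trumpeters / total:.0%}")  # ~17%

[Even with the description assumed to fit trumpet players twenty times as often, the sheer number of plumbers means Tom is still roughly five times more likely to be one; ignoring that denominator is exactly what base rate neglect looks like.]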
Yeah, but as you said it, I would 100% just say,
okay, well, this information really isn't even pertinent,
it hasn't given me anything valuable,
and yet that's what I'm basing my decision on.
He dresses nice.
So, a plumber, you know, stereotypically,

(01:14:22):
they have plumber crack.
You're not showing plumber crack if you're in a suit.
It just doesn't go hand in hand.
So, there you go.
He's 100% not a plumber if he's in a suit.
And so, this is something else that was really interesting.
You guys touch a lot upon type one and type two thinking.
And I feel like that can really help us.

(01:14:43):
I don't know if one is necessarily better than the other.
But from my exposure to you guys' podcast,
and the little bit that I know about cognitive biases,
it seems like cognitive biases live a lot more in type one thinking.
And so, I think that would be a good area
to also spend a little bit of time on.
So, what is type one?

(01:15:04):
What is type two thinking?
And how can knowing the difference between the two
really help us make better decisions, and avoid the utter embarrassment
of finding out this guy was a plumber
and I just got it all wrong?
This comes back to Kahneman.
And he's actually got a book called Thinking, Fast and Slow,

(01:15:24):
which really unpacks this well.
And we would definitely recommend your listeners read this book.
But as you said, there's no inherently good or bad nature
to either types of thinking.
They're both useful in different circumstances.
So, thinking fast is really that intuitive brain.
It's a part of your brain that goes,
how many plumbers have I seen wearing suits on the weekend?

(01:15:47):
That's your fast thinking brain.
You're not even aware of the thought process
that got you to that judgment.
It happens automatically.
It's drawing on your confirmation bias, on availability bias,
some images you've seen before,
and it's all happening in the background.
Whereas the slow thinking is really deliberate.
So, it's the ability to stop and go,

(01:16:08):
okay, I think it's more likely to be a plumber,
but actually how many plumbers are there in America
versus how many trumpet players?
And just by stopping and getting into that slow thinking brain,
you can really start to unpack the judgment that you've made.
And even if you don't have those statistics to mind,
you can realize by thinking slowly

(01:16:30):
that there are a lot more plumbers out there
than there are trumpet players.
So, that's when you can actually make that more informed decision,
that more informed judgment
by getting out of that intuitive fast thinking
and slowing down and getting into that metacognition
which we've talked about,
which is really unpacking your thinking
and realizing the shortcuts that your brain has made.

(01:16:52):
And again, the shortcuts, the fast thinking is useful
because you don't want to actually have to stop and think,
do I brush my teeth with my left or my right hand?
Because that would be exhausting
if you're having to unpack all those decisions.
You need that fast thinking.
But there are times when you need to stop and go,
actually, this is a slow thinking time.
I need to really get myself into that state of mind.

(01:17:14):
Ken, I'm sure you've got some things to add too.
Yeah, we could talk about this a lot
because it's something that we both think about regularly.
But one thing that jumped out at me:
we have an expression in the English language, jumping at shadows.
And it's a pejorative expression.
You're overreacting.
You're reacting to a nothing.

(01:17:34):
So, we use it when we're criticizing people.
But in fact, if you pause and think,
where does this expression come from?
Why do we jump at shadows?
It's a demonstration of system one thinking in action.
Our brains evolved.
Our ancestors were out in the forest
and they saw a big dark shadow in the woods.

(01:17:58):
You don't stop at that point and think,
okay, now, let me just analyze this.
Is this shadow caused by a cloud coming over?
Or maybe it's something else.
It might be a tree branch.
It's quick reactions that saved our ancestors.
So, the ability to act first and think later

(01:18:20):
is actually an advantage in some situations.
And so, system one thinking, I think you said this, Brian.
It's not good or bad.
It's just about the context.
Sometimes we need to react quickly before thinking.
And Gary Klein has written some really interesting stuff on this.
He's a researcher that's looked into the role of intuition with decision making.

(01:18:42):
And he talks about how expert firefighters or emergency care nurses
will make these snap decisions when a life is at stake.
And they won't know why they've made that decision,
but they just know somehow that that's what they need to do to save lives.
And interestingly, when Gary Klein interviewed these people subsequently

(01:19:05):
to try and understand what was going on,
some of them were really convinced that they had a sixth sense
that it was almost something supernatural
that was helping them to know what to do.
Whereas, in fact, Klein's conclusion was
these people just were drawing on a huge amount of data.
So, the data that had accumulated over years and decades

(01:19:28):
was informing that quick intuitive subconscious type one or system one thinking
to help them make an accurate judgment and decision.
Whereas, for most of us in the same situation,
we don't have all those data points to draw from.
So, we're making uninformed decisions that aren't necessarily accurate.

(01:19:49):
So, again, there's a time and a place for that really instinctive intuitive quick decision.
And we don't always have time for anything else.
But I think the danger is with system one,
and you can think of this in many situations,
those people you meet who always seem to have a hunch.

(01:20:10):
I just have a hunch that this is the right thing to do.
And you're thinking to yourself, why would that person's hunch be accurate?
They have no extra knowledge to draw from than I do.
And I don't personally believe in people having a sixth sense that's accurate.
So, maybe instead of relying on those hunches,

(01:20:34):
which could be driven by something as simple as having just had a nice meal,
or by those subconscious things:
I met this person, and something about them
reminds me subconsciously of my first grade teacher
who was super nice and friendly,
so I just have a hunch that this person is a good, trustworthy person.
We're not even aware of what's informing those hunches.

(01:20:54):
So, I think we need to just develop that strong tendency to say,
if there is no urgency here,
then let's take time to gather those pieces of data
and make a more informed decision.
Because there's a lot of bad judgments that are made on hunches.

(01:21:16):
Like you said, throughout human history, or just,
not even human history, just being alive as a human,
we do not always have the luxury to sit and think,
and then think about thinking and why we're making the decision.
Sometimes you just have to act.
And being a little bit mindful about time here,

(01:21:36):
There's obviously still so much and so many interesting things to talk about.
But I wanted to get into the last one that we'll have time to talk about, which is groupthink.
And this one is just fascinating, because clearly we're all self-informed individuals
and can't be swayed by nonsense being hurled at us

(01:22:00):
by a handful of people, let's say three people.
Groupthink is such a great one.
It's one that we've all heard of.
I think it's something that actually is in that common vernacular.
But there's a little bit more depth to it.
So, there's two parts.
Informational influence, where you are a non-expert
and you just trust that the group knows better,

(01:22:20):
so you go along with what they're saying.
So, there's almost that sense of being convinced.
Whereas there's a normative influence where you do know better than the group.
You're sure that the group is taking a wrong decision,
but you decide to conform rather than rock the boat.
And I think this is a part of groupthink that really frightens me
because it just shows our human tendency that even when we know

(01:22:45):
that someone's making a terrible decision,
sometimes we choose not to speak up because it's uncomfortable.
Or it's our boss who's making the terrible decision.
We're not brave enough to put our hand up and say,
actually, sir or ma'am, you're wrong.
Yeah, that's right.
I think there's something else interesting if we think about it from an evolutionary perspective.

(01:23:07):
There's obviously a lot of survival benefit for groupthink and conformity.
The whole concept of conforming,
when you are dependent on others for your survival,
we understand that it's often prudent to agree
rather than to be ostracized and removed from the safety of the group.

(01:23:28):
So, I think that it makes a lot of sense.
But again, there's huge risk with it, isn't there,
because it requires that ability to be the lone voice of reason
that speaks out courageously.
And one of the values that's often espoused in the Australian public service

(01:23:48):
is speaking truth to power or having the courage
to confront and speak out and say what we need to say.
You know, I think it's really hard.
And I know, Tess, you shared a study that you'd seen
that demonstrates that, yes, the first person has a real role to play

(01:24:13):
in a group situation to speak out with a voice that proposes a different view.
But it's actually the second person, isn't it, that when they speak out,
that often then has the impact on the rest of the group
because you end up with a situation of what's called social proof.
So, you end up with a second group, right?

(01:24:35):
It's not just an individual.
There's a second group, there's critical mass there,
so that others now have the option of saying,
well, I agree with this group or with that group.
So, I think being conscious of group dynamics is really important
that we can have an impact on other people.
Sometimes we withhold our views for a whole range of reasons.

(01:24:56):
We don't want to look stupid.
We don't want to be embarrassed.
But being aware that sometimes having the courage to speak out,
it actually empowers other people to think differently
and to engage system two, that slower thinking, and say, well, hang on a second.
Brian has now offered a different point of view.
And I know that that must have taken some courage for Brian to do that

(01:25:19):
because we're in a group of people and I know that it's hard to speak out.
So, now I'm liberated to change my view.
I'm empowered to think differently.
So, I think being aware of those dynamics is important.
You can also do really deliberate interventions as well
by getting people to speak up or write down their opinions anonymously.

(01:25:41):
Or if you are the boss or the person with, I guess, the most influence in the room,
you can deliberately choose to speak last.
So, you hear everyone's raw opinions before anchoring them
or having the group bias kick in by speaking too early,
in which case they're all giving you their adulterated views,

(01:26:01):
which are possibly just complying with yours if you're the most senior person in the room.
That's a really good point that Tess makes.
And in terms of understanding those group dynamics,
there are several things that we would advocate to the analysts we're training,
that they can do to help manage these kinds of group dynamics.
And another is to if you have a group of people that are sitting down and discussing an issue,

(01:26:26):
you can decide ahead of time to make one of those people take on a role of devil's advocate.
And what that will do is that that person then sets aside their own position
and they will deliberately present an alternate viewpoint to the group.
And it's acknowledged that by doing that,

(01:26:47):
you can free up people's thinking to consider things differently.
Now, I know that that sounds a very formal way to approach things,
but I honestly believe that you can apply some of these techniques in a personal setting.
So, if you want to work through an issue collectively as a group,
if you've got a group of people, friends who have to make a decision,

(01:27:08):
and it's a really difficult one,
and it could be a decision about an elderly relative:
how do we best care for this person?
Do they need to go into a high-care situation?
There's a lot of emotions involved.
Maybe you can bring an outsider in who can bring a different perspective
and can just shed light on things in a non-emotional way.

(01:27:28):
So, yeah, I think there's things we can do to address the challenge of groupthink
because it is very powerful,
particularly when there is a power imbalance as Tess spoke about.
When there's a power imbalance in the group,
a leader will sway other people for a range of reasons.
So, maybe the leader can choose to withhold their view until later in the conversation.

(01:27:52):
Yeah, if you actually want to hear what everyone has to say, to get the best idea
forward and avoid this bias, it just seems like, yeah, let's go ahead and do that.
Because as humans, we'll tend to defer to someone that we feel is in a position of authority
or knows the most, like you're saying.

(01:28:13):
And the worst form of groupthink is where you know better,
but everyone else is saying otherwise, and you're just like,
well, I'm not going to rock the boat, because there's still that small percentage chance that
I am wrong.
And then if I am wrong, well, I'm just going to be absolutely humiliated,
because everyone else was already saying these things,

(01:28:35):
and now I spoke up and I gave a contrary position.
And now, well, I've learned my lesson; I'm just going to go along with the group forever.
As far as leadership goes, being aware of that, and helping facilitate,
and letting everyone come forward with their own opinion.
Yeah, I just can't imagine not paying attention to that.

(01:28:58):
I was just saying, it takes real bravery, Brian.
It's really hard.
And I think it's important to note that, that it is a really hard thing to do.
And there's some classic examples like the Bay of Pigs invasion, where
so many people in that room knew better and they gave their president bad advice
or went along with bad advice because nobody was brave.

(01:29:19):
Nobody was willing to risk their reputation, risk looking stupid in front of others.
So instead they let their president make a terrible, terrible decision.
That could have been catastrophic, globally catastrophic.
And it's just because we're so delicate that we need to preserve egos in the room.

(01:29:40):
Like, what a silly way to think.
But that is something that we need to be cognizant about.
Yeah, and I think that the question we always need to be thinking about,
particularly when we have influence either because of position
or because of just respect that other people have of us,
thinking about what sort of culture am I creating

(01:30:01):
in the environment in which I'm living and working?
Is it a culture that welcomes contrary views,
that welcomes different ways of thinking, that welcomes challenge?
Because if we don't create that culture, then most of these biases will kick in
and impact the kind of judgments and decisions that we're forming.

(01:30:22):
I just wanted to throw in, too, something that I really liked when I heard it,
about understanding that different personalities contribute differently in a group.
So someone said once that if you don't know what an extrovert thinks,
then you're not listening.
And if you don't know what an introvert thinks, you haven't asked them.
And I really do think about that a bit because, you know,

(01:30:44):
I lead a team of people who are quite different personalities.
And I think, well, have I asked the quieter people in the group?
Have I actually opened up the opportunity for them to share by inviting them to contribute?
And with the extroverts, have I been listening, or have I just tuned out
because they, perhaps like myself, have talked for too long and everyone's

(01:31:07):
eyes are now glazing over?
So how do I ensure that I'm hearing all the voices in the room?
Yeah, those are just some great leadership practices,
knowing that about human decision making.
And yeah, it's just not easy to make good decisions, and to make sure that good decisions
have space to really be thought through all the way.
Honestly, I could keep talking to you guys forever,

(01:31:28):
but I don't want to take any more time than what we agreed on.
So this is going to be the last question for you guys,
as long as that's okay,
and you guys aren't just like, no, we need this in our lives.
I'm sure that's not the case.
So the last question I want to ask, and I'm going to put this on you first, Ken,
because I made Tessa open up with the conversation.

(01:31:51):
But, you know, we could sit and think about, okay, well, what is the meaning of life?
And, you know, we could all put out our best thought and it could be really profound.
And what I'm curious about is, what do you think is the purpose of your life?
Oh, wow. This is an easy one to finish off with.
Yeah, great question.
So if I can give probably a bit of background,

(01:32:12):
my views on some of these things have changed over decades
as a result of different things that I've experienced.
And maybe I'd like to think that reflects something of the philosophy
that we're talking about here: trying to stay open and to deal with the reality
and complexity of the world that we see.
But I think, personally, for me, the purpose of our life is very much tied

(01:32:38):
to thinking about how we create a good place, so much as we have power to do so.
And certainly, for me, the value of caring for people and respecting people is really important,
and protecting people where we have the power to do so.

(01:32:59):
And the environment, the world in which we live.
And I suppose that sounds very grandiose,
but that's the overarching view that I have.
And so, whether it's in my team at work or in my family at home,
which is certainly the most important thing for me, with my three kids and my wife,

(01:33:21):
it's to care for, respect, protect,
and I think build up the people around us as much as we can too,
acknowledging that I can only have a very limited impact
on the world around me to make it a better place.
So doing whatever I can to inspire and empower others to do the same,
I think is the opportunity I have to leave this world perhaps,

(01:33:46):
you know, slightly better or slightly improved,
at least in the little arena that I have influence in.
So I don't know.
I guess that's how I would see it, Brian.
It's beautiful.
I love it, Ken.
All right, Tessa, lay it on me.
Very, very big question for a Sunday morning here.

(01:34:06):
Similar to Ken.
I think in a nutshell, it's to have a positive impact.
I think in my youth, I probably had more grandiose aspirations
in terms of what that impact might be.
I think I've moderated it a lot now,
and I see it as something that you can do with every interaction,
with every job, with every task.

(01:34:28):
You know, even as a 16-year-old working in a supermarket,
I was always very diligent and tried to be polite
and tried to do a good job even for things
that I didn't necessarily care a lot about at the time.
But I've tried to take that through my whole life,
is that if you're going to do something,

(01:34:49):
you might as well do it to the best of your ability,
and you should treat people well,
and you should try and make others leave your interactions
feeling positive, feeling good.
And particularly, as you just saw, I've got my second child,
and I think that awareness of what impact you can have
on the next generation too,

(01:35:09):
and the kind of human being you're helping to create
is something that's at the forefront of my mind at the moment,
in terms of what lessons I'm teaching.
Like Ken, my dad has passed,
but he has had a big impact on me
in terms of the way that I view the world.
And I hope that I can impart something similarly positive

(01:35:30):
to my children too.
Yeah, both absolutely beautiful answers.
There's really no wrong answer with that.
I mean, both of your answers
revolved around having a positive impact where you can,
and just leaving the world a little bit better.
Yeah, it's beautiful.

(01:35:52):
There are a few questions I always want to ask
when I meet random people.
One is, what is your purpose?
Why are you doing the things that you're doing?
Just knowing we have those protections in our brain,
thinking we're going to live forever.
In reality, we're not.
But why are you doing the things that you're doing?
And are you happy?
I always want to know, are you happy?

(01:36:14):
And that can go a lot of different ways.
So I always figure that's a better question not to ask.
You guys are just two awesome people.
And I love that we live in the time that we do.
It already is sheer chance that we met each other,
but just to be able to literally talk like this,
you guys are in my future.

(01:36:35):
It's Saturday for me, it's Sunday for you.
We're on opposite sides of the world.
If we lived in different times,
we would not have this opportunity.
And I'm just so happy and grateful
to have this opportunity with you guys.
And so just thank you so much for being here.
Well, thank you so much for having us.
It's been a real pleasure.
And I think for us, it's always,

(01:36:56):
it's good to be asked questions.
And you've asked some great questions
because asking questions gets our brains working
and forces us to think and justify what we're thinking.
So I think that's very much in the spirit
of what we are trying to encourage
through the podcast too.
So it's great to chat with you.
And thank you for your interest in what we're doing.

(01:37:17):
And I hope that the people that are listening
will be prompted to think as well
about how they're making decisions and judgments.
Yeah, thank you so much.
This is the kind of stuff
that Ken and I talk about for fun anyway.
So in answer to your other question,
we are both very happy.
And we're so happy to be here with you today.
Like I said, it's a topic and information

(01:37:39):
we're just happy to riff on all the time.
So it's been really enjoyable.
Thank you so much.
You both are too kind.
And to anyone that is listening to this
for all the decisions that you will make
throughout your life,
let me help you with this one.
Go listen to the How to Choose Podcast
with Ken Smith and Tessa Mudge.
Your life will be richer for it.
You can find their podcast on podcast platforms

(01:38:01):
like Apple Podcasts and Spotify.
You can also find it on their website,
goodbetterright.com.au
along with their blog,
where they go into more depth
about some of the cognitive biases
we talked about today.
They have a resource section
with a downloadable PDF
to help you navigate
some of the cognitive biases we talked about,
and a list of the books that they've talked about

(01:38:21):
on their own podcast episodes.
And you can contact them directly
through their website.
It's the best place to learn more about them
and what they're all about.
Ken and Tessa, thank you so much for being here.
I enjoyed every second of our conversation today.
Thank you so much, you guys.
Thank you, Brian.

(01:39:02):
Bless you.
Thank you.
I remember year after year
all of my sick days going unused
because of my confounded excellent immune system.
I was growing weary of watching my lucky elderly
and immunocompromised co-workers

(01:39:23):
take additional days off for being sick
while I was stuck working like a chump
doing not only my job, but theirs too.
Naturally, I was doing my best to even the playing field
and come into contact with some foreign pathogens.
I was doing everything I could.
I was licking handrails,

(01:39:44):
licking escalator railings,
licking handrails on public transit.
I wasted a lot of good saliva before I found
Achoo-2U.
I'll admit, I was a little skeptical of their efficacy
before I acquired their service.
But after trying it once, I was convinced.

(01:40:06):
No more playing Russian roulette, waiting until I came into contact
with someone who was sick.
Achoo-2u sent someone straight to my door
to sneeze and cough directly in my face, food, and beverages.
Now, years on, I have the colds and a cascade of other infections
to prove the potency of their services.

(01:40:29):
But do you know what I don't have?
Any remaining unused sick days.
Thanks, Achoo-2u.
Achoo-2u.
Never let your sick days go to waste again.