
April 5, 2024 40 mins
This week we delve into the world of cognitive bias as we review a recent paper on this topic by the team at the University of Arkansas. What are some of the more important common cognitive biases that plague humans and are particularly deleterious for good decision making in 'case management/surgical' conference? What are some approaches to reducing the impact of these biases? How does the advent of Zoom enhance or reduce bias in surgical conference decisions? We speak with a noted authority on this topic, Dr. Joshua Daily, Associate Professor of Pediatrics at the University of Arkansas in Little Rock, for his deep insights into this topic.

https://doi.org/10.1007/s00246-024-03462-4

Other resources mentioned by Dr. Daily include:

- Thinking Fast and Slow by Daniel Kahneman
- Thinking In Bets by Annie Duke 
- Predictably Irrational by Dan Ariely
- How To Decide by Annie Duke

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:16):
Welcome to Pediheart: Pediatric Cardiology Today. My name is Doctor Robert Pass and I'm the host of this podcast. I am Professor of Pediatrics at the Icahn School of Medicine at Mount Sinai, where I'm also the Chief of Pediatric Cardiology. Thank you very much for joining me for this two hundred and ninety-first episode of the podcast. I hope all enjoyed last week's episode, in which we reviewed the concept of junctional rhythm in the Fontan patient with Doctor Seshadri Balaji of

(00:41):
OHSU. For those of you with an interest in the Fontan and electrophysiology, I would certainly recommend you take a listen to last week's episode two hundred and ninety. As I say every week, if you'd like to get in touch with me, my email is easy to remember: it's pediheart at gmail dot com. This week we move on to the world of cognitive biases.

(01:02):
And this concept and this idea came to me from Doctor Tony Rossi at Nicklaus Children's Hospital, and I'd like to thank him for it. And it is apropos that we discuss this concept this week, as we lost the great psychology professor Daniel Kahneman, who is famous for the many different books that he wrote, as well as for winning the Nobel Prize for his many insights into the concept of cognitive biases, and the

(01:26):
application of his work to the world of economics has ushered in the entirely novel field of behavioral economics. Perhaps most recently, he is best known for writing the book Thinking Fast and Slow. The title of the work we'll be reviewing this week is Cognitive Biases in High Stakes Decision Making: Implications for Joint Pediatric Cardiology and Cardiothoracic Surgery Conference. The first author of this work is Joshua

(01:51):
Daily and the senior author is Lawrence Greiten, and this work comes to us from the University of Arkansas for Medical Sciences in Little Rock, Arkansas. When we're done reviewing this paper, the first author of this work, Doctor Joshua Daily, has kindly agreed to speak with us about it. Therefore, let's get straight on to this article and then a conversation about cognitive biases and how we might

(02:12):
be able to mitigate them. This week's work is about cognitive biases, which can be defined as patterns of deviation from objective or rational judgment that lead people to make decisions or form judgments in a way that's consistently skewed or distorted from what one might choose if we were purely rational or unbiased. The authors start by explaining that there's a wealth of data suggesting that we make these sorts of

(02:36):
errors often, and that this has been explored in the fields of psychology and economics and business, but in medicine a bit less so. They explain that surgery conference, in which we routinely make decisions that are critically important and potentially life saving, is a situation where high quality decision making would seem very important, and so the authors decided to discuss, or lay out for us, what the

(02:59):
most common biases are and how we can protect ourselves and our patients from the influence of these in our decisions. They explain that the great psychologist Daniel Kahneman proposed that there are effectively two ways that we think, and these are known as system one and system two. System one thinking relies heavily on rules of thumb and can be thought of as fast, intuitive, or even automatic thinking,

(03:23):
whereas system two thinking is more slow, deliberate, and analytical. Human beings, the authors explain, mostly use system one thinking for daily decisions because it's an efficient and quick way to make decisions, but this can leave us prone to cognitive biases that create errors in perception, reasoning, or decision making. The authors then review how surgical conference is typically run in most centers,

(03:46):
with a multidisciplinary group of cardiologists, surgeons, and intensivists, as well as some other individuals in other specialties, and how a concise summary of the patient is presented with imaging and other diagnostic data like cath information, and how this is typically followed by conversation about the case, in which the more senior members of the group will often express their opinions about the best plan of action. And

(04:09):
they explain how some questions are very easy and have a very short conversation, as the answer is in fact easy, and the authors offer us the example of a child with a large perimembranous VSD and heart failure, and they contrast this with more complicated questions like the management of a child who has L-TGA. The authors explain that in the end, the chief

(04:29):
surgeon or cardiologist will then state the plan prior to moving on to the next patient. I'm sure most listening to this will see similarities between this description of surgical conference and that at your own institution. With this description in place, the authors then review the major known common cognitive biases that occur during surgical conference and offer some examples, and perhaps, like me, you

(04:51):
will think about how these affect your own conference. First among the common biases is that of confirmation bias, in which we tend to search for, interpret, or favor information that confirms or supports one's pre-existing beliefs or hypotheses, which may result in a person ignoring or discounting alternative evidence that

(05:11):
contradicts the preconceived beliefs. The authors explain that this is clearly something that can hinder objective or rational decision making, and they offer an example of how a surgeon who may be very skillful and who rarely makes an error may dismiss the presence of a residual VSD, think it insignificant, and encourage medical management rather than reoperation, even when there's objective evidence otherwise. Another offered example is that

(05:34):
of the cardiologist who, before conference, is certain that they know what is best for the patient, but then focuses only on the information during the meeting that supports that belief. The next bias that the authors review is that of availability bias, in which there's a tendency to rely on information that can be readily recalled when making decisions or judgments, rather than consideration of all relevant information, and they

(06:00):
explain that it occurs because individuals will often give more weight to information that is prominent in their memory due to recent exposure or personal experience. This is the 'being the victim of your last case' syndrome. As a result, a cardiologist in conference may not want to recommend a particular procedure to a patient, even if optimal, if they had a recent patient who had a bad

(06:20):
outcome with it, and in a similar vein, a surgeon may recall a prior case in which the surgery turned out very poorly, and therefore be disinclined to do that same operation in another patient, even if it would objectively seem to be the best option. The next bias is that of outcome bias, which is, to quote the authors, the tendency to judge the quality of the

(06:42):
decision based on the outcome or result, rather than evaluating the decision making process itself, and the authors explain that it is also sometimes referred to as 'resulting.' Essentially, a bad decision may result in a good outcome and vice versa, and the authors offer as an example the situation where the group in conference may have decided to do a particular surgery on a patient who had a

(07:02):
bad outcome, and then the group may decide that this was a bad decision and that they cannot make the same poor decision again. The opposite is also possible: a bad decision to proceed with a surgery may have resulted in a good outcome and unduly influence a similar decision in the future with a possibly bad outcome. The next bias that they review is that of overconfidence

(07:25):
bias, in which one overestimates the likelihood of success in tasks or underestimates risk, which can result in a bad outcome, and the authors explain that it is rare that a center would consider itself to be below average, even though we all know that, by definition, one half of centers very much are. They reference the notion of a surgeon's tendency to dismiss a bad outcome as caused

(07:46):
by something other than their own mistake in the OR as an error of overconfidence, as this belief would allow them to preserve their own belief that they are in fact very competent and will therefore reduce the chance of cognitive dissonance. They explain that this can result in surgeons or centers taking on cases that they are ill prepared for, rather than referring them to more experienced centers, even if

(08:07):
there's a better chance of success at a different center. The authors then review the so-called sunk cost fallacy, which is described as a tendency of individuals to continue investing time, money, or effort into something that they have committed to, even when it seems unlikely that a positive outcome will arise. An example of this might be a stock purchase that one has put a lot of money

(08:28):
into and lost money on, but one just knows it will turn around, and into which you have put a lot of your net worth. You just don't want to realize those losses, and so you hope that it will rise from the ashes, when perhaps it would be best to just sell the stock in order to stop the bleeding. The authors review loss aversion, which is a tendency to overvalue loss over gain, resulting in a risk aversion bias, and it could

(08:52):
influence doctors to favor a conservative treatment approach over opting for what is perceived as a high risk surgery, even if the possible benefits of the surgery far outweigh the possible drawbacks. They explain that the structure of public data reporting perpetuates this sort of loss aversion behavior, as programs are incentivized to minimize mortality rates rather

(09:13):
than striving to optimize patient long term survival or quality of life. They give the example of small or mid-sized programs, where one surgical mortality can have massive repercussions on status and national ranking, leading some to be very risk averse. The authors next review the notion of the planning fallacy, yet another cognitive bias, whereby people tend to underestimate the time, cost, and risks associated with actions or

(09:39):
projects, because they're overestimating the benefits or outcomes. And Doctor Daily and colleagues essentially suggest that people generally tend to be overly optimistic when making plans because they believe the chances for success are higher than they actually are. This bias can lead to unrealistic expectations due to overconfidence in our own planning abilities. Authority bias is also reviewed,

(10:01):
and is basically giving more weight to the opinion or decision of authority figures or those in positions of power. This can be a particularly pernicious one, in that people often will abdicate their own views that are contrary to the power broker's because of a perception that they might be more knowledgeable. They also review a concept called the halo effect, whereby if someone is known to be smart about one

(10:22):
thing, we assume that they're smart about other things, about which they may in fact be less knowledgeable. The final bias is what the authors term the illusion of agreement, whereby some individuals think that others in the group share their opinion to a greater degree than they actually do. This bias seems to exist because people generally tend to surround themselves with like-minded individuals, and so it

(10:45):
leads to an overestimation of consensus on issues and results in a sort of groupthink, whereby consideration of alternative opinions is not routine. The authors give the example of a senior person giving their opinion in conference and no one disagreeing; the group then may believe that the absence of a conversation of disagreement signals a greater degree of agreement than there actually is. Okay, so we've now

(11:09):
reviewed the authors' thoughts regarding the most common biases that can plague us. So how do they suggest we work against this? Well, here are a few of their suggested strategies. First, they recommend avoiding interruptions or recommendations until all of the data is presented, which can prevent confirmation bias or authority bias, or even the illusion of agreement. Second, leaders or those in power should

(11:33):
wait till the end to speak. The authors suggest that deliberate restraint by the leader will allow more time for others to independently formulate opinions and express them. They suggest that one option would be to have the fellows present and then even lay out the available options, and perhaps even articulate their own suggested course of action, which might have the secondary gain of improving fellow education. Third, the

(11:56):
authors wonder if anonymous balloting would make some sense prior to anyone sharing their personal opinions. Fourth, the authors speak of options being presented with actual known data regarding the precise probability of a particular outcome. Words like 'this is unlikely' or 'likely' are too vague, and so, to avoid availability bias in estimating the rates

(12:18):
of things, actual data from the literature should be used, and the authors offer as an example interstage mortality after the Norwood palliation, and how knowing the actual range of this in the literature can better inform an optimal strategy. Fifth, the authors wonder if routinely seeking an outside opinion is a good idea and suggest

(12:39):
that this should not be performed only when patients request it. This should be deliberate and a routine practice, to diminish biases and to get an external perspective unbiased by your own center. Finally, the authors suggest that designating someone as a devil's advocate who can argue for the opposite view might be useful, and this might require a smaller group of doctors to review the literature or the cases in advance,

(13:03):
and the chief of cardiology or surgery should be routinely asking for this opinion, as it shows an openness of thought as well as modeling respectful behavior regarding all people's opinions. The authors conclude this review of cognitive biases by making a few recommendations, including acquainting yourself with your biases, as this podcast hopefully is doing, and conducting an assessment of your own conference to see where bias might be influencing

(13:28):
decisions. Consider some of the suggestions I just reviewed and which might be best for a particular center. Make sure that the important stakeholders have buy-in on these changes, and finally assess the benefits and costs of these changes, and make sure that your plan is worth the additional mental effort, so as not to be biased by the so-called planning fallacy. The authors end by stating,

(13:50):
and I quote, 'Our hope is that this commentary has served as an introductory exploration of the realm of high quality decision making and cognitive bias, offering practical recommendations that readers can integrate into their own practice. However, it's essential to acknowledge that this introduction is by no means exhaustive, and we fervently encourage all readers to delve deeper into this pivotal subject. A highly recommended starting point

(14:15):
is Daniel Kahneman's seminal work Thinking Fast and Slow.' Well, this is definitely a very interesting review, and I am sure that if you're like me, you've noticed or seen yourself or your center in some of these biases. As a chief of cardiology, I'm especially sensitive to the authority bias issue, and certainly I'm going to think a long time about this issue going forward. I also

(14:37):
think that it's important that people be allowed to openly share their opinions without fear, intimidation, or insult. I will remember how a senior cardiologist once suggested to me that a thought I had about how to manage a patient caused that individual to question my entire ability to make any decisions, as they explained that the decision I was making was so profoundly wrongheaded and misguided that it made

(15:01):
them question whether I could make any good decisions. That comment was made to me in front of at least five of my then daily work colleagues, and it certainly both angered and intimidated me. I learned in that interaction once more of the importance of being civil and respectful to one another in these interactions, and I would say that if I could add one more ingredient to help with these

(15:24):
bias reducing approaches, profound respect for one another and an open-mindedness would seem of at least similar importance to implementing many of these interesting suggestions. In the interest of time, I think we'll move forward to our conversation with Doctor Daily. The work's first author, Joshua Daily, is Associate Professor of Pediatrics at the

(15:45):
University of Arkansas, and he specializes in non-invasive imaging as well as general cardiology. He is also the fellowship director at Arkansas Children's Hospital. Doctor Daily is an Arkansas native, having attended the University of Arkansas for medical school. However, he followed this with residency and fellowship at Cincinnati Children's Hospital. He also holds a master's degree in medical education from the University of Cincinnati. It

(16:08):
is a real delight and pleasure to welcome him to the podcast. Welcome, Doctor Daily. I'm here now with Doctor Josh Daily of the University of Arkansas. Doctor Daily, thanks so much for joining us this week on the podcast. Doctor Pass, thank you so much for having me here. I'm really excited about our discussion today. Thank you, thank you, so am I. I'm wondering, Doctor Daily, you know, your work is quite interesting, and

(16:30):
I thought maybe you could share with the audience why you and your colleagues felt it was important to write a paper about the role of bias in pediatric cardiology and cardiac surgery. Sure, so it initially stemmed simply from reading broadly outside of our space in medicine, and so I had become very interested in how we make decisions and the ways in which we deviate from rationality. Interestingly,

(16:52):
historically man has been thought of as a rational actor. However, over the last fifty years there has been a surge in research within psychology and economics revealing that humans consistently deviate from rationality in predictable ways, which are cognitive biases. And while extensive research and application has been done in fields like business, economics, and public policy, there's really been little attention given to medicine and virtually none to our

(17:17):
domain in pediatric cardiology. So as we delved deeper into the science of high quality decision making and cognitive bias, we began to see biases certainly within ourselves and our own clinical practices, but also within the systems and individuals we collaborate with. One of the things that's particularly concerning about this is that the literature clearly indicates that intelligence and higher education do not immunize us against cognitive biases, but

(17:42):
rather they often exacerbate them. That's because, as educated and intelligent individuals, we can become very adept at justifying our views and beliefs and weaving narratives and interpreting data in such a way that reinforces what we already believe. This, in combination with how our system works, is why, as we looked at the world of pediatric cardiology and

(18:03):
congenital heart surgery, we saw one particular arena in which these biases often manifest, and that's surgical case management conference. So as we decided to revamp our own conference, we saw an opportunity to delve into these topics more deeply, drawing from some of the best practices in other industries to evaluate and enhance our own conference. Nice. Given that it's an ongoing work, we decided it would be beneficial to record this and then share our findings with others. And

(18:26):
that's where the paper came from. Wow, very interesting. Another plug for reading broadly, not just within our field, something I could be accused of for sure. You know, Doctor Daily, in your work you mention a number of different common biases that plague us humans in our work and in

(18:47):
our way of thinking about things. I'm wondering if you might, for the audience, share with us three or four of these that you believe are the most important, that hurt our ability to make good decisions in surgical conference for our patients. Sure. All cognitive biases influence our behavior, but I think there are a few in particular that deserve to be highlighted. One

(19:08):
of the most important, and one of the most fundamental, is confirmation bias, which is the tendency to search for, interpret, favor, and recall information that confirms or supports our pre-existing beliefs. So it leads individuals to favor information that aligns with their own existing beliefs, while ignoring or discounting evidence that contradicts them. We've all seen this play out in surgery conference.

(19:30):
For example, if a surgeon believes that he is an elite surgeon, highly skilled and unlikely to make significant errors, his tendency is to dismiss a residual VSD as hemodynamically insignificant and rather encourage medical management instead. Or, as a cardiologist, if I have already stated that I think a patient should undergo a two-ventricle repair instead of a single-ventricle palliation, I'm going to tend to interpret

(19:53):
data to reinforce what I've already said that I know to be true, and this greatly influences the way that we incorporate additional data and come to decisions. The second bias that I think often plays out, especially in the hierarchy of medicine, is the authority bias. This is when individuals tend to give excessive weight or trust to the opinions or recommendations of authority figures, experts,

(20:17):
or those in a position of power. This bias can lead individuals to accept information or viewpoints without subjecting them to critical scrutiny, solely based on the perception that the source is authoritative. Unfortunately, though, individuals often erroneously assume that expertise in one domain automatically extends to expertise in other domains. This is known as the halo effect of expertise. So when a senior cardiologist who may be

(20:41):
an expert in one given domain speaks up with authority about another domain, we tend to just accept that. So we have to be very careful that we are not assuming that someone's expertise extends to other areas. And just because someone in the hierarchy of medicine may have a higher position than us, whether it be a chief of cardiology or surgery, we have to recognize that our

(21:03):
tendency is just to align our own views with theirs, in large part to minimize the cognitive dissonance that we experience when we draw a different conclusion than the perceived expert in the room. So we need to be very careful about that, and leaders in particular need to recognize that their words carry tremendous weight, and that the timing of when they speak up and how they speak can greatly influence the conversation.

(21:27):
So I believe conference should be structured in a way that tries to minimize this, and we need to recognize its incredible impact on our field. The next cognitive bias I think deserves particular attention, especially in the current state of public reporting of data, is loss aversion, and this refers to our innate human

(21:47):
tendency to prioritize the avoidance of losses over the pursuit of equivalent gains. This leads to a heightened inclination towards risk-averse behavior. It's really fascinating: in the economics literature this has been very clearly sorted out, and the ratio is two to one. So to offset the pain of the loss of one dollar, you actually have to gain two dollars. So we prefer minimizing losses, at

(22:11):
a ratio of two to one, over pursuing equivalent gains. So the current structure of public reporting of data in our field heightens this loss-averse behavior, as we are incentivized to minimize patient mortality instead of optimizing things like quality of life or long term survival prospects, which may be more important for a given patient. And, you know, in many cases a single mortality can have lasting and

(22:33):
adverse repercussions on our program's national ranking. So when you combine that with our underlying tendency toward loss aversion, we can end up prioritizing conservative treatments when it's really in the patient's best interest to pursue a high-risk surgical intervention.
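
To put the two-to-one figure above in simple terms, here is a minimal sketch using a linearized version of the prospect-theory value function; the functional form is an illustrative assumption, and only the two-to-one ratio comes from the discussion above:

    v(x) = x for gains (x ≥ 0), and v(x) = λ·x for losses (x < 0), with λ ≈ 2,

so that v(+$2) + v(−$1) = 2 − 2 = 0: a gain roughly twice the size of a loss is needed before the trade-off feels neutral.
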
So the last cognitive bias I wanted to briefly highlight, which I see play out

(22:55):
so many times in our conference, is the illusion of agreement, and this is when we believe that others share our opinions or perspectives to a much greater extent than is actually the case. So in surgery conference, when a senior cardiologist speaks up and no one disagrees, everyone may assume that everyone is in agreement, when in reality others may be hesitant to speak up.

(23:17):
This can contribute to the concept of groupthink, which is the tendency for individuals to push aside their personal opinions in order to reach a shared consensus. The other way this plays out, though, is when people use words like likely, unlikely, low or high risk, without providing explicit definitions for these terms. So when a speaker asserts that an outcome is likely, he might mean that the

(23:38):
chance of death is fifty percent. Someone else might perceive that it's ninety percent, and those are very different. So I think there's great value in a conference in getting rid of the terms likely, unlikely, low risk, high risk, and developing a standard practice of saying exactly what you mean, whether it's assigning a likelihood ratio or a percentage. One of the ways this can be

(24:00):
done in practice is to assign a ninety percent confidence interval with that. So, for instance, if I think a patient is going to potentially die in the interstage period, I might say, based on A, B, and C, I think there's a chance of interstage mortality between twenty-five and fifty percent. That conveys to the listener exactly what I'm thinking in terms of how likely

(24:21):
they'll die, and also my confidence in that assessment, and that leads to much higher quality discussions. Right, right, much more precision. Well, boy, I'm sure many people listening to this could see themselves and their conferences in many of the biases you just described. That was wonderful. Thanks a lot, Doctor Daily. You know, one of the recommendations in your paper to try and minimize bias, and perhaps a little bit to minimize groupthink bias,

(24:48):
was that you describe the possibility of routinely seeking second opinions on any non-straightforward case. As you write eloquently in the paper, most of the time we do this when asked by the patient; the patient will often say, would you mind asking a different center what they think about this? But we don't routinely ask other centers to do this. I'm wondering, given your profound

(25:15):
insights into biases, does your center routinely seek second opinions on all non-straightforward cases? And how does that practically work? Unfortunately, we do not do that right now. Like most centers, we continue to primarily seek second opinions when either a family requests it or we have chosen not to offer surgery and we want to find out whether another program is willing to offer surgery.

(25:37):
And there are a lot of reasons for this, but in large part there are significant obstacles to making this a standard practice. I think the most significant is simply the amount of uncompensated work involved for me to do this for a given patient. Obviously, I spend hours tracking down the echo and MRI, getting those uploaded, and talking to someone at the other center. It's a lot of work. So we tend to only do this when we're triggered in one

(26:00):
of these two areas. We are trying to make this more of a standard practice, but we are a long way from getting there, and I think as a field, as a whole, there would be great value in making this as easy a process as possible and making it a standard practice, not just where a smaller center may ask a larger center what they think should be done, but rather as an opportunity to consistently get the outside view that's less

(26:23):
influenced by all these biases, where centers of similar size, or even a larger center, may go to a mid-sized center and ask them to weigh in on what they think should be done. I think we'd all benefit from this, as well as our patients. Yeah, well, good, good thoughts all. Thank you. You know, since the pandemic, we've all been using Zoom to a much more significant degree than we ever did before. And

(26:48):
in fact, this interview is being conducted on Zoom, and I was wondering what you think about Zoom and the possibility of its role in bias. Do you think that the opportunity for bias is greater or less when people are not in the same room, as would be the case if we were in a Zoom meeting? So there's obviously both good and bad that comes with Zoom meetings. While

(27:11):
it allows individuals from different geographic locations to participate, I have found that there tends to be a mental shift from being an active participant to an observer with Zoom, and so it's so much easier to be distracted and try to do something else at the same time. And the end result is that the threshold to speak up is typically higher and participants are less engaged. And I think this

(27:33):
is especially exacerbated when we take a hybrid approach, which is what we typically do, in which there are those that are participating in person and then there are those that are participating virtually. And what ends up happening in that kind of environment is that those who are in person end up being more willing to speak up and engage, and those who are watching from home or other locations tend

(27:53):
to just be observers. And I think there's a lot of opportunity for bias in that kind of setup, and that's obviously less than ideal. But it doesn't just exacerbate bias; I think in some ways it can actually minimize bias. So, for instance, participating by Zoom makes it easier to form an independent recommendation that's not influenced by everyone around you. So when you're in the

(28:15):
room and a fellow is presenting, you're often looking around the room, noticing the facial expressions and the responses of those in the room. It's very clear what other people think, and as you're formulating your own opinion, you tend to align with the experts around you. Whereas when you're participating by Zoom, it limits some of that exposure and forces you to develop your own opinion based simply on the data at hand. So in that way,

(28:37):
I think it actually inherently minimizes the impact of bias. In total, it's still humans, and humans are biased, and so I think the opportunities for bias are about the same; just the specific ways in which they manifest may be a little bit different in the Zoom environment versus the in-person environment. And anytime a conference is being structured, those things need to be taken into account. Yeah,

(29:00):
yeah, good points. Well, you obviously are now the guru of bias in our field, and I'm wondering, practically, how have you integrated some of these approaches that you're espousing in your work into your own surgical conference in Arkansas? And I was wondering if there was a lot of pushback from either your cardiology or surgical colleagues when you proposed changing certain things. Well,

(29:26):
first, we are very much still in the process of revamping our conference. It hasn't been the type of thing where on January first we made all of these changes; it's kind of an ongoing process, so we've implemented some of the strategies outlined in the paper, but others have not yet been put into practice. So the actual process of obtaining buy-in from key stakeholders is really important,

(29:48):
and it's what we've tried to do. Instead of just having a large group meeting and saying these are the things we're going to do, we've formed a small group of a few key individuals who are meeting regularly, discussing how we can improve conference. We've reached out to other programs throughout the nation to find out how they do it and to find out whether there are aspects of their practices

(30:10):
that we could borrow, and then we've individually gone to key stakeholders. So in our instance, obviously our chief of surgery is a key stakeholder, and any buy-in from him is extremely important. And it's very clear he greatly values efficiency, and he becomes frustrated when the conference goes on and on for hours and when cardiologists like myself perseverate on details that he doesn't consider important. So it's important that we

(30:30):
met with him individually and made sure we understood his perspective, and as we roll out any changes, it will be important to take that into account. And to get buy-in from him, we're going to really emphasize how this will improve the efficiency of conference, and it's important that any changes that we make be somewhat neutral from a time perspective. We can make all these

(30:51):
changes, but if conference becomes four hours long, we've made it far worse than when we started. So we're in the midst of taking that approach and identifying who the primary individuals are who we need to get buy-in from, and then borrowing from the practices of others. So it's a lot more effective for me to come in and not just say how I want to do this, but rather to say: this is the issue, this is the rationale,

(31:15):
these are the best practices around the nation for how it's been done, this is some implementation that's been performed in the business domain, and we'd like to try this. It's a lot easier to get buy-in from individuals with that kind of approach. So we're in the midst of this, and I recognize we're not going to make every single change that we outlined

(31:37):
in the paper starting on July first or whatever day. This will be an ongoing process, and we may not get everything, so it's very important, even within that, that we prioritize the importance of items. If you want twenty things, rarely are you going to get all twenty things. So what are the most important five or six things? We need to emphasize and prioritize getting those changes made, and be willing to let go of a few things that may

(32:00):
contribute but may not be the most important. Right, I guess that's true of life as well. Well, Doctor Daily, I can't really thank you enough. You've given tremendous insight into a topic that is woefully under-described, and I'm very appreciative of your time. I was wondering, for those who've listened today and maybe are interested to learn more about this topic, can you suggest some resources for

(32:23):
them? Sure. First, I would encourage you to read the article, which hopefully will serve as an introduction to the subject with some very specific recommendations for our field, and then I would recommend reading at least one book on cognitive biases. This impacts every single one of us, not just in surgery conference, but in every aspect of our life, and it's very important to understand

(32:45):
the basics of how we make decisions and how we can end up deviating from rationality. So I would recommend starting with the seminal work by the Nobel laureate Daniel Kahneman, which is probably the most comprehensive, called Thinking Fast and Slow, but it is long. Other shorter reads that I think are also excellent are Predictably Irrational by Dan Ariely, and then multiple books by Annie Duke, such as

(33:08):
Thinking in Bets and How to Decide. Pick one of those and start with that, and then you can expand from there if you continue to be interested in this area. Thank you for that, Doctor Daily, and for those listening, as usual there'll be a link to Doctor Daily's paper in the show notes, and I'll also put links to some of the books that he

(33:28):
just mentioned, for those of you who are interested. Doctor Daily, congratulations to you and your co-writers on this very insightful paper, and thank you very much for taking time from your busy schedule to speak with us this week. Thank you, Doctor Pass, it's been a pleasure. My pleasure, thank you. Well, I'm sure you found that conversation of interest; Doctor Daily is a wonderful speaker, and his comments really helped me better understand some of the concepts that

(33:51):
he reviews in his paper. He made many interesting points, and I thought it reassuring that even he has not yet initiated all of the changes that he espouses in the work; he's clearly working hard on getting buy-in from the most important stakeholders. I found his notion of making sure that the changes are, to use his terminology, time neutral, to be very important, as I'm sure

(34:12):
most are aware that this case conference seems always to drag on just a bit beyond all of our limited attention spans. As I mentioned, I'll put a few links to some of the resources that Doctor Daily mentioned today in the show notes, and I would once more like to thank him for taking time to speak with us about this important topic this week. To conclude this two hundred and ninety-first episode of Pediheart: Pediatric Cardiology Today, we hear the lovely

(34:36):
'Bess, You Is My Woman Now' duet from Gershwin's Porgy and Bess, and today we'll hear it sung by two young singers from Israel, the wonderful mezzo-soprano Maya Gore and baritone Oded Reich. Both young singers started their training in Israel but have gone on to sing throughout Europe in different opera houses as they forged their careers, and both have impressive online presences. I'm sure you'll

(35:00):
agree that both seem to have bright futures ahead. Thank you very much for joining the podcast this week, and thanks once more to Doctor Daily for his many insights. I hope you all have a good week ahead. So long.

(35:28):
[Music: 'Bess, You Is My Woman Now' from Gershwin's Porgy and Bess]