Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Morten Andersen (00:05):
Hello, and
welcome to What Monkeys Do. My
name is Morten Kamp Andersen.
And this is a podcast about what it takes to make a change and make it stick.
This podcast is about change.
And today we'll talk about how to use an evidence-based approach to make our decisions.
(00:29):
Because before we make a change, before we make any change, you first have to decide what to change and, importantly, how to change. And the 'how' is important. Let me take an example. Let's say that you suffer from a mild depression. So on a scale from minus 10 to plus 10, you are, let's say, a minus two or minus three. Now, there are many things that you
(00:51):
can potentially do. You can go to therapy, and there are many therapeutic directions: cognitive behavioural therapy, psychoanalysis, emotional therapy, relational therapy, ACT, and tens of other therapy directions. You can also take medicine, or you can force yourself to be socially active,
(01:11):
you know, meet people, or you can read a self-help book and follow the instructions in that one, or you can do exercise. There are many things you can do. And because we're complex humans, there is not one recipe for what is certain to work. But a few of those that I just mentioned are more likely to work for you than the others.
We simply have evidence which suggests some things will work
(01:33):
better for you if you have a mild depression, and some things that do not work. And that is also the case with changes at work. Maybe people do not have a high level of job satisfaction in the team that you work in. Is that a problem? Can you do something about it? And if you do want to do something about it, what? What
(01:56):
is the most effective way to build satisfaction at work? Well, that's actually also something we know something about. And when we look at the evidence, we often find that we are basing many of our decisions on myths or incorrect facts, so to speak. So the interesting thing is: how can we make better decisions using the evidence which we have at hand? My guest
(02:19):
today is a leading figure in the evidence-based practice movement. He is a professor of organizational psychology at Queen Mary University of London. And he's also the founder and scientific director of the Center for Evidence-Based Management. Welcome to you, Rob Briner.
Rob Briner (02:37):
Thank you very much,
Morten, thank you for inviting
me on your podcast.
Morten Andersen (02:40):
Yeah,
absolutely. Absolutely. So in
this episode, we'll talk about how to make better decisions using evidence-based practice. And you've spent many years focusing on exactly that. Can you tell us what evidence-based practice is and how it's different from how we normally make decisions?
Rob Briner (02:57):
Yeah, sure. That's a
very important question. So evidence-based practice is really about making more informed decisions. And actually, both now and, I think, historically, the whole term 'evidence-based' is a little bit off-putting to people, or it gives people a very particular impression about what we mean by evidence. It is important to think about what are the differences between what
(03:19):
we seem to normally do and an evidence-based practice approach, and I think there are three main differences. The first, I guess, is the broad approach to thinking and acting around evidence. And the traditional definition from medicine is conscientious, explicit and judicious. So conscientious means you make lots of effort around gathering evidence; explicit means you share it, you'd say what
(03:42):
evidence and information and data we're using; judicious is judging the quality of that information. So it's almost like a whole general approach. So for example, if you take the judicious bit, an important part of evidence-based practice is not the use of all the evidence, but the use of the best available evidence. Maybe a lot of the information and data we've got is very
(04:02):
unreliable, and we probably shouldn't even really look at it. But without looking at it, we can't start to judge its quality. So the first is this general conscientious, explicit and judicious approach. The second main difference is the use of multiple sources of data. So for example, in HR and organizations and other contexts there's a lot of emphasis on data
(04:24):
analytics, but typically that's only actually using one source of data, which is often organizational data, or data from employees, and that's fine. But the idea of using multiple sources is for two reasons, I think. One is this idea of triangulation: you cross-check and you compare, which is pretty important. The second reason for doing it is to contextualize data. So for example, you mentioned in your introduction around depression, you might
(04:47):
look at the scientific evidence around, for example, treating mild depression, but actually you need to know all about the person, their individual situation, other things about them, before we can say whether this scientific evidence applies to this person. It's exactly the same with organizations: does the scientific evidence, for example about job satisfaction, and again, you mentioned there's quite a
(05:08):
lot of that, does it apply in this situation, to these people, to this organization, right now? It may do, it may not; without looking across multiple sources, you just don't know. So multiple sources is to contextualize and cross-check and triangulate. I think the third main difference is taking a structured, step-by-step approach.
Of course, everybody always uses evidence and information and
(05:30):
data in their decision making; you can't make a decision without doing it. But typically, people don't follow a structured process, so they will jump in, for example, to a solution, and then try and justify the solution, or they'll dig around and not really be too clear about what the problem is.
But they'll act anyway. So the point about having structure is to help us stick to a process for making a more informed
(05:54):
decision. Because it seems to be the case that it's really difficult for people to use evidence and use data and use information, not because it's technically difficult (you don't have to be a geek or a Brainiac), but because many, many things get in the way, which I think we're going to come on to. Right, three main differences. We always use
(06:14):
evidence, but it's this approach: conscientious, explicit and judicious. It's about multiple sources, to try and guide and cross-check. It's also taking a structured approach, which, as I mentioned, helps us stick on track, but also, in a sense, forces us to think about the problem or opportunity first, before we go into solution mode. In a way, they sound like simple differences. But in another way,
(06:34):
they're quite important differences.
Morten Andersen (06:37):
If I look at
sort of the history of
evidence-based approaches, it originates from medicine; I think it was back in the late 80s or early 90s. I remember reading about it and wondering why there is such a big gap between what we know works and what people are applying. And I remember, as I studied psychology, I was struck by that same gap: that in
(06:57):
psychology, we also have some evidence about, you know, which directions to look at and which not to look at, and then what is being applied. And in your writing, you have also shown how that is true for management and leadership and HR and management consultants, etc. Why do you think there is such a big gap? Is it a resistance towards looking at evidence? Is it just,
(07:20):
you know, we do it how we normally do it? Or what is the
explanation?
Rob Briner (07:25):
That's a great
question. And in my experience,
or thinking about this for a long time, it's extremely complicated. There are many things going on. But if you wanted to pick one or two reasons why evidence isn't better used, I think, at the moment, at least my current thinking today or this morning, is that it's because practitioners, and that includes me as an academic, are incentivized not necessarily to
(07:48):
tackle the most important problems and opportunities, and are incentivized not necessarily to do what's most effective. Crudely speaking, most of us are incentivized to do stuff. Whether you're a surgeon, a police officer, an academic researcher, or a consultant, your incentives are to do stuff, because that's
(08:09):
what will get you rewards and recognition. And if you don't do stuff, you may well get punished for it as well. So that's the issue. It's really simple: if people can't see a reason to use evidence and to take more informed decisions, they won't do it, because they don't see any possible benefit.
Morten Andersen (08:27):
So we are more
incentivized to do something
rather than do nothing. And we're more incentivized to do something rather than do the right thing. And because doing the right thing probably involves more work or a lead time, or we may not even know whether it is going to be right, doing things is actually what we're just going to do.
Rob Briner (08:46):
Yeah. And also
because if you think about it, and you know, in your context of change this is particularly true, but in many organizations, interventions in HR and elsewhere, as you know, are typically not evaluated. I'm only paid to do stuff, to do it fast, to get things done. It actually doesn't matter unless it's a disaster. And very few things are a complete disaster. I'll get away with it
(09:10):
in the sense that it just doesn't matter. Here's a new training program, here's the high-talent management program, whatever it is; let's just roll it out, bring it in, do the stuff. People become happy. And if they're not, that's okay, we'll do something else in two years' time. It is tempting sometimes to blame individual decision makers and managers.
(09:30):
But increasingly, I think it's a contextual issue. Of course that makes it, as it were, a wicked problem that's quite hard to tackle.
Morten Andersen (09:37):
Yes.
Interestingly enough, there are actually more and more studies being made, but also more and more studies being shown to the general public. So if I just look at my Facebook feed, or a newspaper, I will see many headlines like 'study shows that coffee is good for your weight loss', or 'study shows that talking to your mother can prolong your life', or
(09:57):
'study shows that having children makes you unhappy', or things like that. And then there'll be a link to some study. Most of those studies are actually not very good, to be quite frank. It's very hard to read behind them, but if you can, they're often based on a very small population and of poor quality. But on the other hand, there are more sorts of studies that people can see now. And that's also, I suppose, why having multiple
(10:20):
sources is very important.
Because otherwise you can always find research to support your way of thinking, so to speak.
Rob Briner (10:29):
Yeah, multiple
sources. And, as you imply, critical appraisal, which means judging the quality of it. And this is obviously a huge paradox; it's a kind of signal and noise problem. At the moment, most organizational decision makers and other decision makers have access to more information than probably ever, but I don't think it means they're making better informed decisions. And I think, again,
(10:50):
as you say, on Facebook, or LinkedIn, or lots of sources, the most readily available information in my experience is typically the worst quality. And there's this odd sort of paradox of there being more and more and more news, data, information, studies, which are, as you say, often of quite poor quality, and actually wading through that to get to the signal behind it is actually
(11:12):
quite effortful. Which is exactly why, for example, in medicine, there's been a huge institutional effort globally, for decades now, in the case particularly of scientific evidence, to try and summarize it, to try and do systematic reviews, so people aren't bombarded with one study here, one study there. Can we have an overview, as of now, of the best available evidence and what it generally seems to be
(11:34):
telling us?
Morten Andersen (11:35):
There's been a lot of focus on our biases; the economists' work, of course, has been very influential in that sense. I wonder if evidence-based approaches are a way to overcome some of those influences of bias?
Rob Briner (11:50):
I think it is a way.
And I think in that sense, I think you can never overcome bias. I noticed on LinkedIn and elsewhere, people talking, in the context of diversity and inclusion, about the idea of removing bias or crushing bias; anyone with even a basic psychology education knows that you can't remove bias. But as you said, maybe there are ways of trying to reduce it or control it. I think the
(12:11):
evidence-based practice approach is one potential way of doing that. Now, of course, politics and power is always there, it's always within organizations, you can't ignore it. But at least if you're taking this idea of a conscientious, explicit and judicious approach, if you're following steps, if you're looking across multiple sources, if we're involving groups in decision making, it seems to me all those steps make it more likely you address some of the
(12:35):
biases that economists and others have been writing about, again, for decades and decades, such as confirmation bias. So in the context of an evidence-based practice decision, one of the things you do is train people to be aware of some of these biases. So people go out searching for evidence that already confirms their existing beliefs; the idea is everyone would see that and
(12:55):
be aware of it. And of course, that's a point where often other people, and this is where I think the social element, or the shared cognition element, of evidence-based practice is important, other people can see our biases probably much better than we can. So if I was presenting information to you, Morten, about, oh, this is why I want to do this organizational change, I've just come across this thing, it's really cool, Microsoft do it, Google do it, whatever, whatever. You'd say,
(13:18):
hold on a minute, Rob, I can see you love this idea. But come on, let's think about what's behind it. So the social element of it, the shared decision-making element of it, is also, I think, another way in which it helps deal to some extent with some of these biases.
Morten Andersen (13:34):
I think that's
actually really, really
important. Because also, as you mentioned, you know, of your three elements of what an evidence-based approach is, one of them is to make it contextual. And contextual can also be a way to say, well, this study doesn't apply here because of things. And then you make excuses for why you want to use a particular intervention, so to speak.
Rob Briner (13:56):
That's right. And
that's almost the opposite of how people are. The old 'every organization is so special and unique, nothing anyone else does can possibly work here'. And again, there's a real paradox, because on the one hand organizations do say that; on the other hand, organizations will also leap onto ideas, for example because Google do it, that have nothing to do with their organization. If people don't like
(14:17):
something, they'll say, well, it just won't work here. If people do like something, they ignore the fact they're nothing whatsoever like Google. So why would that work for you? So you get these two; again, it's just another way, I suppose, of either doing something you want or not doing something you don't want, by drawing, in not a very effective way, on different sources of evidence and information.
Morten Andersen (14:37):
Evidence-based
approach is essentially trying
to take the best available knowledge that we have in order to make better decisions. In order to get that, we basically have to change the way we approach this. So we basically have to make our decision process explicit. We also have to judge it much better. We have to look at multiple sources, and we have to use a structured approach. And if we apply this
(15:01):
evidence-based practice, then our decision making and, ultimately, our outcomes will be much better.
Rob Briner (15:08):
Exactly, as you said
right at the beginning with your
depression example. It's not about having a solution. And this is something that drives me, and I'm sure others, a bit crazy when it comes to a lot of management practice: it's both marketed as, and I think sometimes thought of as, an answer. The whole point about evidence-based practice isn't about having an answer. We're not dealing with an equation, we're not dealing with a math problem, we're dealing with
(15:29):
usually quite a complicated situation. And there are many things we can do. And what we're looking to do is do the thing that is most likely to work, and it may turn out there are three things that are equally most likely to work, and that's fine. So we're not looking for a single answer; we're looking, in terms of probability, for the more likely course of action leading to the outcome we want.
Morten Andersen (15:47):
Great.
So I'm really interested in how to use evidence-based practice in real life. So let's use an example. A team in an organization is not performing well. They say themselves that
(16:08):
the team dynamics are not very good. People are not very engaged, they don't work well together, and maybe some don't even like each other that much. So they call HR to make an intervention, to make the team work well again, so to speak. How can the HR person use the evidence-based approach in such a situation?
Rob Briner (16:27):
Sure. So I think
there's a couple of things, I
think the first thing is that it's about multiple sources. And we haven't yet said what those sources are. So we've talked about the scientific literature a bit, and of course we've talked about organizational data a bit, but there are, ideally, four sources we would use. We would also talk to stakeholders; in the context of your team example, that would, of course, be the team
(16:50):
members, but it may also be the team's clients and customers, it may also be other teams, it may also be the team's manager or team leader. That's the third sort of source, stakeholders. And the fourth is practitioners' own expertise. So again, in the example you've given, that would be the expertise of the HR practitioner, the HR team, in dealing with these kinds of issues. So the first thing is the multiple sources. The second thing is really, of course,
(17:12):
trying to understand what the problem or opportunity is. For everything you mentioned, people not getting on, people not liking each other, people not feeling engaged, the question is, of course, why is that a problem? So what? You know, that's life, right? People don't like each other. So what? We have to be quite careful about leaping onto what we think is a problem because we've already sort of got a
(17:33):
solution. So in the case of this example, it may be: what we need is a team-level or group intervention, we're going to give everybody, you know, Belbin team roles, we're going to give them this thing, because you've leapt on this thing as being the problem. Now, in and of itself, I think, people not getting on, you know, why is that? So what? Not being engaged, sort of, so what? It's almost like, when
(17:55):
presenting this problem, we're almost implying, in diagnosing the problem, that we already know what the solution is. Yeah? So it should increase engagement or help people not be in conflict anymore. And just as an example, team conflict: you know, as you know, there's a sense in which team conflict is not actually a bad thing. So if you remove team conflict, is that good, and good for what? So very often, I think, with
(18:18):
these initial problems that are presented, what isn't done so much is asking: why is this a problem? And is there actually a more fundamental problem or set of problems or opportunities there that we haven't really discussed yet? Because we're dealing with the kind of surface level of people saying 'I don't like things', 'I'm not engaged'. So I think that's one thing, the multiple sources, and the other is to spend quite a lot of time on:
what is the issue? What is the problem? What is the
(18:40):
opportunity? And it's interesting that when we do training with this, even if it's only a day or half a day with a particular organization, we quite often say, let's just spend, say, the morning or the first hour just thinking about: what is the problem? What are the opportunities? And what nearly always happens is that within minutes, within 10 minutes, five minutes, people are right onto the solution. And I think, again,
(19:03):
going back to your question about why this is difficult: I think one of the reasons it's difficult is because people quite enjoy talking about solutions. They love chatting about, you know, what they can do, what they should do, how they can make it happen. Because, in a sense, there are two reasons. One, it's cognitively easier; it's not so much hard work. And secondly, it's less
(19:24):
likely to lead to conflict, which people particularly want to avoid. So spend time thinking about what the problem or issue or opportunity is. People often say, what's the one thing you could do? Well, if there's one thing you can do, if you want to be evidence-based, it's spend more time on that. And of course that drives some people nuts, because they're thinking, I need to do something, I want to do
(19:46):
something, stop making me think about the issue, let's just do something. And of course you can, you can, but the quicker you do it, the more you do without understanding the situation, the less likely it is you've found the problem or issue, and therefore the less likely it is going to work. There's a kind of trade-off, almost like a speed-accuracy trade-off, a classic psychological trade-off. So, to think back to your example, I
(20:08):
think I'd spend a lot of time looking across those four areas, but also really trying to understand sort of what the problems or issues are first, before we do anything or think about a solution.
Morten Andersen (20:19):
And one of the
things that often arises when you look at the problem is that you think, is this really a big problem? So for instance, with lack of engagement, there is always a feeling that lack of engagement will lead to people leaving the organization or lower productivity. But when we look at the evidence, sure, there is a positive correlation there, but it's actually a lot lower than people think it is.
Rob Briner (20:40):
Yes. And it's like,
again, the example you gave with
the job satisfaction. Yeah, old-fashioned engagement, if you like. The job satisfaction-performance link, or the job satisfaction-turnover link: it's not that there is no link, it's that, to my understanding, broadly speaking, from the kind of quite large body of evidence, the link isn't necessarily strong. And, you know, in some cases it might be quite small. So if the
(21:03):
problem is performance, why would you start with job satisfaction as your solution, because the link, it appears, isn't that strong. And again, it's interesting to go back to those four sources: it may be that in your organization, for the particular employees you're talking about, maybe there is a really strong link. Again, that's why it's important to look at your organizational data as well as the scientific evidence, because you get that sort of sense of triangulation
(21:25):
and contextualization.
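
A minimal sketch of what checking that link in your own organizational data could look like, assuming hypothetical engagement survey scores and performance ratings; the numbers and the "about 0.3" illustration are assumptions for illustration, not figures from the episode.

```python
# Hypothetical sketch: triangulating the engagement-performance link with your own data.
# The scores below are invented for illustration only.
import numpy as np

engagement = np.array([3.2, 4.1, 2.8, 3.9, 4.5, 3.0, 4.8, 2.5])   # survey scores (1-5)
performance = np.array([62, 70, 58, 75, 71, 66, 80, 55])           # appraisal ratings (0-100)

# Pearson correlation between the two organizational data sources.
r = np.corrcoef(engagement, performance)[0, 1]

# r squared is the share of variance in performance associated with engagement:
# even a positive but modest correlation of about 0.3 explains only around 9% of the variance.
print(f"r = {r:.2f}, variance explained = {r ** 2:.1%}")
```

Whether the correlation in your own data turns out strong or weak is exactly the kind of contextual check Rob describes, done alongside, not instead of, the published evidence.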
Morten Andersen (21:27):
Many people
think that evidence-based
approach is really about finding research from academia and so on. But I think you're highlighting that there are actually four sources that are equally important. So one is the research, but another one is speaking with the stakeholders at play; also your own experience, and what works in the organization, is really important. And then
(21:47):
organizational data. And I guess people struggle a little bit finding good research. We'll talk a little bit about that. But also, many organizations have really bad data. I mean, when you try to find out what the link is between engagement and productivity for this team, nobody knows.
Rob Briner (22:04):
Exactly. And that's
amazing. If you needed another reason for trying to do something like evidence-based practice, one reason is that it almost accidentally, or as a side effect, acts as a sort of quality audit of your data. As you say, if you go searching for information, you go, well, it isn't here, or we do have it but it's impossible to get hold of,
(22:25):
or the data seem unreliable. Why are we measuring it like this? That's pretty important, because it tells you something. It's almost an organizational development intervention: if you want to make more informed decisions, rather than rushing off and implementing the next thing, the next thing, the next thing, you actually spend some effort and time getting good-quality, relevant information to decision makers. Because if you don't do that, you'll never be
(22:46):
able to be particularly evidence-based, because you just don't have access to it.
Morten Andersen (22:51):
Yes. Now, one
of the important sources you
mentioned is the scientific knowledge. How do non-academic people get access to that? How can we navigate that knowledge? Do you have any advice for people in HR, or personally?
Rob Briner (23:04):
Yeah, I do. So some quick and dirty tips are to go on something like Google Scholar. Because open access is improving all the time, it's getting a bit easier. Go on Google Scholar, just start looking around, type in the concepts, the words you're interested in; as I mentioned earlier, say high-performance individuals or talent management, or some aspect of talent management. Go into
(23:25):
Google Scholar, type in a couple of words, a couple of terms, and then search for things like systematic reviews, or reviews, or meta-analyses. And depending on the topic, try and restrict it to maybe the last 10 years, and just start to dig around and see what's there. Now again, I think HR practitioners or managers are quite used to having extremely pre-digested, as it
(23:46):
were, and sometimes not very good, evidence, I guess, that they're quite used to. But where's the answer? Where's the sort of heart of this? And of course, it's difficult with academic work to do that. I would say it's also important to manage people's expectations and say, you're pretty unlikely to go on to Google, type things in, and get the
(24:06):
answer you're looking for, for all kinds of reasons. But that's okay. And again, in terms of OD, organizational development, I'd also say, for any manager or HR practitioner, this is almost like a personal professional development thing. Start trying it, you know: go into Google Scholar for the decision you're making next week, just dig out three or four things, don't go crazy. Just try to read them; they probably won't
(24:27):
be very nice to read. They're probably a bit difficult. That's okay. Have a go, read them. And again, the whole point is, you're more likely to make a better informed decision if you use at least some evidence rather than none, or some reasonable-quality evidence rather than poor quality.
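
A rough sketch of the kind of Google Scholar search Rob describes: a topic plus review-level terms, restricted to roughly the last ten years. The topic string, and the use of the q and as_ylo URL parameters, are illustrative assumptions rather than anything specified in the episode.

```python
# Hypothetical helper: build a Google Scholar URL that looks for review-level
# evidence (systematic reviews, meta-analyses) on a topic from the last N years.
from datetime import date
from urllib.parse import urlencode

def scholar_search_url(topic: str, years_back: int = 10) -> str:
    query = f'{topic} ("systematic review" OR "meta-analysis")'
    params = {
        "q": query,                                # the search terms
        "as_ylo": date.today().year - years_back,  # earliest publication year to include
    }
    return "https://scholar.google.com/scholar?" + urlencode(params)

# Example: dig out a handful of reviews on talent management, then actually read them.
print(scholar_search_url("talent management"))
```

The point of the sketch is only the habit it encodes: search for summaries of the evidence rather than single studies, and bound the search so it stays manageable.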
Morten Andersen (24:45):
So it is
actually more available now. It
used to be Reed Elsevier and a couple of other publishers that owned it all. Now it's more open access, which is fantastic. But as you say, it is still a little bit hard to get hold of, and the meta-studies are really the ones to focus on, because they take most of everything else, digest it and say, this is
(25:06):
what we know.
Rob Briner (25:07):
Exactly. Again, I
think sometimes people when they
read this stuff, they either get angry with academics, which is fine, or they sort of blame themselves for not being smart enough, which is not fine, because of course they're smart enough. So it's a sort of learning thing, exactly like I would do with the students I teach, saying: the first time you read one of these papers, you'll think, what the hell is this
(25:27):
about? The second time, it's a bit better; 10 times, you know, 20 times, it's just practice. The issue, again, about using organizational data: unless you start doing this stuff, a lot of the barriers are there because you don't have the skills, knowledge and access to information. The more you try to do it, the more you'll build that individual and organizational capacity to do it.
Morten Andersen (25:48):
Yes. Now, HR is
fairly renowned for using many tools that have almost no basis in evidence at all, and some that are sort of pseudo-academic. So for instance, many organizations use some version of MBTI as a way to select or recruit or develop. And I know in academia there's a lot of
(26:10):
reservation about that type of personality test, especially when used in selection. Why do you think HR continues to use tools which academia has shown do not work well?
Rob Briner (26:23):
I think it's because
they are doing different things
or asking different questions.
So MBTI happens to be something I've had a lot of discussions with practitioners about, because occasionally, I mean, I've now completely given up, I've raised a lot of posts on Twitter and LinkedIn about this and got some really, for me, really fascinating discussions with people about this. So an academic is asking the question, is MBTI a valid and reliable measure of personality? To which
(26:47):
the answer is no, really. A practitioner is saying, what do I feel if I give this to someone and give them feedback on it, where I think it's useful? Well, they think it's useful. Again, talking to some practitioners, they completely accept it's not a valid measure of personality, but they think it's a tool for doing something else. It's almost as if academics and practitioners are coming to the same instrument asking questions that are fine,
(27:10):
but they're different questions.
And indeed, if you look at that second one: the idea that if you give people personality tests, of course, it's not very valid, that's problematic, but that's a different kind of discussion. If you give people feedback from personality tests, the question is, yeah, the person giving the test may enjoy doing it, of course, because they feel they're helping someone; the person
(27:32):
receiving it may find it helpful, fine. But the most important question is: is it in fact the case that by giving people feedback from a personality test, whatever it is, it is actually developmental? And that's a question which, from my understanding, at least in the scientific literature, we don't have an answer to. We have huge amounts of practitioner experience, practitioner
(27:53):
evidence, practitioner expertise, that says, yes, it does, fine. But that's just one source of evidence, and we need to look at other sources as well. Hmm. Many practitioners in many fields do stuff because it has an immediate impact, and people like it, people enjoy it, and they enjoy doing it. And they will just always do it; it doesn't matter what
(28:14):
evidence you show them, because they're doing something else, they're doing something they think is valuable, and that's fine. But then you have to pin down, you have to drill down to what they say they think is valuable. What do they really think it's doing? And can we find out if it is doing that thing they think it's doing? It might be like, you know, some medical interventions: yeah, sure, the patient might love it,
(28:36):
the practitioner might love it. Great. That's one way of thinking about it. But actually, is it curing? Is it helping? Is it doing what you expect? Which is another kind of question.
Morten Andersen (28:46):
Yes. And I
guess this is where sort of the
four sources, you have experience as one source and you have scientific research as another, here you can actually have a conflict between those sources, where my experience is it works: when I use this tool on a person, they will develop. And we don't really measure before and after, we don't really evaluate, but
(29:07):
that is my experience. And scientific research says, well, you know, that's not the best measure for measuring a personality trait.
Rob Briner (29:15):
Yeah. And I think
going back to that idea, the issue is also to say, what's the quality of that evidence? They could say, yes, from my experience, I believe whenever I do this, I see people develop. And then again, I think it's about drilling down. So okay, what have you actually observed? How many times have you observed it? Have you kept track? How do you know, you know, whether you've controlled
(29:37):
for biases in some way? Is it possible you've got it wrong in some way? Exactly the same questions are asked about a body of scientific evidence, because, yeah, sure, people can believe it. But the question is, what is that belief based on? And if what it's based on is quite questionable, and probably not reliable, sure, we can still have that belief, but we need to
separate out 'I just believe this', when it gets to something like
(29:59):
faith, from 'actually I also have other kinds of data and evidence that support that belief'. So we believe things we can't really support; it doesn't mean they're wrong or right, it means we can't support them very well. So, again, I think it's a case of: I have this belief, I like it, but actually, that's all I've got, and, well, that's most of it.
Morten Andersen (30:18):
And in
organisations, I guess there are
some things that we can measure better than others. Selection, for instance, is a little bit easier to measure, because we can see how many of a cohort are still there after 12 months or after 18 months. And I suppose we do have some evidence to suggest that semi-structured interviews are very good, IQ is actually reasonably good, and so on.
(30:40):
So we do have some evidence around selection, for instance.
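
A minimal sketch of the 12-month cohort measure Morten mentions, assuming a hypothetical list of hires with a selection method, hire date and exit date; the records and field names are invented for illustration.

```python
# Hypothetical sketch: 12-month retention by selection method.
from collections import defaultdict
from datetime import date

hires = [
    # (selection_method, hire_date, exit_date or None if still employed)
    ("semi_structured_interview", date(2022, 3, 1), None),
    ("semi_structured_interview", date(2022, 5, 1), date(2022, 11, 15)),
    ("unstructured_interview",    date(2022, 4, 1), date(2023, 1, 10)),
    ("unstructured_interview",    date(2022, 6, 1), None),
]

def retained_12_months(hired, exited, as_of):
    """True if the person was still employed 12 months after being hired."""
    one_year_on = date(hired.year + 1, hired.month, hired.day)
    if one_year_on > as_of:
        raise ValueError("cohort too recent to evaluate at 12 months")
    return exited is None or exited >= one_year_on

as_of = date(2024, 1, 1)
counts = defaultdict(lambda: [0, 0])  # method -> [retained, total]
for method, hired, exited in hires:
    counts[method][1] += 1
    counts[method][0] += retained_12_months(hired, exited, as_of)

for method, (kept, total) in counts.items():
    print(f"{method}: {kept}/{total} retained at 12 months")
```

On real data this would be only one of the four sources: it tells you what happened in your cohorts, not why, which is where the scientific evidence and stakeholder views come back in.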
Rob Briner (30:43):
Sure, yeah, you're
right. I think some areas, certainly in terms of organizational data and scientific evidence, are just slightly more amenable than others, as are some kinds of interventions. And I think that's okay. Because, again, I mean, obviously, in your area, organizational change, there are lots of things about that which are just not very amenable to those sorts of investigations, scientific investigations. But I think it's
(31:04):
a question of acknowledging what we maybe don't know. And that's okay, rather than pretending this is going to work and we know it will. No, we don't really; come on, maybe we've got some clues, some hints. And actually, you might never know, because it's extremely difficult, for this question, this area, to actually get data that directly answers that question. So let's say, okay, you just don't know. Is it worth pretending that, you
(31:28):
know, you've got this great body of scientific evidence, for example, behind it?
Morten Andersen (31:35):
So I think what is really fascinating about this is, one, that you would probably spend more time on the quality of your questioning, and on the assumptions behind your problem questioning. And that will give you a lot of value instead of going straight to solution thinking. And then the multiple sources of data, so you
(31:58):
can challenge your biases, so you can understand, you know, what other ways of intervention might there be.
Rob Briner (32:05):
Yeah. And the value also, I think, as I mentioned, if you think about it as a professional or organizational development intervention, is to keep doing it. I hear practitioners sometimes say, well, I tried this once, we didn't get the answer, so forget it. The point is to keep doing it. And also, probably, the point is not to do it for every decision; just think about big or important ones, and do it and try it for those. Again, you know, I
(32:29):
think I was less in favor of this some years ago. But I'm just much more in favor now of saying, try it out. Do it as best as you can, reflect on it, do it again, do it again, do it again. Probably, you will start to get the data and information you need. But you'll get better at it. And then it'll make a lot more sense. Just doing it as a one-off, like anything, I don't think you can
(32:51):
learn very much.
Morten Andersen (33:02):
So many times,
we also want to make a change in
our personal life. We want to lose weight, we want to reduce our stress level or improve our marriage, I don't know. How can you suggest that we use evidence-based practice in our daily life?
Rob Briner (33:17):
So again, in the
last few years, I'm kind of
coming to the conclusion that what decision makers, of course, do is they take the way they make decisions in their personal life into work, because of course we're human, right? That's what we do. And I always say, well, that's fine, probably, for some kinds of personal decisions, but it's not fine if your decisions as a manager in an organization may have an impact on hundreds or thousands
(33:40):
of people and on an organization and its success. So the way we make decisions individually and personally, yeah, it's not great all the time. But that's okay, because normally the consequences aren't that great. But I think to take it into our personal lives is something we can do. And I think it's part of what I was saying right at the beginning, which is about thinking about what the problem or
(34:00):
opportunity is. So you gave the example of weight. I mean, weight is a classic example of something people worry about, I guess partly because we're bombarded with images of slimness, and so on, and weight is, certainly in our cultural context, associated with all kinds of not-good things on the whole. And of course, sure, yes, there are other kinds of scientific evidence that
(34:23):
suggest that being overweight is not great, sure. But often this manifests itself in maybe anxieties, certain kinds of worries, which then lead to, again, dieting. It's a great example of the kind of quick fix: if I do this thing, I can lose whatever, 10 kilos in 10 weeks, whatever the crazy thing is. And again, I think there are quite some parallels, particularly between the diet industry and the consulting industry,
particularly the diet industry,and the the consulting industry
in that there are a lot of things that some consultancies sell as quick fixes to complicated problems. Of course, inevitably, they don't work. And so you look for the next thing, the next thing, the next thing. I think just as there are fads and fashions in thinking in business and HR, there are the same fads and fashions in diets.
And in the end, for both cases, probably what you need to do to
(35:07):
effectively bring about change pretty much hasn't changed. Maybe some of these new things make some peripheral difference, but it probably hasn't changed very much. So I think start with what the problem is, say around weight. And the other thing that's very noticeable, just because it's an example I pretty often use when I'm teaching and training, is the example of kitchen equipment. So Morten, I don't know about your kitchen. But if
(35:30):
I think about my kitchen, in some of my cupboards there are pieces of equipment, like a juicer, or a steamer, or a bread maker, whatever it is, that made a lot of sense when I bought it: oh, all this juice, this is going to fix some problem. Did I know what the problem was? Not really. But I still bought the solution. This is kind of
(35:51):
solutioneering: you identify a problem by the absence of the solution. The problem in my life is I don't have enough juice; the problem in my life is I don't have enough fresh bread; and whatever it is, it probably isn't really a problem. So if you look around, you know, our houses are an example of personal decision making; we can find this stuff that we bought, and in the end, it wasn't for a great reason. We just thought it was
(36:15):
cool, it would help with something, or everybody else was doing it. So I think there's quite a lot of lessons there. And it's, for me personally, a bit depressing looking at things on my shelves. I'm looking at your bookcase behind you, Morten, seeing it in the video: you know how often we buy books, and they're up there, and are we ever going to read them? And they kind of taunt and tease us and go, haha, never going to. And again, we buy this stuff
to and again, buy this stuff?
because we imagine, yeah, we want to read it or we're going to read it, when often it's pretty unrealistic. So, thinking about how you can use this approach in our private lives, I think, again, the same important thing is: what actually is the problem or issue or opportunity here? Really, you know, spend more time on that. And it's particularly, I
really you know, spend more timeon that and is particularly I
think, now, with the kind of online shopping that was existing anyway, but particularly in lockdown, a number of people have just bought stuff that's complete nonsense. And they'll probably end up having to try and resell it or something else. I think it's part of that sense of: we feel we can buy stuff to help
(37:16):
us, and sure, sometimes it might, or we feel we can adopt a diet plan to help us, and sure, sometimes it might, but what's missing from those is a lot of the action in there.
Morten Andersen (37:25):
Yes.
Rob Briner (37:26):
So I think it kind
of applies to individual things. But it's just as difficult, because just as we are consumers of products and services in our personal life, we're consumers of products and services and ideas in, I think, our professional lives as well. And I think, as I say, there's a huge similarity between those two fields.
Morten Andersen (37:43):
So I think
framing the problem really well, but also looking at the problem from many different angles, in our personal life, is actually a really, really important thing. So for instance, with the weight, as you mentioned, sure, you can see the positive benefits of losing five kilos or something like that. But as you say, there is also a lot of stress related to losing that, anxiety maybe
(38:08):
also; there's a lot of evidence that weight-loss diets do not work. So you will actually embark on something which has a very slim chance of even working. So having all of that knowledge before you formulate the problem and then look at the solutions will actually help you decide, maybe this is not worth it anyway. Like in organizations, where you look at engagement and engagement has fallen a little bit. And
(38:30):
then you look at, well, the consequences are not actually that much. So does it really matter? Should we really make this big intervention to change that? So I think that's a really, really powerful message: spend some more time on the problem, researching it and getting a lot of data to understand the problem, before you look at a solution.
Rob Briner (38:49):
Yeah. And also, I think, sometimes reframing those problems, not in a kind of cheesy positive psychology way, but reframing those problems as opportunities. As in, it can occur that we're not really clear that this is a problem, this team not getting on. But is that it? Can we detect an opportunity to do something we think is actually going to be a benefit? Rather than 'can we do something to avoid a negative',
(39:12):
could we do something to enhance the positive? And why is that potentially important? So thinking about problems and opportunities, maybe simultaneously, it occurs to me, but also thinking about what the best way to frame them is, is also quite important.
Morten Andersen (39:23):
So the idea with evidence-based practice and this approach is that we will make our decisions and our interventions better. And we will do that by having a different approach to how we make decisions: we will use multiple sources to understand the problem and therefore some of the solutions, and we'll use a structured, open approach to making decisions. And
(39:46):
if we do that, we can still rely on our gut feeling and our experience; we just add in scientific research and knowledge from other stakeholders and maybe organizational data. All of that will basically mean that we will make better decisions. And yes, it will take a longer time. And yes, it probably will complicate something. But the benefit of this should be that we will have, you know, better
(40:09):
interventions and better decisions made. And there might be a lot of things that we thought we would jump straight to and make a solution around that we will not do anyway. And that's also a benefit; doing less is actually a big benefit, I think, from this. Yeah, definitely. Fantastic. Rob, I want to say thank you very much for taking the time to speak with me. I think
(40:31):
evidence-based practice is vastly underused, and whether I look at medicine, or psychology, or organizations, or even at home, I think we can just benefit from thinking much smarter about our problems. And it doesn't mean we have to read a lot of academic research; we may have to pick up one or two articles, but it's actually about applying a
(40:52):
more structured approach to seeking more information and being more critical.
Rob Briner (40:58):
That's right. And I
think the key thing to take away
from this as well is that it's not about, as you're saying, it's not about making a perfect decision. It's about making a more informed decision. And even if it's a little more informed, just a little bit more informed, you're still more likely to get the outcome you want, by focusing on getting more, better-quality information to both understand the problem and think about potential interventions. Yeah.
Morten Andersen (41:20):
Fantastic.
Thank you very much for our conversation.
Rob Briner (41:23):
You're very welcome.
Morten, thank you again for asking me.
Morten Andersen (41:26):
Thanks.
What a great interview. I took three things away from my talk with Rob. One, we can make our decisions better by using an evidence-based approach. Making a decision evidence-based means
(41:46):
that we must change our approach to decisions. And we do that in three ways. Firstly, we must make a conscientious, explicit and judicious use of the best available evidence. Conscientious means that we must try hard and make a real effort to gather and use the best evidence. Explicit means that we must describe the
(42:08):
evidence on which we're basing our claims. And judicious use means to critically appraise the quality of the evidence. And then we must use multiple sources so we can triangulate our data. And finally, we must use a structured approach to our decision making. And if we do that, we'll be able to make
(42:29):
better informed decisions. Two, we should spend more time defining our problem. Once we have identified a problem, we often jump straight into solutions. Why? Well, because it feels better and it's more action-orientated. But instead Rob suggests that we spend more time on understanding and perhaps reframing the problem. If we see
(42:50):
a decline in job satisfaction, is that really a problem? And why is that a problem? Or if we want to lose weight, perhaps understand some of the potential costs of trying to lose weight before deciding if those five kilos are worth the effort. And three, it is not about reading a lot of boring research. It's about using several sources. There are four
(43:12):
sources that we should look at to make evidence-based decisions: scientific research, speaking with relevant stakeholders, using our own experience, and looking for organizational data. And only by looking at all sources of data can we make better informed decisions with better outcomes. Rob's work is important. Running faster, improving our
(43:37):
productivity, working better together, only makes sense if we make good decisions in the first place. Otherwise, we're just climbing faster up the wrong ladder standing against the wrong wall. Until next time, take care.