
February 8, 2024 74 mins

In this thought-provoking episode of the innovative business podcast, Battling with Business, former Royal Marines Officer, Gareth Tennant, and Product Manager, Chris Kitchener, stimulate a rich discussion about the critical aspect of measuring success. The episode starts by exploring the triad of effect, performance, and activity, highlighting their integral role in testing business or military assumptions.

The conversation takes an intriguing turn towards scientific principles such as the Heisenberg principle and Schrodinger's cat, as the hosts elucidate their relevance in modern business systems. An analysis based on nation-building in Afghanistan provides listeners with an in-depth look at the challenges faced in gauging the efficacy of projects subject to fluctuating systemic changes.

Tennant and Kitchener draw apt parallels in a business context, illustrating the powerful sway of project governance meetings on team behavior and project reporting. This enlightening episode of Battling with Business provides valuable insights into the complicated process of measuring success, and underscores its significance in strategic decision-making in both military and business landscapes.

Focused on the complex world of measuring performance and effect, the hosts unravel how hidden assumptions shape interpretations of performance. They discuss the critical distinction between measuring internal performance and evaluating its impact on external aspects like customer perception or market reputation. Ultimately, asserting the importance of assessing the right things, this episode encourages listeners to re-evaluate their understanding of measuring success.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Music.

(00:09):
Hello and welcome to this episode of Battling with Business with me,
Gareth Tennant. And me, Chris Kitchener.
As you know, I'm a former Royal Marines officer. And I'm a product manager from the world of business.
And we explore ideas and concepts around teams and teamwork,
leaders and leadership, and all things in between, comparing and contrasting
our experiences as we attempt to work out what makes teams, leaders, and businesses tick.

(00:32):
So Gareth, topic for today. I know this was one that you've been talking about
off the podcast for a while, but I'm looking forward to this one because I think
there's a very military view of this and there's a very business view and we'll
see if they touch in the middle.
We've got a podcast where we compare these things. Yeah, absolutely.
So this is all about how you measure success.

(00:56):
So I want to cover what in the military we call measurement of effect.
That, and for the civilian members of the audience, I'll just say the word KPI and there'll be shivers down people's spines, and we can talk about that. But I think that's such an interesting concept.

(01:17):
We take it for granted that you measure for effect. So I'm going to cheat. We haven't, you know, sometimes we structure this and think about it in advance; this one isn't one of those. Why do you think we want to measure effect? So I'll come on to all the definitions and things that the military have, but effectively we have

(01:37):
two things that we measure, or three things actually: we measure effect, we measure performance and we measure activity. And of course, if you measure performance and activity, you are measuring how well you are doing things, but you're skipping the really important piece, and we'll come on to this, and this will come up many times, I suspect,

(01:58):
over the next hour: you're missing the bit where you're testing the assumption about what you're doing affecting the external environment in the way that you want it to. So if you're going to measure success, you've got to test that assumption. And the reason that in the military we talk about measurement of effect is because effect is a very specific word in the military, and it's all about the desired

(02:22):
effect that you have on the enemy, on the environment, on other actors, on partners, on audiences, on the civilian population, all of these things. The commander, at any given level of command, when they work out how they're going to achieve their

(02:43):
mission, the first thing they do, after they've had their intelligence briefs and assessments of what the situation is, is they work out what effect they want to have, and only then do they start to work back about how they're going to achieve those effects.
But before we get into that, I think it's worth...

(03:04):
Moving away from the military for a moment, to talk about science.
Because we're going to get into definitions about measurement and analysis and assessment.
What do these things mean? I'm also going to come at a very different angle
for measurement of effect as well.
It was interesting as you were sort of talking about, I think,
a traditional military view.
But we'll come back to that. And actually, just before we go any further, I have to apologize.

(03:29):
We do have, I was going to say, a silent guest star today. But he's not that silent. In fact, those of you who've listened to Battling with Business before, there's one particular episode, a recent episode, where you can hear him sneezing in the background. And by him I mean Ted, which is Gareth's dog. So Ted is sitting next to me and I can hear him heavy breathing, so if you

(03:53):
listeners can hear heavy breathing on the podcast, please don't get the wrong idea about either myself or Gareth.
That is Ted the dog sitting next to us. He's actually quite a regular guest on this podcast and avid listeners will recognise his heavy breathing, shaking up the towel and barking.
Anyway, let's get back.
So you wanted to bring science to this podcast. I want to take us to a whole

(04:18):
new level of intellectual conversation.
So are you aware of the Heisenberg Principle? I am familiar.
And also not familiar with the Heisenberg. Do you see what I did there?
Luckily, a chap called Schrodinger created a very good thought experiment.
So everyone will be aware of Schrodinger's cat. The thought experiment that

(04:40):
tries to explain the absurdity of the quantum world.
If you try to think about it in terms of the macro normal world of standard
Newtonian and Einsteinian physics.
And in the quantum world, and I suspect in a future episode on future technology,

(05:02):
we'll talk about quantum computing and we'll get into superposition and entanglement
and all of these great things.
But effectively, there are these really strange phenomena.
As you think about subatomic particles, if you think about them as particles, you expect them to behave as particles do in the macro world. So think about pool balls

(05:24):
on a pool table: you would expect, as they hit each other, you know, conservation of momentum and all of those kinds of things, they will bounce off the cushions. At the subatomic level that doesn't always happen, and there's a very famous experiment called the double-slit experiment that kind of shows that subatomic particles behave as both waves and as particles,

(05:45):
and that has opened up the world of quantum physics. Now, I'm very much out of my comfort zone here, but I do want to highlight the paradox of Schrodinger's cat, and this idea that the Heisenberg principle, which is, at the quantum level, you can't measure the position of a particle and at the same time know its velocity,

(06:12):
so its speed and direction.
You can either know its velocity or its position, but by measuring one,
you are changing the state of that particle. And so you can never know both.
That's a really difficult thing to get your head around. The idea of Schrodinger's
cat being dead and alive is absurd.
It's a thought experiment to have a cat that is both dead and alive at the same

(06:36):
time. It's absolutely absurd.
The whole point in that experiment is to show that there is this paradox.
Let's go a little bit more on Schrodinger's cat. The point being,
there is a random chance.
I'm short-cutting this. There is a random chance there is a cat in a box.
There is some poison, which is either released or not released,

(06:59):
and there is a random quantum chance that this will be released or not released,
and the point being made is...
Before you open the box, though... The cat is neither dead nor alive.
The cat is both dead and alive, because this isn't random chance,
this isn't flipping a coin probability, this is superposition.

(07:21):
So at the point before you've observed it, the particle is both in one position and another. And this is where the idea of the multiverse comes in, so actually, in quantum theory, it could be in infinite positions, then there are infinite cats. But the point here is, once you open the box, by observing the conditions you change the state of the particle,

(07:48):
and therefore it picks where it is.
And this is how, in a quantum computer, a quantum bit, a qubit,
has multiple positions.
Unlike a binary bit in a normal computer that is either on or off,
it is on, off, or both until you observe its status,
at which point it goes from both to on or off, which gives you another level.

(08:13):
So it's trinary rather than binary.
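For anyone who wants the notation behind what the hosts are describing, here is a minimal sketch in standard textbook form; none of these symbols come from the episode itself:

```latex
% Minimal sketch of a qubit in superposition (standard notation, not from the episode)
\[
  \lvert \psi \rangle \;=\; \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
  \qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1
\]
% Observing it forces the superposition to resolve:
\[
  P(0) = \lvert \alpha \rvert^{2}, \qquad P(1) = \lvert \beta \rvert^{2},
\]
% after which the state has collapsed to whichever outcome was observed --
% the act of measuring changes the state, which is the point being made above.
```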
We're getting kind of into the level of detail, but just to back it up.
I was about to say, so your point about measurement of effect is… So the Heisenberg
principle is you can't measure and observe things without changing them.
Now, if we're thinking about things in the macro world, it's very easy to fall
into the trap of thinking, oh, well, this is because big, clumsy instruments affect the system.

(08:36):
But in reality, it's about this quantum superposition.
And Roger Penrose, the physicist, and this is really stretching my understanding
of this, but he talks about a situation where the observation of something not
happening also affects the system.
So you're not measuring it. By not measuring it, you are affecting it.

(08:58):
So imagine you have a particle, and that particle can go in one direction or
another, and you put a particle detector out.
And then you run the experiment and the particle detector does not detect the particle.
So the particle has not gone past that detector, but by knowing it hasn't gone
one direction, you know that it's gone the other way, you have now affected the system.

(09:23):
Not affected your knowledge of the system, you have affected the system at that quantum level.
You have forced it out of superposition by not measuring it,
which really twists my brain.
The point of all of this, bringing it right back to the military... I think the point is we're not quite ready for Battling with Science yet. We'll stick with Battling with Business and move to science at a later date.

(09:46):
But that Heisenberg principle can be applied to measuring effects in complex
systems. So bring it back to the military.
If we take Afghanistan as a good example of a complex system where one of the
strategic aims was to build enough security, enough rule of law to build governance,

(10:08):
to build a functioning democratic government system,
to allow Afghanistan to rule its own security, to rule its own governance,
to rule its own democratic processes such that the coalition didn't need to be there.
There were loads and loads of things happening to try and make that the case.

(10:31):
So lots of education programs, lots of building of schools, building wells,
building of things that would generate local commerce, lots of training of the judiciary,
training of the military, training of the police, all of these things,
as well as all of the traditional security operations trying to counter the

(10:51):
Taliban, who are obviously trying to undermine the government and prevent these things happening.
So how do you measure the effectiveness of, let's say, a local project to allow
local leaders to apply governance to a regional area?
You have to be on the ground doing it, but as the coalition being on the ground

(11:15):
doing that, you are creating a conditional change.
You are systemically changing the operating environment.
And so you can't measure directly how much security creating governance indirectly creates, because by going there, you are creating instability, because you are the enemy.

(11:37):
So I think the example you've given, there's lots of people that can say,
well, luckily, I'm not doing any nation building, and I don't have an insurgency.
And so this is a fabulous theoretical discussion, but it's not relevant to me.
And I don't think that's the case.
And maybe we'll go into a bit more detail on this later.
But I think there is an equivalent that says you are running a project and you

(12:01):
say, we are going to have a project governance meeting.
And in that meeting, we are going to review and look at the project.
I think it's the same impact: by having that project governance meeting, that has an impact on the behavior of that team, and potentially what they report to you and how they report to you. So we had a podcast

(12:24):
which I really enjoyed doing. It was after a particularly difficult day; I'd screwed up, I'd missed something, and I realized, oh, there's a podcast in this, which is: how do you know whether your project is not going well?
Yeah. And one of the items I seem to remember was everyone tells you it's going really well.
In other words, you go to the governance meeting and the purpose of the governance

(12:45):
meeting is to successfully complete the governance meeting.
And therefore, by doing that, you've impacted the reality and they're changing it.
Yeah. So I think this is a really interesting point, but I want to pull you a different way, because I think you've trotted past a really important point.

(13:06):
So you had said the military wants to measure effect.
Yeah. And we'll talk about businesses measuring effect.
But I want to ask you, why does the military want to measure effect?
And I'm not looking for the sort of the first answer, which is,
you know, have the enemy left the area.

(13:28):
At a higher level, why do we care about measuring effect? Yeah, so I'm glad you've asked me, because this allows me to expand this out to start talking about the process that we go through. So we're talking about strategy, we're talking about complex systems, and we're talking about conditions where

(13:50):
there are, I'm going to say it, I'm going to say connecting frameworks, you know, conditions where the external environment affects what's happening. There are lots of things that we can't predict, lots of things that we don't fully understand the cause and effect

(14:11):
relationship to. And so the role of a commander, of a leader, is to navigate through that complexity to achieve strategic outcomes. Yep. If you don't have feedback, then how are you as a leader going to navigate through that? Because you come up with your plan at the

(14:32):
very beginning of this process, because you're given a mission or you come up with an objective you want to achieve, and then you create a whole load of assumptions about the operating environment. Those will be assumptions about stuff we just don't know, so we're going to make a best guess about what it could be,

(14:52):
as well as assumptions about cause and effect.
So my assumption is if I destroy this command and control node,
that will have a direct effect on the ability of the enemy to cohere their forces
and therefore it will dislocate them and create the conditions where I can then

(15:14):
have a decisive attack here and create winning conditions,
whatever it is. These are assumptions.
And of course, what could happen is you could destroy that command and control
node and an even more competent commander might step up, take control, and you might find that your assumption about the enemy's capability was incorrect. Yep, yep.

(15:37):
You might actually find that that command and control node was a deception and
it's not a command and control node.
That was a feint. It was emanating loads of signals, but actually they're somewhere else.
And you've made this assumption that you've had this effect and you haven't.
So you need to have feedback. And we've talked about the importance of feedback before.
Can I try and do something? Because this is a really good example of a real

(16:02):
life thing that happens to me every day.
So we work with teams. I have multiple teams and I have a strategic goal.
Yeah. And there's a bunch of blah, blah, blah.
Here are three real-world examples where people get nervous and misinterpret what I am trying to measure, and I think this relates to this measurement of effect.

(16:23):
The first one is, I say to my team, I want to measure your development velocity. We typically measure something called story points; story points represent pieces of work. So here's a piece of work, it's five story points. And so one of the things I want to do, and I regularly would do with a team, is to

(16:45):
say, I want to understand your velocity.
I want to understand how fast you're going. And that makes people very, very, very nervous.
Why do you want to know how fast we're going? Does this imply that you'll be
angry if you don't think we're going fast?
But let's start with that one. I want to understand the team's velocity.
The next one is I want my teams to tell me when there is a problem or when

(17:11):
they think there might be a problem.
And people get very nervous about that because I don't want to tell my boss
something's going wrong.
And the third example is I want to use a specific set of tools to understand
how my users use my software.
And people say, why would you do that? Now, those are three real world examples.

(17:34):
And I wanted to come back because I think, for me at least, and I think there's a good overlap, there are three reasons, or three simple categories of reasons, in my case.
I think there might be more why I want answers to those three questions.
So why do I want to understand my team's velocity? It is not to blame them or congratulate them.
It is because I want to predict. If I can measure, that allows me some form of prediction.

(18:00):
And prediction is highly valuable in my world. Yeah.
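As a rough illustration of that measure-to-predict point, here is a minimal sketch in Python; the sprint velocities and backlog size are invented for the example, not figures from the episode:

```python
# Minimal sketch: using measured velocity (story points per sprint) to predict,
# not to judge the team. All numbers are invented for illustration.

recent_velocities = [21, 18, 24, 20]   # story points completed in the last few sprints
backlog_points = 130                   # estimated story points still to deliver

avg_velocity = sum(recent_velocities) / len(recent_velocities)
worst_velocity = min(recent_velocities)

expected_sprints = backlog_points / avg_velocity
worst_case_sprints = backlog_points / worst_velocity

print(f"Average velocity: {avg_velocity:.1f} points/sprint")
print(f"Expected finish in ~{expected_sprints:.1f} sprints "
      f"(worst case ~{worst_case_sprints:.1f})")
```

The measurement only earns its keep through the prediction it enables, which is the distinction being drawn here.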
So, and, and, and what's interesting about
this is there's usually a misinterpretation about why I want to know.
Yeah. So the first reason I want to measure is because I want to predict.
The second reason I want to measure, so when my teams tell me something is going
wrong, is not because I want to berate them, not because I want to do something

(18:22):
else. It's because I am here to help.
My job is to help the team be successful.
So I am measuring because I have an ability to support the team.
And if I measure, I can tell whether there is value and a necessity in me helping the team.
And the third one, and I used a very specific example about using tools to see

(18:44):
how my customers use the product, is because I want to know if I need to do something different.
And I found that a really, I don't think that's an exhaustive list of why I want to measure things.
But I think we spend so much time, and in the business world,
we're eternally talking about KPIs.
People almost forget why are we measuring in the first place.

(19:07):
I think prediction, the ability to help, and the ability to change,
these are certainly three that I found valuable.
And I think they overlap a bit with what you were just saying.
We've hit a particular center.
We want to measure the effect because we want to predict.
Does that allow us to move forward? Yeah, absolutely.

(19:28):
So I'm wondering. I mean, we haven't discussed this before, those three ideas.
Is prediction, help, and a decision point for change... do you see other reasons, at that sort of category level, why you might want to measure? I do, and so, the nervousness about asking the questions is a really interesting

(19:50):
point. I'm going to park it for now. Yeah. I want to get through how the process works before we start looking at why it's difficult and sometimes where it goes wrong.
The point here is this is not just a military sort of strategic grand army thing.
Measurement of effect is at this, you know, your example at the beginning was at the quantum level.

(20:14):
It's at the most tactical day-to-day level as well as the highest strategic
level. So what were your three things? Prediction. Prediction. Help.
Decision point. Help. Yeah, okay. So I think those are three very, very valid reasons.
And that's very much what we're trying to do. So as a leader, and we're all leaders, we've talked about that before, you are having to

(20:37):
guide other people and get them to do things, hopefully for the achievement of the goals of the team. So understanding what those people are doing, and how well they're doing them, allows you to offer more assistance. It might be more money or more time, more people, or to take the

(20:58):
workload off them if they're, you know... So understanding how they're doing is really important.
Prediction about the future is that feedback loop that helps us navigate strategic complex challenges.
And the third one about change decisions, ultimately, all information that we
use to help support decisions, we've talked about that being intelligence.

(21:19):
That can be intelligence about internal process and how your team is doing,
or it can be external information about how the environment is changing as a
result of your activity or changing as a result of others.
There is a crucial difference here, though, between measuring performance and measurement of effect.

(21:42):
And the difference, and this is where this sort of more formal military process is quite important.
The difference is, one, you are measuring what you've asked people to do or
what you are expecting your capability to achieve.
The second thing, you are measuring how that is affecting the operating environment.

(22:03):
So if we think about a military campaign,
there are geostrategic reasons why countries go to war or end up deploying military
forces into a combat zone.
Measuring the effectiveness of the overall military operation is the role of

(22:26):
political scientists and historians eventually.
But it's a geostrategic assessment and we're sort of slightly outside of that.
We're at the level below that where we are trying to measure whether we can
achieve our mission as a military force in a combat zone or in an operational

(22:46):
theatre to deliver the decisive conditions that allow us to not be there.
Because ultimately, a situation where military forces aren't deployed is more
favourable to one where they are.
In Afghanistan, the coalition coming home, but with Afghanistan having a stable
democratic government, was the decisive condition that we were trying to achieve.

(23:11):
Now, ultimately, that failed. And I think a big part of that is the disconnect
between measuring performance and measuring effect. And we'll come back to the Afghanistan example.
Let's talk more about performance and effect. And it was interesting, as you first said, performance is measuring performance, as it were, what your team are

(23:31):
doing, you're doing what you're doing, and effect is the outcome. I don't know that I'm willing to accept that. Yeah, I don't know that I buy that, because I think there's a really blurry line, as in, what my team are doing you could easily argue is velocity, you know, how fast are they going? Is the team

(23:52):
going one mile an hour, two miles an hour, five miles an hour? Yeah. So through one lens that is absolutely performance, and in fact that's why people get very nervous when you say, how fast are you going? They're going, are you questioning my performance? Yeah. But actually it is also an effect. It is an effect because, of the work we are doing, the effect

(24:14):
is that we are going this fast. And so this idea that effect is only at the sort of the strategic level or at the high level, I'm not sure I buy that. You're misinterpreting what I think, so perhaps I'm not being clear.
This has nothing to do with being at the strategic level or the operational level.
This is about impact on the external operating environment. So how fast you're going is performance.

(24:37):
What effect that has on the customer is the effect.
What effect that has on the reputation and the brand of the company is an effect.
So why are you going... That I'll buy. Right, let's be really clear what we mean. So performance is,

(24:57):
by definition, a task or operation seen in terms of how successfully it is conducted. So if you are going to make a widget, you're a manufacturing organization, you're going to make a widget, performance might be how fast can you make that widget, how cheap can you make that widget, how

(25:18):
precise are, you know, the measurements on that widget?
How long will it last for before it breaks?
These are measurable things that tell you the performance of your manufacturing process.
Effect is a change which is a result or consequence of action or other cause.

(25:39):
If you've made a good widget or a bad widget, how does that affect the customer?
Now, as a company producing widgets, you want your customer to be satisfied
and you want them to come back.
Or you want them to recommend it to a friend. You want them to come to you over
somebody else because your widgets are better quality and cheaper.

(26:01):
So your measurement of how fast you make them is a useful metric.
But it's not a measure of effect.
How many times a customer recommends you to somebody else would be a measurement of effect, if you have increased your quality in order to achieve that. I think, going back to our science bit, they're both a particle and a wave, because I think these

(26:22):
things are, your definitions, you know, irrefutable, but I think these two things are directly connected. Performance, of course, is connected to effect. Yeah, so by definition increased performance has a positive effect? It can do, because you

(26:43):
can also increase performance and have no tangible... Well, in which case you are potentially wasting money. Well, and you can have negative effects that were unintended consequences. So the point you've made about them being intrinsically connected is absolutely true, and the first thing we need to do is measure our performance. The second thing we need to

(27:05):
do is then test the assumptions about that performance achieving the desired effect. Well, this comes back... So in my world everyone gets very, very, very excited about KPIs, yeah, and scorecards. Oh my god, it's the sexiest thing in the world, we need to be more data-driven, all of this stuff. And I'll talk some stories about this, but first statement,

(27:28):
which actually directly touches on what you've just said, is you have to pick the right things to measure. Because, as you rightly said,
a measure of performance, going faster, may be a good thing that directly impacts how much you sell, or it might just be a thing and it has no impact on whether

(27:49):
you sell more or not. It might stroke some ego.
That, for me, I know this is, and I'll let you go back to the definitions,
but it's easy to talk about measurement as a good thing.
Actually, I'll quickly leap ahead and say measurement of the right things is a good thing. Yeah. Measurement of the wrong things can

(28:10):
be a catastrophic thing. Yes. And what often causes some confusion, even in the military, when we talk about these things, is that the definitions I've just given for performance and effect could be applied internally. But as I said at the very beginning of this podcast, the military has very specific definitions of effect, and it's all about affecting the external

(28:33):
environment. So what we're not measuring is how effective is the marketing team at communicating with the finance team, because that's an internal dynamic. So although there is an effect, that's not what we're trying to measure when we're talking about effect; it's how are we affecting the operating environment in the pursuit of our strategic goals.

(28:56):
So you can have the most effective communication internally, but if it's not driving you towards your operating goals... Great, I agree. But it's interesting, because, and this may be one where you're going to be very unsatisfied when we just stop doing this, but you measure something to declare effect. And so

(29:19):
that's really, really important. In other words, it comes back to the thing that the right performance is you measuring effect, if that makes sense. This is going to sound crazy: you can't measure effect, you can only measure performance. How many people died this year, this month, in this region?

(29:39):
A smaller number, half the amount of last month: we are having a positive effect
on the safety of the people in this area, if that makes sense.
So I question that slightly in that you can directly measure effect if the effect
is an objective and directly attributable physical effect.

(30:03):
You can measure... Number of attacks, number of widgets you've built, sales figures.
Well, widgets you've built: performance, because that's internal.
Sales figures would be a measure of effect.
Because it's external: you have sold to a third party. If we

(30:24):
take it to a military example: number of hours flown is internal; number of bombs dropped, performance, because it's still internal; number of bombs that have hit their target, we're now starting to talk about the effect. Now, we haven't quite got there yet, though, because a bomb can hit its target and

(30:46):
not detonate, or not have the effect. So it can detonate, it can destroy things, but the effect the commander wanted to achieve might have been to kill an individual leader, and he might not have been there. So, but I've got to say, let's go back, because I actually really like this. The widget

(31:07):
example, which I gave incorrectly actually, is a really good example. At one level it doesn't matter how many widgets I've made, it matters how many widgets I've sold.
Now, to sell that many widgets, I have to have made that many, but that's different.
The reason why I say that is because there's a thing in my world,
and I don't know if this is the correct definition, but vanity metrics.

(31:29):
We get very excited about measuring things. If you've got a good dashboard,
people think Kitchener's on top of his business because he's got a good dashboard.
But I think the widget example is a good example of, if you know that your widget production always outstrips sales, you don't need to measure your widget production.

(31:51):
That is not the most interesting thing to measure. You measure your sales.
If you think your constraining factor in your sales is your widget production,
you should measure your widget production.
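To make the widget example concrete, here is a small hypothetical sketch of which numbers sit on which side of the performance/effect line; the metric names and values are invented, not taken from the discussion:

```python
# Hypothetical widget business: performance vs effect.
# Performance = how well we are doing the activity (internal).
# Effect      = what that activity is doing to the outside world (customers, market).

measures_of_performance = {
    "widgets_made_per_week": 5200,
    "unit_cost_gbp": 3.40,
    "defect_rate_pct": 0.8,
}

measures_of_effect = {
    "units_sold_per_week": 4100,        # external: a third party chose to buy
    "repeat_purchase_rate_pct": 37.0,   # external: customers came back
    "recommendation_rate_pct": 12.0,    # external: customers told someone else
}

# The assumption to test is the link between the two dictionaries,
# e.g. "cutting unit cost and defects will lift sales and repeat purchases".
# Measuring only the first one never tests that assumption.
```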
And there's an implication that goes there as well. And I've done this,
I've derailed because I know you had a really good way you were going to talk about all of this.
It is really important. The first thing that we said was, it is important to

(32:15):
measure the right things to understand effect.
I'll go one step further. It is important to measure the smallest number of valuable things.
And this is counterintuitive because people get so excited about gathering data.
Perfect example, I used to, and I apologize, I may have told this story before.

(32:37):
When I used to work at Adobe, I would talk to the Acrobat team and I once spoke
to the Acrobat product manager and he got very excited and said,
we instrument the whole of Acrobat.
So we know exactly how users are using our product and that will help us make better decisions.
And he said, look, look, every day I get four gigabytes of data.

(33:01):
So he gets all this huge amount of measurement.
And I said, great. What does the data tell you? and he looked at me and said,
I have no idea. It's four gigabytes of data.
How am I supposed to look at that? So this idea of, and that's a ridiculous
example, but I've seen dashboards with 30 things.
Okay, I'm not going to look at 30 things every day.
Show me the three things that really, really matter. And in fact,

(33:24):
fewer things, there's always the fear that we're not measuring something.
And if we don't measure it, it will go terribly wrong.
But I think that's where, if you understand the logic behind this concept, you end up focusing very much on the right things, and you measure the right things. And of course we've already highlighted there

(33:44):
is a need to measure performance in order to get to the measurement of effect. But we've talked a lot on this podcast in the past about Auftragstaktik, mission command, you know, these ideas about don't tell people what to do, tell them what to achieve. Well, if we take that as a really important principle, and it is in the military very much a really important

(34:06):
principle of how we execute operations, you tell people what their mission is. So, you know, I want you to go and do this in order to achieve something else. That 'in order to achieve' is the effect, and what you're allowing people, by having mission command, is if the situation changes such that 'go and do this'

(34:29):
is no longer relevant, because the context of the operating environment has changed, the enemy have overrun that position or whatever, but they still understand the 'in order to', they understand how what they were going to do is going to support the wider mission, the context of their bit of the plan, they can then start to make local judgments about how they can continue

(34:53):
to support that wider mission. And there are loads of really good examples of people taking the initiative because they understood the wider context and how what the enemy have just done is going to undermine the wider mission.
It's really important in this that the mission itself, what you've been asked to do, that is the performance.
Go and assault that position, that's what you've been asked to do.

(35:17):
That is activity, that is an action.
So you are measuring performance: did the platoon arrive at the line of departure on time, did they have all their kit? That's performance. Did they have all their weapons, did they assault at H-hour, did they carry out the drills correctly, did they overrun the position? That's all performance. The effect,

(35:39):
the 'in order to': you are to assault that position in order to create a feint to draw the enemy's attention so that this larger attack can get into place. That's what you need to then measure. Did that assault, because we've now just measured the performance and said yes, that all happened and they did really well, did that have the effect of diverting the attention away from

(36:04):
this, the wider operation? This leads me to one of, hopefully for me today at least, the most useful statements to make for people in business, and I think this relates to the military as well, but business very much, which is, and I'm going to say this and everyone's going to go, well of course, this is really obvious, why

(36:25):
are you saying this, but I've seen it so many times go so wrong: I've seen businesses that are focused on activity and not progress. Yes. And there's a really good example, a military example of this, in a really tangible way.
So measure of performance is a measure of the mission. Are we doing things right?

(36:46):
So are you doing things the way you were taught?
Are you using the equipment properly? Measure of effect is the measure of the intent: are we doing the right things? And here's a really good example. In about 2005 in Iraq, the Americans realized that improvised

(37:08):
explosive devices were having a catastrophic effect on the coalition's overall mission; it was becoming a strategic problem. So the US Air Force were flying an aircraft called the U-2 Dragon Lady, and our listeners are probably aware of the U-2, the big spy plane that flies very high, the one that Gary Powers got shot down in,

(37:30):
the same one James May flew in an excellent TV programme where he started crying. It's very good, not the crying bit, but... It's a phenomenal aircraft and that's why it's still in service. I think it's about to be retired, but, you know, it's been around for 50 years or so. So, spy aircraft, there for surveillance and reconnaissance.
The US Air Force decided to dedicate the use of this aircraft to counter improvised

(37:55):
explosive devices in Iraq.
And it has all sorts of sensors on it, and it can identify, using its cameras and stuff, potential devices in the road or, you know, outside of the road or whatever.
It can spot a Timex from three miles up.
That sort of thing, yeah. So very, very useful, hopefully, in helping mitigate

(38:17):
the effects of these improvised explosive devices.
They ran this operation for something like five or six years.
And it was only when the Pentagon were going to cut the U-2 and finally retire it out of service.
And the US Air Force said, we can't do that because we really need this capability.
We need the funding to keep it going.

(38:39):
We need lots of evidence to justify how useful this aircraft is.
It did this operation in Iraq. It was a strategic thing. It must have saved thousands of lives.
Let's go and do a study into how many lives it saved so we can present that
to the Pentagon. That'll be a really convincing argument to say,
let's keep the aircraft.
So they got a US Air Force colonel to go and look at this stuff.

(39:01):
How many lives do you think the U-2, over five years of doing this, saved?
I don't know. Well, the fact that I'm asking is either going to be a really
big number or a really little number.
Even before you tell me the answer, what's
really interesting to me is something has
already failed and going back

(39:25):
to the moment you observe something: the fact that they don't know the impact or the effect of the U-2 is bad; the fact that someone was asked to go and write a report is bad, because the implication is that report had better show that the U-2 does a good job. So there's two bad things happening. Well, there's potential that it

(39:47):
didn't. And the reality is, this report came back and the colonel said,
I'm not sure we can actually show it saved any lives at all.
In fact, all the evidence I found has shown that we wasted a lot of time.
Now, that's not to say it didn't, but in the investigation, they couldn't find any.
And lots of evidence to suggest that they'd wasted a really,

(40:10):
really important capability and they'd wasted the potential to save lives.
Now, this story is tragic but really, really important. So from a measurement of performance perspective, they flew 97 per cent, or whatever it was, of the missions, which for a U-2, being a really complex aircraft to fly, is really good. So well done to all

(40:33):
the maintenance teams, well done to the Air Force for doing that. The sensors operated, the sensors that are really, really complex and technical, you know, 98% of the time the sensors were operational and were working. So brilliant.
They got to the right areas to take the right images or whatever it was.
So the tasking was really, really good. And the information required to do all

(40:56):
the mission planning, perfect, brilliant.
The images were coming back and then going to the image analysts,
wherever they are, to do the processing and the analysis and the interpretation.
And every so often, they were finding potential devices.
They were highlighting things and saying this looks like
it could or is probably a roadside bomb or

(41:18):
whatever. So that was good, and you're not expecting to find one every time, but all the performance metrics were green. This is all good, by the way. No, I mean, and I'm using a very specific... Yeah, our business language: all the performance metrics are green. We use the same traffic light system. Yeah, absolutely. So everyone's patting themselves on the back and everyone's

(41:40):
assuming we're saving lots of lives. The analysis takes place, the intelligence products are created, the mission reports, whatever those are, going back into Iraq and getting filtered down that intelligence system to the battalions, the battle groups that need them, that need to know where the bombs are. What they're getting are reports that say, in

(42:03):
this area there is a roadside bomb. It all sounds really good,
except they're getting them four days on average after they needed that information.
So a roadside bomb in Iraq, an IED in Iraq, was in the ground for less than a day, a few hours.
And what the insurgents tended to do was either detonate the device or recover it and reuse it.

(42:30):
Or it was found by the coalition and removed. So a piece of intelligence coming four days later to say, we think we've got high confidence in a device in this area, isn't that useful. And most of the time what was happening was the battalion intelligence officer was taking this information and saying, yeah, we know, because we found that device yesterday, because [expletive] drove over it, or, that's really

(42:53):
useless because we're not patrolling there, we patrolled there yesterday, we're over here now. The problem was there was no feedback, so there was no way for everybody in this very convoluted system of systems to say, this isn't working for me, I need this information quicker, and for that to go

(43:15):
all the way through back to the mission planning teams in the Air Force to either speed up or say, we can't do that, and therefore we're not going to use this capability, we'll find another way of doing it.
People could be listening to this podcast and say, well, actually,
I'm very good. I understand the difference between performance and effect.
And I, you know, Chris, that's great for you to mention that we should record

(43:40):
or measure the smallest number of things; all of these things are good.
The bit that I think then becomes interesting for people. And by the way,
I don't think as many people master this as it sounds, but why does it go wrong?
And this is, I see this all the time. And it's this, again, it goes back to this: it's all green. It's all green until it's not

(44:00):
green. Yeah, and you started by saying about the feedback. Yeah, and so I think that the feedback loop is a really good example, that you do need to do that. But I want to go one level deeper again, that says the answer was there wasn't feedback. So write that down, everyone: you should regularly check that

(44:23):
the performance is connected to the effect.
But why is it, though, that we still make these mistakes?
Yeah. That's the answer. But here's the magic for me.
How can I go into my business today, look at the performance metrics,

(44:43):
and say, wait a minute, stop.
These are not the right performance metrics I need to change. How do we do that?
So the process you need to go through, it comes back to this idea of understanding
what you do, whether you sweep the floors, whether you're in HR.

(45:04):
Whether you're in the C-suite, it doesn't really matter.
If everybody in the organization knows what they do and how that contributes
to what the organization does, it's really a much easier job for the leaders,
the commanders, for the senior managers to then
talk about why their performance is

(45:25):
going to have, or why we assume it is going to have, a positive effect. Now, when
somebody makes a decision about what we're going to do, if they do it properly, they do it in the way the military does it, because they decide what effect they want to have first, and then they think about how they're going to achieve it. If you fall into the trap of

(45:46):
assuming that you've seen this problem before, so we're going to do what we did last time, you haven't explicitly said, we're going to do this in order to achieve something, we're going to have this effect, or this is the effect we want to have. Now, of course, whether you do that explicitly, or it's something you've done before, as long as people know what you're trying to achieve, you can now start to think

(46:10):
about how you're going to measure whether you're achieving it or not. Crucially, though, you need to know what your metric is going to be, and you need to know how you're going to measure that, and you need to know what success looks like.
And it might be a subjective thing.
So if we take the example of.
I think on the podcast, I sort of talked about this in the past,

(46:31):
and I used the example of losing weight, and I asked you if I'd lost weight,
and you didn't know, because you don't know what I weighed. You don't know what I weighed last week.
And part of the problem is, we talk about KPIs, and we talk about these things
after the fact, because people want the good news. Tell me how well I'm doing.
They haven't had the conversation back when they did their strategic review.

(46:52):
Back when they did their quarterly planning, to say: this is what we're going to measure, so in three months' time we want to see these results. Well, you're talking explicitly about leading and lagging indicators. Yeah. And so, I mean, I think you've already described it pretty well: a lagging indicator tells you how you have done, and there is no way to

(47:14):
change that number. Absolutely. No, actually, better: there is no way to change the effect; that is the effect. Yes. Leading indicators are indicators that tell you how you are doing on your progress to the desired effect. Yeah. And that's another classic thing, which is, how do I... you know, monthly logins, or monthly active

(47:37):
users, a classic one in the software world: how many people logged in this month? Congratulations, that is a lagging indicator. Yeah, that's not going to tell you how to change anything; that's going to give you a lagging history of what has happened. Yeah.
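A small illustration of that lagging/leading distinction, with invented metric names and numbers rather than anything from the episode:

```python
# Hypothetical product metrics: lagging vs leading indicators.
# Lagging: reports an effect that has already happened; you cannot change it now.
# Leading: something you can still act on that (you assume) drives the effect.

lagging_indicators = {
    "monthly_active_users": 48_200,       # how many people logged in last month
    "revenue_last_quarter_gbp": 1_250_000,
}

leading_indicators = {
    "new_signups_this_week": 310,         # assumption: feeds future active users
    "onboarding_completion_pct": 62.0,    # assumption: completing onboarding drives retention
    "open_support_tickets": 41,           # assumption: unresolved pain erodes renewals
}

# Leading indicators are only useful if the causal assumption behind each one
# is made explicit and then checked against the lagging results later.
```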
And also, what you're not doing there is attributing any kind of causal effect. What we're trying to do by doing

(47:59):
this as a leading indicator is say, we are going to have this effect: we're going to sell more widgets. Right, how are we going to sell more widgets? Well, we're going to increase the quality and decrease the cost. Well, how are we going to do that? Right, the engineers have gone away and they've come back and said, right, we can use this material that's cheaper, whatever. That work they've done is going to increase the performance. So can

(48:24):
you measure that you have produced widgets cheaper? Yes, you can. Are they higher quality? You can measure this, yes, you can. Has that had the effect of selling more units cheaper?
We can measure that. What we also need to do is say, we decided we were going
to do this and this was where success looked like, and we were going to measure

(48:45):
it using this way of measuring.
Now, if you're selling units, that's a relatively easy thing to measure.
If it's increasing confidence in a brand, that's a harder thing to measure.
And in a military capacity, if it's dislocating the enemy, and dislocate is
a very explicit military effect,
very difficult to measure dislocate by the

(49:07):
number of bombs you've dropped. So dislocate, as a military effect, is very difficult to actually measure directly, in the same way that, in the commercial world, measuring increased confidence or positive belief in a brand, you can't measure it directly. That's okay, as long as you recognize that at the

(49:27):
point before you make the change, you say, this is the assumption we're making. So in a military sense, we're going to dislocate, we're going to destroy that enemy command and control location. So let's say we know where it is, we're going to destroy it, we're going to drop a bomb on it. Now, we've decided we're going to destroy; that's a supporting effect: destroy in order to dislocate.

(49:51):
How are we going to destroy it? Well, we can assault the position.
We could drop a bomb on it. We could fire a cruise missile.
Lots of options. We could do some sort of electronic attack.
Let's weigh up the pros and cons. Let's work out what we're going to do.
Why are we trying to do it? Well, we're trying to dislocate the enemy.
What does dislocate mean? Well, it means they don't have the ability to cohere their forces.

(50:14):
Communication is difficult, et cetera, et cetera. How are we going to measure their dislocation?
And that's where you go to your intelligence staff and you say, what are the signatures that show their coherence now, that show their ability to coordinate? Well, maybe we can observe how they're moving around the battle space in formation, following sort of

(50:35):
doctrinal patterns, their logistics is tied up with their maneuver; maybe we could continue to watch that, and watch that break down, and that would be a proxy for dislocation. Or maybe we could listen to the amount of their communications and see if that massively drops off or massively increases. These are proxy measurements,

(50:56):
but that's okay, as long as you are using those proxy measurements and you're explicitly doing it, and you know what the start state is, what 'now' looks like, so we can measure the change.
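Here is a minimal sketch of that proxy-measurement step, baselining the start state and then measuring the change; the signal names and values are invented purely for illustration:

```python
# Hypothetical proxies for "dislocation": the effect can't be observed directly,
# so baseline a few observable signatures before acting and compare afterwards.
# All values are invented for illustration.

baseline = {"radio_traffic_per_day": 420, "coordinated_moves_per_week": 9}
after_action = {"radio_traffic_per_day": 150, "coordinated_moves_per_week": 3}

def pct_change(before: float, after: float) -> float:
    """Relative change from the start state, as a percentage."""
    return 100.0 * (after - before) / before

for signal in baseline:
    change = pct_change(baseline[signal], after_action[signal])
    print(f"{signal}: {change:+.0f}% versus start state")

# A sharp drop across several independent proxies supports, but does not prove,
# the assumption that the desired effect has been achieved.
```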
So there's some really, really interesting points as we go through this. The first thing is, I've been in some really painful discussions about KPIs, and

(51:19):
every business, if they're reasonably smart, says, what are the KPIs we're going to track, and what are the effects, all this kind of stuff. Not only do they talk about it too much, but they also talk about it too little. So what you've just described there, I thought, was a really interesting chain of

(51:41):
effect-based decisions. Yeah, here's an effect we want to have, now how do we... let's talk about, okay, we need to know what's at the beginning, what's at the end, hey, how would you measure this, what does that look like? I just don't get the feeling that businesses do it quite in that way. It gets down to, well, how many widgets did we make, how much did we sell? You won't be surprised, Chris:

(52:03):
militaries don't do this well either, hence the problem. It's very hard, and it's made harder by, well, lots of things, and we'll go into those in a minute. But fundamentally, you talked at the beginning about nervousness when you are starting to ask questions about measuring things; people get nervous. Well, there's several

(52:25):
aspects to this. One is, if people in a senior position are asking questions about performance, well, you're now judging me. If you ask if I'm doing things well, I always want the answer to be yes, so I'm going to present to you the information to show that, even if I'm not. So if

(52:48):
I've missed H-hour, and H-hour means the time that you're supposed to cross the line of departure to carry out the assault, and it's a golden rule in the military, you never miss H-hour, because that's how you synchronize plans, etc., etc. If you're going to miss H-hour, you want to not show that to your boss. But of course, if you are missing H-hour...

(53:10):
That might mean you're in the wrong position.
There might be artillery fire coming down. It comes back to my point as well.
Why do I want to know you missed H-hour? Is it to punish you?
Or is it to adjust the plan? Or is it to say,
you need my help? We didn't give you enough time to get there.
Or is it, it turns out we shouldn't have attacked from there?

(53:32):
And you can see here there is a link to a subject we talked about before, about mutual trust, about empowerment, about psychological safety. Those are subjects for another day, but you can see why it's important that there is a flow of information that allows people to know what's going on. But also, equally, when

(53:53):
you're asking these questions and saying, I am trying to measure your performance, it's not so that I can grade you, and, you know, it's going to be your performance, it's going to be your money.
Let's be honest, because often that is how unsophisticated people measure people.
I wanted to go back to something else
you said, which I thought was really interesting about the dislocation.

(54:13):
I would hypothesize that we are heavily inclined to measure effects that can be easily measured.
And so the one that we always have is, and I loved your statement, which was:
We measure revenue. We measure sales because it's easy to measure.

(54:39):
In fact, it's one of the easiest things to measure.
But what if I met my sales targets, but in doing so, everyone hates me with
a fiery burning passion because there's a leading indicator that I may not make those sales next year.
And so your dislocation, I thought, was such an interesting topic, because what you didn't mean was, was the building they were in destroyed? That's

(55:02):
quite easy, in inverted commas, to measure: I can go take a photograph, I can go have a look. Yeah, but they might have moved to the next building. Yes. What you really wanted to understand was dislocation. And there's a risk, and I'm sure this is true, well, in fact, I know this is true, in Vietnam. This was the classic Vietnam: we will measure dead bodies. Yeah. That is not the

(55:22):
right thing to measure; that's an easy thing to measure. And you can quite easily convince yourself that if we kill that many North Vietnamese soldiers, at some point we are having the right effect. In fact, that was not the case. If they had said, what is the impact on morale, on the belief that they can win the war, then

(55:46):
that's much harder to track, but I'm much more...
Going back to my point about we talk about KPIs too much and we don't talk about
KPIs enough, in this scenario, I could imagine we could spend days and days
talking about how we measure dead people and not enough days saying,
how do we measure morale?
And how do we measure a willingness to continue to fight?

(56:10):
And we've gone full circle because I use the example of Afghanistan and the
Heisenberg principle: the fact that if you're measuring dead bodies, if you're measuring how much of the area you can patrol because you have freedom of movement, then you're missing the wider strategic point.
If you're going into the town so that you can talk to the locals to work out

(56:31):
how your impact projects, about building a school, about educating the local political leaders and religious leaders about rule of law and about judicial processes, are going...
And if you're going into the town to measure that and the impact that's having,

(56:51):
firstly, it's incredibly hard to do.
And secondly, you're going to have an effect by turning up that is potentially
negative, because you're going to draw the enemy to you. Yeah. So we in Afghanistan fell into these traps in the same way that we did in Vietnam, and counter-insurgencies are really difficult

(57:14):
because the strategic effects you want to have are really difficult to measure, because it's all about hearts and minds. It's all about dislocating the enemy from the population. It's not about how effective you are on the battlefield; in fact, the less you fight, arguably, that is

(57:35):
a metric for how successful you're being. If you can patrol in an area and engage with the locals and build rule of law and enable engineers to come and build schools, then you are having a really successful impact. However, what do your military leaders want to judge of their junior commanders? Well, it's, you know,

(57:58):
how many contacts they got in, how many bad guys they killed, how many... the heroics. We treasure, as military people... we don't necessarily treasure the intangible, soft effects, because effectively the military is there to enable other people. This is no different to the civilian world, though. And actually, I want...

(58:20):
this may or may not be an episode, and it may even be an episode that we do and never publish, given what I do. You've just described one of the challenges of measuring product management. Yeah. Product management measures delivery of product. Yeah. In the language

(58:41):
we have used, it is measuring performance. Yeah. It is not measuring effect. Yeah. And in fact, the next... so at one level you have to measure performance, because it's the, do I have enough widgets to sell? But the next level that we get to is, do

(59:02):
people use our software? And even that doesn't scratch the itch for me, because the statement, using the language of effect, is: I wish to create an environment where more people wish to pay money for the software we have, and choose to use it instead of someone else's software. And this is why it's not

(59:25):
anything we measure. That's why, especially in
the software world but in the commercial world generally, I've always struggled with
the idea of having a customer relationship manager and a customer relationship
management team, because to me, what you're doing there is dislocating (I'm
using the word dislocate a lot in this podcast), you're dislocating

(59:51):
the things that are important to measure from the people that have the impact.
So your sales team, your marketing team, your product development team,
your complaints-handling team, they should all be customer relationship managers as well,
because strategically, the effect they are trying to have is all about making

(01:00:17):
the customer better, or making life for the customer better.
And if you have somebody else whose job it is to go and work on that and deal with
that and measure that, then your marketing team is suddenly not interested
in the customer. They're interested in the sales figures.
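For readers who like to see the distinction in code, here is a minimal sketch, assuming a
hypothetical product team, of the difference the hosts are drawing between measures of
performance (what the team delivered) and measures of effect (what changed for the
customer). Every metric name and number below is an invented illustration, not anything
from the episode.

# A minimal, hypothetical sketch: separating measures of performance from
# measures of effect. All names and numbers are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class MeasureSet:
    performance: dict = field(default_factory=dict)  # internal: what we did
    effect: dict = field(default_factory=dict)       # external: what changed for the customer

    def performance_only(self) -> bool:
        # True when we are only counting activity, not its impact.
        return bool(self.performance) and not self.effect

product_metrics = MeasureSet(
    performance={"features_shipped": 42, "release_cadence_weeks": 2},
    effect={},  # we know what we built, not yet what it changed
)

if product_metrics.performance_only():
    print("Warning: we are measuring how busy we are, not whether it mattered.")

The point of the check is simply that a dashboard full of delivery numbers with an empty
"effect" column is exactly the trap being described.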
I think it's interesting. In defense of customer services in

(01:00:38):
businesses, I'm lucky I work in a business right now where actually I'm pretty
confident everyone feels they play that role. But I agree, I do get what you mean.
And again it comes back to, I think maybe the big lesson here is, and actually
we should round this off in a minute, the big lesson is:
I don't know that we think about the effect enough. I think we think a lot about

(01:01:00):
doing things. We think a lot about performance.
We think about the things that we do. It almost goes back to this thing where we've
talked about planning, where you do planning once a year and call it done.
People say, what is the effect I want? I want this effect.
What are the five things I must do to achieve this effect? And then for the
rest of the year, we focus on those five things.

(01:01:21):
And we almost forget: well, hang on,
no, no, the whole point was the effect. Yeah. Yes, we
should concentrate on the five things, do we make enough widgets, blah
blah, but actually, is that the right effect, and are
we doing the things that lead to that effect? Yeah. And of course
this is made harder as well because within
organizations there are different layers of

(01:01:43):
that organization. So what you end up with
is, potentially, the person who wants the
desired effect not being the
same person, or the same part of the organization, that delivers
the action that will achieve that effect.
And we saw that with the U-2 example. But if
we take my example of dislocation through destruction of

(01:02:04):
a command and control node: let's say a divisional
commander in the Army wants to achieve that dislocation, but for whatever reason
it has been given as a task to the Air Force, because they're going to
go and deliver the bomb that's going to destroy the command and control bunker.
If the Air Force don't know why they're doing this,

(01:02:26):
and the Army don't have a very good working relationship with the tactical part
of the Air Force that's going to deliver this effect, what you effectively end
up with is the Air Force now only interested in measuring whether the bomb
went off and whether they hit the target. Yeah, performance.
And so the only data the Army now have to go off is some images of a smoking crater,

(01:02:47):
where actually they might have wanted
all the information about communications emanating from
that bunker. But the people who have the ability to
collect that didn't know, and didn't do it, and
so there are disparate bits of the organization. So
we're once again back to good communication flows, transparency,
trust. These are important things. I

(01:03:08):
love it: every time we do these podcasts and these different topics,
it comes back to this, and particularly when you think about trust.
Yeah. Trust is another one of these ones where people say,
really? I don't need to trust you to be effective.
And each week we come back and say, these are
not signs of weakness, or signs
of niceness, or, you know, I'll use the

(01:03:28):
woke word, these are signs of effectiveness. Yeah.
I trust you, I'm going to tell you what I'm going to achieve, I'm going to empathize
with you... all of these things are about effectiveness. What if,
to compound that even more, we're in a situation where one department is asking
for something and another department is asking for something and there isn't

(01:03:49):
enough resource to go around.
So they now feel like they're in competition with each other.
Even if the Air Force know that the Army are interested in doing this for a
dislocation, do you think they're going to share that information?
And you see this. You see it at a very, very tactical level.
And, shockingly, you see it where people are deliberately hiding

(01:04:12):
information because they want another bit of the organization to look bad, because
in effect it makes them look better, because they're all in competition for resources.
And this is where you end up with...
what starts off as healthy competition and rivalry can very quickly become toxic behaviours.
And police forces struggle with this a huge amount, where the budgets are directed

(01:04:37):
not on effect, but on performance.
And so it's about the number of crimes, the number of arrests.
It's not about how
the people in society feel about
their trust in the police force, about
their safety on the street. And so, you know,

(01:04:58):
Somerset and Avon police force, I'm picking on, you know, random police forces
here, but they are less likely to be incentivized to share information that is
going to allow the Metropolitan Police to solve some crimes,
because they're in competition for budget, or there's going to be a league table or a ranking.
So we've got to be really careful about KPIs and measures of performance driving

(01:05:24):
strategic decision-making, as opposed to strategic decision-making driving the measures of effect,
and out of that flowing the measures of performance.
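As a purely illustrative sketch (none of this comes from the episode; the effect statement,
indicator names and measures are assumptions), the direction of travel being argued for can
be written down as: state the effect first, then derive the measures of performance from it,
never the reverse.

# Hypothetical illustration: the desired effect comes first, and every
# measure of performance has to point back at the effect it serves.
from dataclasses import dataclass

@dataclass
class Effect:
    statement: str        # the change we want in the outside world
    indicators: list      # how we would notice that change

@dataclass
class PerformanceMeasure:
    name: str
    supports: Effect      # the effect this measure is supposed to serve

public_confidence = Effect(
    statement="The public trust the police and feel safe on their streets",
    indicators=["survey_trust_score", "willingness_to_report_crime"],
)

measures = [
    PerformanceMeasure("arrests_made", supports=public_confidence),
    PerformanceMeasure("response_time_minutes", supports=public_confidence),
]

# Any measure that cannot name the effect behind it is activity for its own sake.
orphaned = [m.name for m in measures if m.supports is None]
print("Measures with no effect behind them:", orphaned or "none")

Trivial as it looks, the constraint that "supports" cannot be left empty is the whole
argument: budgets and league tables built only from the measures list, with the effect
missing, are performance driving strategy rather than the other way round.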
The final thing I want to talk about is the
difficulty of second- and third-order consequences:
the effects of the things

(01:05:47):
that you've done, the effects you've had
that you weren't trying to have, that you weren't planning on having; and equally, other
people in the environment affecting that space as well. It might be that we've
destroyed the command and control bunker and we want to dislocate the enemy,
but it might be that, at the same time,

(01:06:08):
there's a particular religious holiday that affects the behaviour of the enemy force.
If we don't have that information, we might assume that we have been successful
at dislocating them, and it might be that actually, over that period,
they've all laid down their arms and gone back to their villages because it's a religious holiday.

(01:06:30):
It might be that we've been doing something
with the stated aim of having a positive effect,
and actually, it turns out we are doing that, but we're also having a really,
really negative effect.
Militaries do this all the time, where they go into a local area,
they provide security, and they provide the conditions for rule of law,

(01:06:54):
and they do all these great things, and they take the fight to the enemy.
And they also ruin the economy, because they bring in millions of dollars' worth of contractors.
There are lots of really good examples. I mean,
in my world there is of course sales, which is,
like, what is the effect of achieving the
sales in the way you have achieved them? But there's another

(01:07:14):
one as well, which is Burberry. And
I'm not an expert on Burberry, so someone should certainly write in and
tell me I'm completely wrong here, but Burberry wanted to increase sales, and so
they took a luxury brand and started using that luxury brand as a non-luxury
brand. And so everyone had Burberry.

(01:07:35):
And so, in the short term: look at this fantastic effect,
our sales have doubled and tripled. But actually, the unexpected outcome was
that Burberry as a brand is now nowhere, because it's no longer considered a luxury brand.
So the effect they wanted was an increase in revenue, and they declared success.

(01:07:55):
The unexpected outcome of that
activity was that, actually, long-term, the brand decreased in value.
Yeah, and you've got your Nokia and your Kodak examples of not understanding
other actors' effects on the operating environment as well.
So all of this... Just on that, you can imagine the Burberry team, though,

(01:08:18):
sent emails congratulating each other on
the best sales year ever. Yeah. And
if you'd said to them, have you done a good thing this
year, they would have said: we have done fantastically, we
are brilliant, no one else could
have done this except us. So I think
this is another cautionary tale, which is: if you

(01:08:41):
achieve your effect, you are
triply, quadruply less likely
to question whether you did the
right things to achieve that effect. We won. Yeah.
Why are you suggesting we didn't win? We won, demonstrably; we said this is the
effect we wanted; we're all good. As opposed to: actually, we did achieve the

(01:09:03):
effect we wanted, but did we understand all the other things that happened? And this comes
back to, and I think we can round this off now,
the multiple layers of command,
the multiple layers of strategy.
So presumably at the board in Burberry, there is a long-term strategy,
and there's probably some mission statement.

(01:09:25):
I don't know what Burberry's is, but it will be something like being the most exclusive apparel
brand on the market, or whatever it is.
Does that relate to what
they're doing now? And somewhere in
that layer of command, somebody, probably a
chief operating officer or a head of marketing, has created the assumption that

(01:09:51):
increased sales are going to help them get towards, or continue to hold, that strategic
goal of being a world-renowned, high-value,
luxury, Veblen apparel brand.
This is the same with the Afghanistan story and the Vietnam story and many other
military operations and big projects that have failed and companies that have gone bust.

(01:10:18):
Somewhere along the line, there is a disconnect between what you're currently
doing and where you're trying to get to.
And if, every six months in Afghanistan,
an operational commander is congratulating their
brigade for having the most decisive effect of any
deployment in the Afghan campaign so

(01:10:39):
far, and that's happening every six months,
and no one's standing back and saying: apparently we've had yet another most
decisive success, but we're not getting closer to creating the conditions
where we can withdraw, when the information that's coming out is about numbers of Afghan police.

(01:11:00):
These are measures of performance, not measures of effect.
The number of Afghan police, the number of the Afghan army: these become measures
of performance that people make strategic decisions on, rather than the effects
we're trying to achieve, which are confidence in the government
and their ability to stand on their own feet.

(01:11:22):
And it's really easy in hindsight to kind of do this.
But that disconnect is
really important. Understanding this: measurement of
effect has to be driven by what you're
trying to achieve, and from that you have
to decide what your measures of performance are going to be, to measure
are we doing the right things and are we doing things right. And so you've

(01:11:46):
got to ask the first question before you ask the second. And then, finally, you
have to recognize that there are multiple layers of command, and what you're
trying to achieve has to align with what the wider organization is trying to achieve.
And this is where measurement of effect
has to link to a wider understanding of
the context, so that when you're starting to make strategic

(01:12:09):
plans, you're working off the same assumptions about
the operating environment, you've got the same gaps, you've
got the same assessments. When you're making assumptions, everybody knows the
assumptions that need testing. And when you're going into the actual operating
process of doing activities and actions, of engaging with customers,

(01:12:29):
of patrolling the streets of whichever war-torn town it is,
you are working out what the baseline looks like. What does it look like now?
What's it going to look like in three months? What do we want it to look like?
How are we going to know who else is acting and what effects they might have?
And how might that undermine what we're doing?

(01:12:49):
And so this becomes an incredibly complex thing.
And we're back to the Heisenberg principle. And so what can seem like a scientific
and statistical analytics problem
very quickly becomes a subjective and strategic leadership problem.

(01:13:10):
I think that is a good place to finish.
And it's another one of those episodes where I walk away going,
I need to think about this.
And I wonder whether, maybe there's a risk that today we've talked about it at sort
of a geopolitical level and a war level, but I think all of this is so directly
relevant to people running businesses today. I would ask everyone listening: take

(01:13:33):
a really good look at what your business has been planning.
Is it planning around an effect, or is it planning around performance?
Those are two different things. And it is hard.
I mean, I think we've kind of really poked around that.
It's really hard, but the benefits of getting it right are there for everyone.

(01:13:56):
All right. Well, let's call it a day. Thank you for that, Gareth.
Hope you've all enjoyed that. We haven't said this enough: we were gratified
in the last couple of weeks, as we've been looking at the numbers, that they are going
up and up and up, but we'd love it if you could help us.
So wherever you get your podcasts, if you could rate or even write a review,
I believe that helps with the algorithms.
Get in touch with us, tell us whether you disagree radically

(01:14:17):
or otherwise. It's actually a lot more fun when people disagree
with us or talk to us about concepts they've got.
So we're Battling with Biz on X, previously known as Twitter; we're battling with
business, two S's, at gmail.com. Tell your friends, subscribe if you're listening
to this for the first time and haven't subscribed. Otherwise, thank you very much,

(01:14:38):
and we'll see you next time on Battling with Business. Wonderful.
Music.