Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Hello and welcome to Making Data Matter. Today we don't have a guest interview
(00:05):
for you. With summer schedules and travel from both myself and Troy, there's no
guest interview this week. More are coming later this summer, but this week we have something a
bit different. I did a presentation a couple of weeks ago on LinkedIn around
measuring success for data teams. And so what you're hearing now is a recording
(00:25):
of that presentation. If you're interested in seeing the presentation
with the slide deck and visual elements, you can see that by the link in the
show notes below. This is the audio recording, and it lets you walk
through just the framework that I've been talking about with people on LinkedIn
and in my daily emails for a while around how do you measure success, what
(00:47):
are the problems and challenges data teams face, what's a practical framework
for thinking through how to do that better, and what are the results of when
you do measure success well as a data team. So please enjoy this presentation
on measuring success for data teams. So here we are, you're at the measuring
success for data teams webinar or alternatively titled how to get
(01:09):
leadership to pay attention to you. And so we're gonna get started here. My name
is Sawyer Nyquist. I run the Data Shop, a solo consulting practice. I'm based in
the beautiful woods of West Michigan. I'm firmly convinced that Michigan is the
state with the best summers, and we are in the peak middle of summer right now
and I'm just savoring every minute of it. So I'm doing this live stream.
(01:32):
Right before this, I was out mountain biking, and right after this I'm probably gonna
head back outside and enjoy more of this beautiful afternoon in Michigan. So it's a
beautiful summer day here. I hope it's beautiful where you're at as well. But
let's get rolling. So what is this conversation all about? And I'm
gonna give us a little overview of where we're headed. So measuring success for
data teams or how to get leadership to pay attention to you. I think these two
(01:55):
ideas are closely related, and we'll hopefully unpack why as we get going.
But I want to start here. Here's where we're headed for today. We're talking about
the problem. The problem around measuring success in your data team, both what
happens if you don't and why is it so hard to measure success. Talking about the
problem. We're gonna talk about a simple framework, my framework for how you can
(02:19):
work through success measurements and movement and progress towards success.
There are some simple tools for that. A simple framework. We're gonna
offer you some practical advice about how to go about measuring success and
working on these measurements that we're gonna talk about and implementing the
framework that I'm gonna present. And then we're gonna talk about the results.
(02:39):
What happens when you do this well? What happens when you know how to measure
success and you implement it well? What happens to your team and to your career
and to your organization? That's where we're headed for today. So problem,
framework, some advice and then some results about what that's gonna look
like. So let's dive in here first with the problem. What if you don't measure
(03:00):
success? What if you just skip this part and say I don't have a measurement of
success for my team, I don't need a measurement of success. What happens?
Well a few things come to mind and there's probably dozens more that
we could talk about but here are a few that come to mind right off the bat. I
think your team morale suffers, and I've seen this. Your team ends up with the enthusiasm and the
(03:23):
energy of painter's tape. There's no excitement, there's no clarity
about where you're headed and so your team morale suffers. That inevitably leads
to turnover on your team and inevitably also leads to difficulty hiring. I think
your team morale suffers if you don't have a clear measurement of what
success is. Number two, confidence wanes, and by this I specifically mean the
(03:44):
leader's confidence wanes. So you as the leader of a team start to lose
confidence that you know what you are doing and why you're doing it. You get
lost in this busywork of tickets and reports and meetings and stakeholders and
requirements gathering, and you wonder, what are we actually doing? You start to
lose confidence that you know what you're
(04:05):
doing and why you're doing it. Closely connected to that, I think, is that you
can't prioritize. You get overwhelmed with the number of requests
that come in, the number of bugs that are present in the system, the number of
opportunities for new things to do and you don't know how to prioritize all the
different opportunities and all the different problems that are hitting you.
You can't prioritize so when you don't measure success I don't think you can
(04:26):
prioritize well and most importantly perhaps leadership ignores you. They
don't know what you do. Your technical activities are meaningless to leaders
who are one level and two levels above you. They really don't care how many
pipelines you built, how many lines of code you wrote, how many visualizations
you crafted. That all falls on deaf ears and so if they don't know what you're
(04:46):
doing they start ignoring you and you struggle to get their attention and
ultimately that either ends in frustration for you or elimination of
your position and your team. So great data teams know what success is, how to
measure it and how to communicate it and that's what we're focused on here. Great
data teams know what success is, how to measure it and how to communicate it and
(05:08):
I think that speaks to all the different problems we've faced so far. So why is it so
hard? We talked about the problem and why it exists. Why is it so hard to
measure success? There are too many different answers here, so many different answers.
So let's start here a little bit and explore what types of things people
come up with. When I ask data leaders how they measure success, I get things like
this: operational efficiency. They start to measure success by operational
(05:29):
efficiency. Maybe that's data quality, or no turnover on our team, or the pipeline
run times are down, or we turn around tickets really fast. Maybe we do things
like internal team culture and employee morale is really up and my team culture
is great and maybe that's a success measure or maybe people measure success
(05:50):
by technical goals like we completed our data warehouse, we decreased our cloud
cost. Maybe it's about stakeholder perception of the data team. Maybe you
start to measure your success based on the NPS score of your stakeholders. Maybe
you, you know, somehow are judging how much the stakeholders like you, and that's
your success. I see and talk to data teams. I hear from
(06:11):
data leaders constantly a variety of different things. Maybe no one on the
team is working overtime; that's a success. Maybe completing data quality
checks at a higher percentage, that's success. These answers range so widely. So
why is it such a hard question to answer? Is everybody confused? Because it's a
really hard question to answer. It seems like in our industry and in our
space it's a hard question. I think everybody's a little bit confused about
(06:33):
what we're doing here and with the splattering of answers that we get. So
here are a few things that come to mind about why this is difficult. So first off
I think data teams struggle with this because we don't have a clear vision for
data in our organization. We don't have a clear vision for data in our
organization. So is the data team a group of ticket takers that churn out reports?
(06:54):
In what way or by what methodology are data people actually supposed to support
decision-making? Can you tie any actual business results to something the data
team delivered? There's no clear vision for what data does in an organization.
So that's why the things that we're measuring
success on range so widely: because there's no clear vision of what it is. I
(07:14):
think that's maybe one reason why this is so hard. A second reason why it's hard
is because data teams align to many different functions in an organization.
So data teams can be embedded in the business unit. They can be
centralized. They could roll up to finance or marketing or IT or software
engineering. And because of the numerous places that data teams sit in the
organization and on the org chart, I see data leaders constantly shifting views
(07:37):
about what success looks like based on who they report to. If it's finance,
that dictates some of it. If it's engineering, that dictates some of it. If
it's embedded or if it's centralized, those all dictate how we measure success
on our team. And I think that leads to some of the confusion and the difficulty
with this question. Another thing, a third reason that I'll throw out for why
this is so hard for us to do, is that we haven't been asked to define success
(08:01):
before. At least not in recent memory. So for the last decade of low interest rates
and cheap cloud costs, data teams could build all sorts of fancy tools and
exciting data projects. And the actual ROI of those projects was
never really demanded because R&D budgets were flush. Cash was cheap
and it was easy. Well, that has changed. If you've been paying attention the last
(08:22):
couple of years, the cheap money is gone and now the budgets and the staffing
cuts are here and we're scrambling to define success in new ways or actually
needing to define success for the first time. So I think that's another reason
we struggle with this question because it's the first time in recent memory in a
decade or so that we've had to answer this question well. Additionally, I
think we love data too much. This one might sound a little bit
(08:43):
counterintuitive, but I think we struggle with the question of measuring success
because we love data too much. It's hard for us to find clarity about success
when we have, or know where to find, any data point we want. And our dashboards
are crowded and clouded with KPIs that make any sort of uniform message really
unclear. When you've got a dozen different data points you're looking at
constantly, these six markers are going up and these six indicators are going
(09:06):
down and these six over here are going sideways. Well, good luck sharing that
progress clearly with your team, and good luck sharing any sort of progress from
that splattering of charts to leadership above you. I think we love data too much
or we get overwhelmed with data because we love data points and we love charts.
And finally, I think maybe one more reason why this is so hard for us is we
have ambiguous definitions of measurement. So leaders will tell me they
(09:29):
measure customer satisfaction or data quality or by contributing to business
goals, but then they never clearly define what those terms mean and how they
quantitatively measure or use them to track success. And so for as much as we
love data, most success measurement on data teams ends up being: we just trust
our gut about our success. We just say, you know what, it feels good. We're
(09:50):
moving the right direction. I can tell kind of how things are. Culture feels
good. The team's excited. There's energy. We're getting good vibes from the people around us. The
business is maybe doing well overall. And so we just end up trusting our gut,
which really kind of sounds like, trust me bro. Just trust me bro. Like we're
good. And so for people who love data, this is our default standard, which is
(10:10):
kind of surprising. Okay. So what if we made this easier? It's a hard problem
to solve, and I don't think it's a problem that we've solved very well in
many cases. What if we made this easier? I want to introduce here a simple
framework for measuring success. Data teams are terrible at this. We come up
with a splatter of metrics, throw them on a board and call it success. And so
maybe we do KPIs or OKRs or SMART goals or metric trees. But here's a simpler
(10:31):
framework that I think allows us to speak to all the different parts that
could go into success metrics. And so I call this the one, two, three, four
road trip method. I recently went on a summer road trip with my family on
vacation. We drove from Michigan down to Florida. It's a long ways, about 20 hours
of driving. And so I've thought about road trips a lot recently. But let me
introduce this to you. There's one, two, three, four. So one, you get one
(10:54):
destination metric. Define one destination metric. This is your terminal
goal for your team. This is the purpose your team exists for. It's not progress
towards something else. It is the end purpose. So you define one destination
metric, just like you would on a road trip. And then two, you get two
waypoint metrics. So waypoints are those key indicators or key markers on
the journey that track progress. They are crucial for tracking your progress
(11:17):
towards your destination goal. So you define two waypoint metrics. And then
you get three turn signals, three turn signals. Turn signals are also called
decision metrics. They are specifically defined to alert you when you hit a
decision point. And we'll talk a little bit more about how to define those a
little bit later. But you get three turn signals. And then the last one, four
(11:37):
gauges. So a gauge is an indicator of what's happening operationally on your
journey. So what's your speed? How much gas is left in the tank? What's the
temperature of the engine? These are important to have identified, important
to maybe have available, but they're pretty pointless to stare at constantly.
So gauges, there you go. One, two, three, four: one destination, two waypoints,
(11:59):
three turn signals, four gauges. These are all the metrics you get. You get up
to one, up to two, up to three, up to four. You can have fewer if you want, but
you can't have more. These are the rules. I made them and I enforce them.
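If it helps to make this concrete, here's a minimal sketch in Python of the
one, two, three, four limits as a simple data structure. The class shape and
the example metrics are illustrative placeholders, not part of the framework
itself.

    # A sketch of the 1-2-3-4 road trip method as a data structure.
    # Only the category names and limits come from the framework;
    # the example metrics below are hypothetical placeholders.
    from dataclasses import dataclass, field

    LIMITS = {"destination": 1, "waypoints": 2, "turn_signals": 3, "gauges": 4}

    @dataclass
    class MetricSet:
        destination: list = field(default_factory=list)   # the terminal goal
        waypoints: list = field(default_factory=list)     # progress indicators
        turn_signals: list = field(default_factory=list)  # decision metrics
        gauges: list = field(default_factory=list)        # operational checks

        def validate(self) -> None:
            # Fewer is allowed; more is not. Those are the rules.
            for category, limit in LIMITS.items():
                count = len(getattr(self, category))
                if count > limit:
                    raise ValueError(f"{category}: {count} defined, max is {limit}")

    metrics = MetricSet(
        destination=["stakeholder satisfaction (NPS)"],
        waypoints=["self-serve report adoption", "repeat requests per stakeholder"],
    )
    metrics.validate()  # raises ValueError only if a category exceeds its limit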
So at first, this feels fairly easy. But where I see teams struggling specifically is
with that destination metric. And consequently, the rest of the metrics
fall flat or don't come together until this first one comes
together. So let's talk about destination metrics. Destination metrics are the
together. So let's talk about destination metric. Destination metrics are the
ultimate goal of your team. So let's talk about how to define a destination. I
think you have three main options when you're picking a destination metric. And
so before I explain these options, I need to introduce you to the box. All
right. I need to understand the concept of the box. Okay. So humans are great at
(12:42):
thinking in boxes. It's kind of a necessary coping mechanism for surviving
in a really complex world is we put things in boxes. We're wired to draw a
box around and to optimize our decision making for things that are inside our
box. So we make choices based on what's best for us in our box, both biologically
and psychologically. Like it's nearly impossible for us to not draw these
(13:04):
boxes. Okay. So for example, in your personal life, your primary box might be
yourself and then your family and then your neighborhood, your city, state,
country, planet. All right. That's like the biggest box, the planet, maybe the
universe, I suppose, but call it the planet. That's the biggest box possible.
Those are kind of how boxes work. And so you make choices based on what's in the
(13:27):
best interest of the smallest box first, and then you work your way out. So as
you move out, it gets increasingly harder to make optimal choices that take into
account the smallest box, yourself and your family, and the larger boxes, maybe
your country or your planet. And so you end up with a little bit of tension
sometimes between optimizing for the smaller boxes and the closer, more
(13:49):
pertinent boxes to you, and the larger boxes. And so if you are buying groceries,
it's really hard; you can't simultaneously make optimal decisions for
your family's health and your financial constraints while also thinking about
the economic and the environmental and the political factors of your state,
country and planet all at the same time. It overloads the human psychology,
(14:11):
overloads our decision making engine. And so we end up making some optimizations
in smaller boxes. Okay. So we allow ourselves to be rational only in the
smallest box, or potentially a slightly larger box. And so you end up
making decisions like, hey, well, I need to make a decision about my
family, my health and my financial constraints. And maybe, depending on whether
you can optimize well on that, maybe you can take a step further and start to
(14:33):
think about some other, slightly larger boxes. Okay. And obviously enough,
this probably shows up in our workplaces. Our boxes are ourselves, then our team, and
our department, and then our company. Just like in our personal lives, where we
struggle to make optimal decisions for all areas of our life and all the world,
in our corporate lives we
(14:56):
struggle to make optimal decisions for all of our company. And so we focus on
optimizing smaller boxes. Okay, so this is similar to a concept in systems
thinking called bounded rationality, where essentially we draw boundaries
around what we are rational in, and we make decisions based on smaller boxes of
rationality. So you as a team, you as a data team are a box, a contained unit of
(15:16):
people with purposes, strategies, flaws, personalities, cultures, goals, and there
are clear walls about what's in your box and what's not in your box. And so at the
same time, your box exists also in the context of some larger boxes, and then
maybe there's some gray lines between some of those, but generally, you know
where your team ends, and where the other team begins. Okay, so fundamentally, our
brains create these boxes with boundaries, so we can make sense of our
(15:38):
world. And this is important when it comes to setting a destination metric,
because boxes are all about what you're optimizing for. Success is all about what
you're optimizing for, and all about what you're striving for: the purpose.
Alright, so I think you have three choices when it comes to picking a
destination. And so let's talk about those. You can pick inside the box. All
(15:59):
right. And the box is your data team. So you can pick inside the box, you can
pick on the edge of the box, or, option three, you can pick outside the box.
Alright, let's break these down just a bit more here. So, inside the
box. And there are three types of data teams, which we'll get to. Inside the box,
your destination metric is defined by the actions and the results inside the box.
Inside-the-box metrics are things that the data team explicitly
(16:22):
controls. So maybe that's decreasing the data cloud cost by x percent, maybe
it's maintaining uptime of a data system by y percent. Maybe it's delivering
data requests in x number of days, all new data requests. So things inside the
box that you control: that's what a destination
metric looks like inside the box. In this scenario, that's what you're optimizing
(16:42):
for: what's in the box, what you control. Edge of the box: your destination
metric is connected to some element outside your box. So you don't
control all aspects of the metric, but it's also highly connected
and related to your work. So an example of an edge-of-the-box
destination metric might be stakeholder satisfaction. Okay, so some
(17:02):
sort of NPS score or some other method to assess and measure how happy our
stakeholders are. That's an edge of the box metric where you control maybe like
part of the equation, but not all the equation. The other parts are kind of
like out of your control. And so in this scenario, you're optimizing for edge of
the box interactions. The third scenario, and this is the scary one, is the
outside of the box where your destination metric is not set based on
(17:25):
what's inside the box. Now what's on the edge of the box that you have some
control over? It's beyond your box entirely. And so very often when you set
an outside the box destination metric for your data team, it's like the
terminal metric of the whole organization or the whole company. That's
net revenue, it's student outcomes, it's program participation, it's community
impact, etc. It's usually like a terminal goal for the whole organization.
(17:49):
And so in this scenario, you as the data team are optimizing your activities to
impact something that's completely outside of the box. And so it's a scarier
one, it's a harder one. And the natural question starts to come up is, well,
which one should I pick? Which sort of destination metric should my data team
have? Because this is how you're defining success and defining what you
want to optimize for on your team. And so people struggle here to pick and they
(18:13):
can't determine whether they should be inside, on the edge, or outside. But I
think asking the should question might be the wrong question. I think it's better to
start with what is already true about your data team. So let me tell you about
the different data teams, the three different types of data teams that fall
into these categories. And maybe one of these describes you better than the
others. First, let's talk about the inside the box data team. Inside the box
data team. This data team is what I call enablers. And their mantra is, here's
(18:38):
your data, go be successful. This data team creates reports, ensures the data
is accurate and available whenever needed. And they throw the data over the
fence to the business team. And this team uses it. And their focus is on
responding to tickets, optimizing data infrastructure and providing data
quality. This is kind of the realm of DBAs and data engineers, people who
ensure data quality and data availability. That's their world. And if
(19:01):
that's your data team's definition of success, here's your data, go be
successful, chuck it over the fence, call it good. That describes an enabler team.
And these enabler teams are the type of teams that have success metrics
and destination metrics that are inside the box. All right. Next is edge of the
box teams. Edge of the box teams are what I call advisors. Advisors, this data
(19:21):
team accomplishes the same tasks, all the same tasks as enablers, but they have a
focus on engaging with stakeholders and providing context around the data and
offering analytical insight. So this team includes things like data analysts or
BI developers or data scientists. And their mantra is, let me help you be
successful, right? Because they're optimizing for these edge-of-the-box
relationships between stakeholders and data teams. And so their engagement
(19:46):
with stakeholders is different. Let me help you be successful. There's a sort of
advisor alongside you relationship. Those are advisors. And that's the second
type of data team that often has destination metrics that are on the edge
of the box. The third data team, third and final data team is what I call the
partners. And partners have a different sort of mantra for success. Their mantra
(20:08):
is we win or lose together. I don't win if you don't win and we win or lose
together. And so this data team provides accurate data, offers context advice and
strategic perspective around the data, but they also uniquely attach themselves
to business objectives outside their data team. They are partners with the
business and they are at the decision-making table. They measure their
(20:31):
success by the organization as a whole. As we talked about, those are outside-the-box
sorts of metrics and destination metrics. Okay. So I think before you can
pick a destination metric for your team, you really have to decide which
type of data team you are. And so you can look like one of these three. Which data
team are you? Are you an enabler data team? You already are one of these, so
don't try to guess or don't try to make something up or get aspirational. You
(20:54):
already are one of these. So first identify what you already are. Do you
operate more as an enabler, as advisors or as partners? And that'll already help
you answer this question of the destination metric. So, you know, I've given
examples of these already, but enablers, their example destination metric might
be data quality. Advisors might be stakeholder satisfaction. Partners might
be like an overall business objective, net revenue, student outcomes, etc. It is
(21:19):
possible to move from enabler to advisor or from advisor to partner. And that
might be your aspiration, but
that's probably another webinar that we can get into at a different time. So for
now, I think it's best to kind of identify what you are when you're setting
your destination metric. What is our team already like? It's going to be too
hard to pick a destination metric that doesn't actually align with what your
(21:41):
team is like. All right. So there's our boxes and there's how we think about
destination metrics. And having a destination is crucial, and there's no
way to be successful without a destination metric. But a destination by
itself is a fluffy unicorn. It's mythical, imaginary, and kind of disconnected from
reality. All right. So what we need to enable our destination metric are some
other things. We need our other metrics. And so let's talk about those other
(22:04):
metrics. The one, two, three, four: we covered the one, which is the destination.
Now we've got the two, three, four. We've got the waypoints, the turn signals, and
the gauges. So let's talk about principles for these. First is the waypoint
metrics. These metrics are reliable indicators of progress towards your
destination metric. These are the signs along the road that show you how many
miles you are from your destination and let you track progress towards your
(22:25):
destination. So the key question you might be thinking about here is what
things need to be true for our destination metric to be a reality. What
things need to be true for our destination metric to be a reality. And
then you choose. You're likely to come up with numerous things that need to be true,
but you narrow your focus down to identify the two most influential things.
It's going to be subjective. That's okay. You might have to swap out your waypoint
(22:49):
metric later on down the road. And that's fine too. The two things that
right now seem to be the most influential indicators towards your destination
metric, those are your waypoint metrics. These tend to be a little faster
feedback loop for you. Waypoint metrics will provide faster feedback than a
destination metric will, which is great. So you can know that you're making
progress even if the destination metric hasn't moved yet. Next, you've got turn
(23:12):
signals and then we've got three of these. So a turn signal is tied to a
specific decision. These metrics are designed as a decision statement. This
isn't just a box with a big number on it. This says when X goes above or below Y,
we'll make this specific decision. We'll make this specific decision. So what
this looks like when you're trying to figure out what are our decision metrics
(23:34):
is we think about: what is the dilemma we face trying to make progress toward
our destination? What is the dilemma that we face? What are these decisions
that we're making that are going to influence that progress? As you start to
identify what your destination is and what your waypoints are, now you can
figure out what are the things that I'm doing week in, week out? What are the
decisions that I'm making, maybe it's every month, that are really important
(23:57):
and I need to optimize around those decisions that are going to actually
move these other things. And you design a metric around decision making.
There's all sorts of details we could dive into about how to design decision
metrics and how to reduce the uncertainty in the decisions that you're
making using data. Again, that's a webinar for another time. But that's turn
signals.
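To make the "when X crosses Y, we make this decision" shape concrete, here's a
small hedged sketch in Python. The threshold and the decision text are made-up
examples; the point is that the decision is written down before the metric
ever moves.

    # A sketch of a turn signal as a decision statement, not just a big
    # number on a dashboard. All values here are hypothetical examples.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class TurnSignal:
        name: str
        threshold: float
        direction: str   # "above" or "below"
        decision: str    # the specific decision we commit to in advance

        def check(self, value: float) -> Optional[str]:
            crossed = (value > self.threshold if self.direction == "above"
                       else value < self.threshold)
            return self.decision if crossed else None

    signal = TurnSignal(
        name="monthly cloud spend",
        threshold=12_000.0,
        direction="above",
        decision="Pause new pipeline builds and run a cost review this sprint.",
    )

    action = signal.check(value=13_450.0)
    if action:
        print(f"Turn signal '{signal.name}' fired: {action}")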
(24:19):
The third one you've got is gauges. So gauges give you overall
operational perspective, but they're not actively monitored. You can just
imagine staring at your speedometer for the length of a 10-hour or 20-hour
road trip all the way to Florida. It's pointless. Or staring at your gas gauge
and watching it slowly tick down, or maybe
your electric charge slowly tick down. These are not things you stare at.
Gauges are not something you review every day. They are things that need to
(24:42):
be available should something go wrong. If my car all of a sudden runs out of gas
and I had no way of verifying that, that would be a problem. Or if I had no way
of knowing how fast I was going when I was driving, that would be a problem. But
it's not something I'm going to stare at the whole time. So they're reviewed on an
infrequent schedule, maybe once a month or something or less, but they are not
actively managed. You're not managing or optimizing towards your gauges. You're
(25:03):
managing and optimizing towards your waypoints and towards your destination
metric. And I could give some examples of these, but I think in the interest of
time, we're not going to walk through it right now. So I want to give a little bit
of a behold (like, wow, this could be great) and a beware (watch out). Because once
you start to define metrics and measurements for success, there's something
(25:25):
to behold and there's something to beware of. Because designing success
metrics for your team can unlock a new level of energy, clarity, and confidence
for people on your team. It just does. Once they can see a destination and a
goal and progress and they have visibility into what's happening and they
can get on board with that vision and that purpose, you can unlock a new
(25:50):
energy. But at the same time, you have to beware, because metrics are the primary
way your team members and organization are incentivized. Especially when these
are our destination metrics and purpose metrics and success metrics. Success
metrics are naturally and inherently something that your team wants to
optimize for. And so without proper care and without your eyes being wide open,
(26:10):
your metrics could introduce toxic dynamics and toxic incentives for your
team. So incentives are all about what people optimize for. So a salesperson
who's on a commission plan optimizes for closing the most deals to increase the
commission. Regardless of anything else, they need to close deals. Regardless of
whether they're good deals, they need to close deals. Regardless of whether they're
great customers, they need to close deals. Regardless of whether it's a good fit for the
(26:32):
product, they need to close deals. Because that's what they're
incentivized for and that's what they optimize for. Same thing with
customer support: people that are measured on how many support cases they close
are going to optimize for speedy resolution. Regardless of what might be
ideal or optimal, on the big picture, their box gets very small and they start
to optimize for speedy resolutions because that's their success metric. And
(26:53):
so it's how do we turn tickets as fast as possible. Or a software engineer who's
measured by some sort of toxic incentive, like how many commits they make or how
many story points they deliver. And so they can optimize that in all sorts of
poorly incentivized ways, suboptimal ways for the larger picture. But if you set
the success metrics wrong, they're going to optimize it the wrong way. And at
(27:14):
worst it's toxic stuff. So these success metrics will be gamed. I promise you,
for sure, these will be gamed. That's what people do when
they're incentivized to optimize something. You can't change that. So what
you can control is making sure that your success metric is something you want
people to game. All right. Don't hate the game. People are going to game your
(27:35):
metrics. Play it. Play the game. Set up your success metrics and your
incentives and what you're trying to get your team to optimize for. Set that up
to be something that you want it to be. You want them to game the system to
optimize and to think rationally and optimally about what your metric is.
And so make sure you set it well so that people don't introduce toxic
(27:57):
incentives to play a game you don't want them to play. Set up the game that you
want them to play. And then everybody gets excited and energized by the
progress. Okay. Behold and beware your metrics. Next, we'll talk about
designing your metrics. Once you've kind of defined those four things, one,
two, three, four, your destinations, your waypoints, your turn signals, and
(28:18):
your gauges, you have to think about designing them in detail. What data do
you need? What information do you already have? And what information do you
need to collect? You may not have all the information you need and you might need
to figure out how do we actually collect the right details and the right
information to track this metric. That's okay. You probably aren't going to have
everything you need in front of you, or very likely you're going to be
(28:40):
collecting data and presenting data in a different way. Probably your turn
signals specifically show up differently than you have in the past. Most people
don't have turn signals. And your destination metric might look different.
Your waypoints might look a little different. So how you're tracking and
collecting that information will kind of change. That's part of designing the
metric is what are the data points required here? And it's usually not just
one data point. It's usually a handful of data points that get aggregated or
(29:02):
pieced together or optimized in a specific way to give you the holistic
view that you want for each of these pieces. So part one is kind of like what
data do you need? The second piece of this is how will you monitor these
metrics? And there's a lot more details we could go into on all this. But how
are you going to monitor? Is there alerting set up for your
decision making, for your turn signals, for your decision metrics? Are they
(29:25):
going to be reviewed weekly, monthly, quarterly? Where will they be visible?
Who will review them? Who is responsible for these metrics? Are there any
external incentives also applied to these metrics in terms of promotions or
bonuses or a team party or outing, et cetera? How will they be monitored and
when are they reviewed and when are they communicated? We'll get to
(29:46):
communicating more later. So that's what designing your metrics is about: making
sure that you have some tools and you're thoughtful and strategic about keeping
track of them. You go through this work of setting them all up, so make
sure you actually monitor them.
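As a rough sketch of what that design work might capture, here's one possible
shape in Python. Every field value is a hypothetical example; the point is
that each metric records the data points that feed it, who owns it, how often
it's reviewed, and whether alerting applies.

    # A sketch of a metric "spec" covering the design questions above:
    # which data points feed it, who owns it, how often it's reviewed,
    # and whether it alerts. All field values are hypothetical examples.
    from dataclasses import dataclass

    @dataclass
    class MetricSpec:
        name: str
        kind: str                # "destination" | "waypoint" | "turn_signal" | "gauge"
        data_sources: list       # usually several data points pieced together
        owner: str               # who is responsible for reviewing it
        review_cadence: str      # "weekly" | "monthly" | "quarterly"
        alerting: bool           # turn signals usually alert; gauges usually don't

    spec = MetricSpec(
        name="stakeholder satisfaction",
        kind="destination",
        data_sources=["quarterly NPS survey", "report usage logs", "ticket reopen rate"],
        owner="data team lead",
        review_cadence="monthly",
        alerting=False,
    )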
And third, you're going to iterate. Schedule metric reviews. The metrics that you design at first, you're going to
(30:06):
give your best bet. You're going to do your best to think through this
strategically and well, but they're not going to be perfect. And so
scheduling reviews, maybe this is monthly, maybe this is quarterly. If you
get feedback quickly on these, if there are quick feedback loops, you might need to
do this a little quicker or you might be able to review them faster. And your
first metric won't be your final one. And your metrics will shift, which is what
(30:27):
we're going to talk about in just a second. Because here's how this lays
out. So your destination metric never changes. Best case scenario, your
destination metric never changes. The only time it might change is if you are
strategically moving your data team from perhaps an enabler and inside the box
destination metric to an advisor and an edge-of-the-box destination
(30:49):
metric. In that case, your destination metric might change. But unless you're
making a strategic move like that, your destination metric never changes.
It's very sticky. That's the destination. That's the purpose of our team and why
we exist. The waypoint metric rarely changes. Occasionally, you will start to
uncover through time and iteration and practice that certain things,
(31:11):
maybe other things, actually influence the destination metric a bit more than
your identified waypoints. And that's fine. These will rarely change. But
occasionally they will as you learn and get better. Turn signals will probably
change more often. Those will often change as you evaluate, as you look ahead
to the next quarter and realize what are the key decisions that are relevant and
what are the key dilemmas we're facing in the next quarter and how do we optimize
(31:33):
around those. And so maybe next quarter it's hiring decisions. Hey, we've got
some really strategic hiring decisions to make and we need some turn signals
around hiring and staffing and team structure. Maybe it's around cloud costs
and you have some turn signals around cloud costs that are, again, connected to
your waypoints. Or maybe it's around stakeholder engagement and et cetera. So
those will often change. And by often I mean quarterly at most. These should
(31:56):
not be changing weekly. They're quarterly metrics. And you actually might have,
maybe, 10 total turn signal metrics over time that kind of
rotate through. You have three that show up every quarter
and they kind of like slide in and out based on what the emphasis and the
dilemma is that you face at different points in the year. Those will often
change. And I think your gauge metrics will often change. Well, what you track
(32:17):
operationally will probably change. Certain things will fall out of
importance. Other things will slide into higher levels of importance.
So your gauge metrics will change. But destinations never change.
Waypoints rarely change. The lower ones often change. And so your quarterly
reviews or your sprint reviews or your monthly reviews of your metrics
usually focus on evaluating the lower ones. Are these
(32:39):
gauges, turn signals, and waypoints, are those actually still serving us well?
Are there some adjustments and tweaks or some wholesale changes we need to make
in those? All right. So here's the last part that I promised you. How to get
leadership to pay attention to you. And so I want to share four ridiculously
easy steps to sharing your progress and success with leadership.
If you went through the work of walking through this one, two, three, four
(33:00):
road trip method and thinking through how your team is designed to fit
in relation to the box and the rest of the organization and what you're
optimizing for and the incentives in your team, you will be well
served to take some time to think about how that's going to
accelerate the growth of your team and the attention that leadership gives you.
Don't skip this part. The clarity and progress as a team, your
(33:22):
clarity and your progress as a team, is really short-lived
if your leadership is not aware of how your team
is successful, if they're not aware and involved and attentive to how your team
is successful. And communicating your success to
leadership, that's the fastest way to not just get
attention and recognition, but budget, project approval, promotions, career
(33:43):
satisfaction. Listen, I'm a data guy. I've spent my career
in data. I am for you, data leaders. I am a fan of
making you guys as successful as possible.
I want to see data managers, directors, VPs
grow and find ridiculous success, peak success,
in their careers and with their data teams. I want to see
(34:04):
leaders getting on board with the data agenda and with the
opportunities and the value of data. And so I'm on your team here.
And that's why I care about how do you get leaders to pay attention to you. This
is what matters. So here's four ridiculously easy steps to showing
your progress. Number one, tell them how you are measuring success.
Tell your leaders how you're measuring success. Once you've walked through this
(34:25):
process, you will have clearly defined ways of
measuring success. Tell them. Tell them how. Tell them about what you've
defined. Second, tell them why you are measuring success this way. So
once they know how, naturally that question is going to start to come up:
Why are you measuring success this way? Talk to them about your thought process
for why you defined the metrics the way you did. Your rationale,
(34:45):
your decision points for picking the metrics that you did,
how they fell into this one, two, three, four grid. That gives them
not just knowledge of what's going on, but an understanding of your
thoughtfulness and strategy behind why you're measuring what you are.
Number three, tell them when you will share progress with them.
Your leaders probably won't be regularly checking in on your progress.
(35:07):
Many leaders, especially as you go, if you think about leaders one, two, and three
levels above you, they don't have time to regularly check
in on your progress. So it is your job to communicate when
you will share progress with them. Hey, I am going to share with you every
month at our monthly one-on-one. I'm going to share with you at our quarterly
reviews. I'm going to share with you weekly at our one-on-ones, etc. Figure
(35:27):
out what the cadence is. Tell them when you will tell them. And then the last
step, tell them the progress when you said you would. Don't waste your
efforts on this process by then ignoring and failing to actually
deliver on what you promised. One of the biggest problems that people run into
is setting expectations and delivering on the expectations.
And so here's this point here. If you don't tell someone what to expect,
(35:50):
how will they know when they've won? And so a big part of that is telling them
what to expect, and coming through on that, so that they
know that they've won. If you don't tell someone what to expect,
how will they know when they've won? You are being very deliberate
and pedantic about how you are describing and explaining your team's
success, regularly sharing that with them, and making it
(36:14):
ridiculously easy for them to understand your success.
And it makes it so much easier for them to then share your success, give you the
recognition you deserve, and continue to invest time and
attention into what the data team is doing because they can see
how it's moving the destination goal. They can see how it's
connected to business outcomes. They can see how
data is actually helping the organization as a whole.
(36:36):
So tell them how you're measuring success. Tell them why you're measuring success.
Tell them when you will share progress and then do it. Then tell them the
progress when you said you would. So: the Measuring Success Launch Pad. If
you're looking for clarity on how to measure success for
your data team, confidence about your progress towards your goals,
renewed focus for prioritizing your work on your data team,
(37:00):
tangible tools for sharing success with leaders so they can understand the value
you bring, renewed energy from your team around tangible progress,
and then maybe some peace. I love this mental peace after removing the noise
and frustration of busy work, overwork, and overwhelming charts,
which is like the definition of most data leaders I talk with.
Noise and frustration of busy work, overwork, and overwhelming charts and graphs
(37:22):
about what success is and the progress of their team. So if this has been
interesting to you, if this is a compelling framework
and something that you see would benefit your team; if you need a
definition of success and currently don't have one, or you have an
overwhelming one, or you have just been hoping for the
best and trusting your gut; if you want to explore this framework
(37:43):
in depth, I do have an offer that I'm putting together.
The Measuring Success Launch Pad. And what it looks like
is: in two weeks, we can work through this framework
and define and measure what success looks like for
your team. And so let me show you how this breaks down.
The Measuring Success Launch Pad helps you define and measure peak success on
your data team in just two weeks so you can execute, prioritize,
(38:05):
and get noticed by leadership like never before. So defining peak success,
measuring it in two weeks so you can execute, prioritize, and get noticed by
leadership. And here's how the program will work over the course of two weeks.
First off, we do a strategy session, and that's working with
me. I call it the Strategy of Success: a 90-minute session, and this is
where we do the bulk of the outline, defining the metrics for the
(38:27):
destination, waypoint, turn signals, and gauges template.
All right, so that's the Strategy of Success. Step one in this process
is getting clarity on those and starting to work through them. Then next you do a
take-home exercise in this workbook. You do this take-home
exercise, which takes you a couple hours, where you identify
the incentives that are at play and how the metrics that
(38:48):
you defined during the Strategy of Success
session are going to show up on your team, and whether they are the incentives
that you want. Next we'll do a design-and-define
session. This is a 90-minute session where we get into the weeds now
about what shape, color, and texture these metrics are
going to have and how they're going to be
operationalized in your organization and in your team.
(39:11):
And after that you have a communicating success
template, and this is another take-home exercise that you'll do on your own,
one to two hours of work again, where you're basically working through a
template of how to communicate success. A template for how to communicate
success to your leadership. That take-home is step four. Finally there's
our 30-minute meeting where we work together to
(39:32):
finalize and launch. So we review all your work, answer any final questions,
make sure you have everything you need to launch these metrics and start
operationalizing success for your team. So I have room to work with three
data teams this summer through the Measuring Success Launch
Pad, really across these five steps. And if that's
interesting to you, if this framework looks like something that would benefit
(39:52):
your team and you need this clarity and confidence as a leader,
reach out, send me a DM or drop me an email if this is something you'd like to work
on. I have room to work with three data teams this summer
on this framework. This is called the Measuring Success Launch Pad.