
June 27, 2025 · 44 mins


In this episode…

How do you prove the value of your people programs when budgets are tight and leaders demand evidence? 

This week's guest is Dr. Alaina Szlachta, author of the book Measurement and Evaluation on a Shoestring.

In this illuminating conversation, Alaina shares practical approaches to measurement and evaluation that don't require a PhD in statistics or massive resources.

Drawing from her experience across corporate, non-profit and academic sectors, Alaina reveals why data literacy is the foundation of effective evaluation. She explains how her journey from "data-phobic" to "data enthusiast" shaped her practical approach to demonstrating impact with limited resources.

The game-changing "Impact Hypothesis" framework Alaina shares creates a logical chain connecting learning programs to business outcomes, without requiring complex statistical analysis. Rather than attempting to prove perfect causation (which is nearly impossible in complex organisations), she demonstrates how to identify meaningful correlations and trends that reveal program effectiveness.

Most provocatively, Alaina addresses the paradox many organisations face: leaders want ROI data from learning initiatives but fail to provide the resources needed to gather it. 

Her "build, borrow, buy" strategy offers practical ways to leverage existing organisational assets to overcome this challenge.

For HR professionals tired of fighting for budget without evidence, and business leaders questioning the value of people investments, this conversation offers a refreshing middle path. 

Alaina's approach makes measurement accessible to everyone, transforming L&D from a cost centre to a strategic driver of organisational success.

Curious?

🎧 Listen now

Connect with Alaina on LinkedIn and mention this podcast to receive a special gift. 

Visit Alaina's website

Buy Measurement and Evaluation on a Shoestring

Follow

Leading People on LinkedIn

Leading People on Facebook

Connect with Gerry

Website

LinkedIn

Wide Circle


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome to Leading People with me, Gerry Murray.
This is the podcast for leaders and HR decision makers who want to bring out the best in themselves and others.
Every other week, I sit down with leading authors, researchers and practitioners for deep-dive conversations

(00:22):
about the strategies, insights and tools that drive personal and organizational success.
And in between, I bring you One Simple Thing: short episodes that deliver practical insights and tips for immediate use.
Whether you're here for useful tools or thought-provoking ideas, Leading People is your guide to better

(00:44):
leadership.
If you work in HR, learning and development, or run any kind of people-focused program, have you ever been asked: what's the ROI

(01:06):
on this?
Or, worse, had your budget cut because there wasn't enough
evidence?
This week, I talked to Dr Alaina Szlachta, author of Measurement and Evaluation on a Shoestring.
She shares a practical approach to measuring impact without needing a PhD in statistics or a massive budget.
We talk about using the right kind of data to show value, why storytelling is critical and how L&D can become a strategic

(01:30):
enabler of growth.
If you've ever struggled to prove that your work matters, this episode is for you. So let's dive right in and hear what Alaina has to say.
Alaina Szlachta, welcome to Leading People.

Speaker 2 (01:46):
I am so excited to be here with you, Gerry.

Speaker 1 (01:49):
Well, thanks for joining me.
I believe you're coming in from Austin, Texas, isn't that right?

Speaker 2 (01:53):
Austin, Texas, where it is balmy and humid today.

Speaker 1 (01:58):
Wow, there'll be people in the world wishing they
were balmy and humid.

Speaker 2 (02:02):
Yes.

Speaker 1 (02:02):
Come to Texas anytime, you'll get all the heat that
you could ever want.
Right.
So I attended a fantastic webinar workshop you did a couple of weeks ago actually, for the Transfer Effectiveness Institute that's based here in mainland Europe.
And then, of course, I got to know a little bit about your work, and you published a new book last year and we're going

(02:26):
to get to that shortly.
But first, to kick things off so my listeners can get to know you better, how did you come to focus your work on measurement and evaluation, and what people, places or events stand out in your journey? Or were there some epiphany moments in particular that led you to write this book?

Speaker 2 (02:44):
All wonderful questions.
The easy answer, Gerry, is this: I happened into it, not unlike many people who fall into corporate training.
I happened to be very lucky in that early on in my career I worked for companies and people and in situations where I was a

(03:05):
competitive public speaker, I was a former athlete, and all of those dynamics required really good data to improve performance.
And, Gerry, you and I talked briefly about your experience in music and how, when you teach something like music, or when you're an athlete, or when you're competitive and you're

(03:25):
performing, you need really good feedback to be able to improve and feel good and confident about what you're putting out into the world.
So I grew up, literally as a child, in this competitive, performative space, and feedback is data. Taking feedback and adopting it and applying it in your life, those are just things that were intuitive to me.
And then, as I matured and moved on into my career, I worked for a company that was very data-driven (my first job out of college), and then I went and got my master's and PhD, where I was working with data and statistics and publishing

(04:07):
and wanting also to teach and be a great professor, and the feedback that I would get in all my evaluations I took to heart and would apply and improve my work.
And then I worked in a grant-funded nonprofit dynamic, and so our grants were very data-driven and results-oriented, and we had to leverage data not only in our programming, but

(04:33):
be pretty adept at using data to tell the story, to make sure our funders knew that we were doing what we said we were going to do with our grants.
So I spent literally probably 15, 20 years, if you think back to me being a young performer, in that world that I operated in. And then all of a sudden, Gerry, I decided I want to go work in the corporate world.
Before no one would take me seriously in the corporate world

(04:56):
because I'd spent too much time in the nonprofit sector or higher ed, let me go take my opportunity in corporate. And what I learned is that in corporate training, people aren't using data, they don't have feedback loops.
Data enablement's not a thing.
There aren't any logic models or hypotheses built into our programs, and I thought, what is going on here?

(05:18):
This just isn't aligned with all the years leading up to my experience in corporate, where data was so important and a pivotal part of our work. And so that was really what launched me into where I'm at today: how can I help the corporate world get better at working with data, because it literally is the one asset that makes our work really meaningful.

(05:40):
So that's what I do today.

Speaker 1 (05:43):
Right.
And for anyone out there who's going, oh no, no, data, data, data: I think what's going to happen over the course of this conversation is that you're going to present it in such a way that it's accessible to even the average person who isn't so, maybe, mathematical.
So we're talking about really quality information that helps you improve or helps you determine whether things are

(06:05):
working, et cetera, et cetera.
So now, the book, if I'm not mistaken, is called Measurement and Evaluation on a Shoestring, right, and it's part of a series from ATD, which stands for the Association for Talent Development.

Speaker 2 (06:21):
Yeah, they're one of the oldest, largest publishers.
I didn't know this until I started working with them in the talent development sector.

Speaker 1 (06:30):
Yeah, and they have a series for professionals who
have limited time and budget, which is the Shoestring series.
So what we want to get to now is: why this book, and why now? And who's the book for, and what gap did you set out to fill when writing it?

Speaker 2 (06:48):
So this book literally landed in my lap.
I couldn't be more honored to be invited to write this book.
So ATD was looking for an author for this Shoestring series.
I had just done a presentation at ATD Core 4. This was back in 2021. I had done some of my own original research during COVID.

(07:08):
Every one of us has a COVID project that we did.
And for me, being a former researcher and a data nerd, I wanted to do a study to understand why people are struggling with measurement in the corporate world.
Because, remember, I worked in the corporate world for a couple of years and then I left to start my own business, and it was around that time that I did this study.
So, presenting the results of the study in a conference

(07:32):
presentation, one of the editors from the ATD team was in the room, reached out to me and said, hey, would you be interested in pitching to write this book?
So how could I say no to that?
And that was really what landed me hyper-focused on measurement and evaluation.
And, Gerry, what's really interesting is, as I wrote the

(07:53):
book, and yes, we're talking about measurement and evaluation and different models and strategies, and I tell all kinds of stories to make it more accessible, because data, measurement, evaluation, just like you said, Gerry, these things are intimidating to people, and my goal in the book is to make them less intimidating and more accessible

(08:14):
and to help people realize that you've got a lot of the tools and thinking that you need already to be able to do this work well.
But as I wrote the book, and as I've been out in the world speaking and doing podcasts, talking to people, I realized that measurement and evaluation is the outcome of being data literate.

(08:35):
So it's not that we need more information about measurement and evaluation. We don't need any more models.
There are so many incredible models and theories, and there's such great research out there to help us become more strategic and to apply other people's strategies to our own measurement approaches.
But what makes all of that easier is data literacy.

(08:58):
People hate data, and in my book I talk about how I was also that way.
I hated data, I hated mathematics, and statistics didn't really come naturally to me as I was coming up. And so I realized that data lives everywhere.
Data could be anything.
Data is information, it's feedback, it's what time is it,

(09:19):
it's how is the temperature. Learning from all of that feedback that's all around us, and applying that in a meaningful way: it's as simple as that.
And so my goal in the book is to help people understand how to better work with data, and then I tell all kinds of stories of how I've done that in my career with limited resources, so that people can walk away feeling

(09:42):
like I can do this.
And that's really the goal.

Speaker 1 (09:49):
On Leading People.
The goal is to bring you cutting-edge thought leadership from many of the leading thinkers and practitioners in leadership today.
Each guest shares their insights, wisdom and practical advice so we can all get better at bringing out the best in ourselves and others.
Please subscribe wherever you get your podcasts and share a

(10:10):
link with friends, family and colleagues, and stay informed by joining our Leading People LinkedIn community of HR leaders and talent professionals.
I think there are probably a lot of people out there, including myself.
You know, it's not about, like, if you take your example you

(10:32):
started off with, you know, Austin is humid today.
Well, I mean, you don't need to be able to calculate the temperature to know that 95 degrees Fahrenheit, or maybe 35 degrees Celsius, is pretty hot, right? And if you get a percentage for the humidity of 60, 70 percent, you know.
So you don't need to be able to calculate it to use it.

(10:53):
Basically, and this is something I discovered when I did my MBA, I was surrounded by all these super bright engineers, you know, and they were like super bright with maths and stuff, because their whole studies and everything was based on these things.
And it was only after that, working in the corporate world (I ended up working in finance and strategy), that I

(11:15):
realized that actually there's lots of great people out there who can compute stuff, and increasingly today you're going to have more and more advanced tools that can do it.
It's understanding what you can do with it. What decisions can you make based on that? Can you turn the data into useful information that will help you make a decision, right, and either tell you something you didn't know or confirm something you thought you knew,

(11:39):
right?
And maybe that's actually not a bad way to segue into a powerful tool you talk about in the book called the Impact Hypothesis.
Can you walk us through what this is and how it helps learning and development teams move from vague goals to clear, measurable outcomes?

Speaker 2 (11:58):
So I mentioned earlier in our conversation that I spent a lot of time in higher education and research, and then in the nonprofit sector, where we worked with grant-funded and donor-backed programming. And so in those sectors, something called a logic model is very common.
You cannot put a grant in front of a prospective funder without

(12:20):
a kind of logic behind what you're doing.
Similarly, there's logic behind performance.
If you think about sports, there are sports plays that say, this is how we're going to go about trying to achieve this goal, or navigating where we're at in the game. This is how I can become faster as a runner.
There's a playbook, there's a strategy, there's a logic behind

(12:43):
what you're doing, and that's the impact hypothesis for L&D. And it's not just applicable to learning and development.
In fact, I use the same framework in the nonprofit sector with executive directors and leadership development companies who are like, how do we create some kind of logic behind what we're doing?
And so it's just that: it's logic.
(13:05):
Create some kind of logicbehind what we're doing.
And so it's just that it'slogic.
It's tying a few key datapoints together to tell a story
and to ultimately be able totest what kind of impact you're
having, and so it's reallysimplified version of a logic
model.
Anybody could go Google searcha logic model.
You'll see some pretty complexthings inputs, outputs, outcomes

(13:26):
, results and it seems a littleoverwhelming.
We really don't need all ofthose data points, though they
could be useful, especially ifyou want to calculate ROI.
You need to have a little bitmore detail with the inputs,
outputs, et cetera.
But in the end it's ahypothesis that we want to be
able to test and it links somecore evidence together.

(13:47):
So we have a program on the one end, and that program is doing something for our organizations, our communities, our people, and we want to tell a story about what it contributed to the organization, based upon our learning program.
And the core pieces of that are the two questions I love:

(14:08):
what becomes possible? Or, so what?
It's that simple.
So we have a program, an HR initiative, we're doing it, and when people engage in the initiative, what becomes possible for them?
Then we have an answer to that question.
And then, let's say we have a learning program, let's call it a leadership initiative, and you ask yourself the question,

(14:32):
well, what becomes possible for leaders when they participate in this program? Well, we want them to be better communicators.
Okay, great.
So then, if they're better communicators, what then becomes possible? Well, maybe we'd have less conflict between team leaders and the team members.

(14:53):
We hope that when people have better communication skills they can better navigate conflicts.
And then you ask the question again: well, so then what becomes possible when there's less conflict on the teams in your organization? Oh well, they'll become more productive.
Now we've got this really nice impact hypothesis, we've got a

(15:15):
chain of evidence, and now we can go test and prove it.
People participate in this leadership program to become better communicators, so that teams can have less conflict, so that the teams can be more productive.
Then we ask ourselves, well, what data do we need to test this chain? How do we test that this domino effect actually took place?
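
For readers following along who want to make this chain concrete, here is a minimal sketch in Python of the impact hypothesis as a checkable structure. The claims and metric names are illustrative stand-ins for the leadership example in this conversation, not anything prescribed by the book.

```python
# A minimal way to hold the chain of evidence described above: each link pairs
# a "what becomes possible" claim with the data point that would test it.
# The claims and metrics below are illustrative, not prescribed by the book.
from dataclasses import dataclass

@dataclass
class Link:
    claim: str   # what becomes possible at this step
    metric: str  # the data point that tests the step

impact_hypothesis = [
    Link("Leaders complete the program", "completion rate"),
    Link("Leaders communicate more clearly", "communication-clarity survey score"),
    Link("Teams have less conflict", "sentiment of recurring meeting notes"),
    Link("Teams are more productive", "deadlines met in the project tracker"),
]

# Reading the chain top to bottom gives the domino effect to test.
for step, link in enumerate(impact_hypothesis, start=1):
    print(f"{step}. {link.claim} -> test with: {link.metric}")
```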

(15:36):
And we figure out, well, we need to have some kind of completion rate, obviously.
If people aren't participating in the program, well then, none of the other things can become possible. So we do need to track completion rates.
Okay, great, that one's pretty easy.
Well then, what becomes possible when people complete the program? Well, they are better communicators.
Okay, well, what are some of the indicators that would tell us

(15:58):
that people are better communicators?
There's probably some problem within the organization or on the leadership team.
Maybe people aren't clear in their expectations, so maybe increasing clarity of communication is an outcome.
So we do have to drill down a little bit on, well, what do we mean by better communicator, because that can look an

(16:20):
infinite number of ways, so we do have to drill down on that.
But once you get clear on, how do we know that somebody is a better communicator because they participated in the program? That's a data point.
Well, then we take that one step further. Well, how do we know that teams have less conflict?
Maybe we can look at some of the meeting notes.

(16:41):
Now everybody is using an AI tool to sit in on Zoom meetings, or real in-person meetings, and we can actually do a sentiment analysis.
When you look at those meeting notes, what kind of sentiment is happening among the team?
And we could actually do a sentiment analysis of a recurring team meeting to see if it's becoming more optimistic, or even

(17:06):
neutral.
Maybe it just goes from negative to neutral, right?
But that sentiment analysis is a great indicator that maybe conflict is going down.
And then productivity is an easy one.
Obviously, are people meeting deadlines? Are people having as many calls as they need to have with prospective client leads, right?

(17:27):
So whatever productivity measures are probably already in existence in your organization, pick any one of them that's relevant to the team and to the leaders overseeing that team.
Now we've got a chain of evidence and some data to test: did these things unfold in the way that we imagined?
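
For anyone who wants to experiment with the meeting-notes idea Alaina describes, here is a minimal Python sketch of that sentiment check. It assumes the notes of a recurring meeting have been exported as plain-text files (the folder layout and file naming are hypothetical), and it uses the open-source VADER analyzer from the vaderSentiment package as one possible tool; the episode does not name a specific analyzer.

```python
# Score the notes of a recurring team meeting over time and watch the average
# sentiment move from negative toward neutral or positive.
# Assumes `pip install vaderSentiment` and that each meeting's notes sit in a
# plain-text file named like 2025-06-01.txt (a hypothetical layout).
from pathlib import Path
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()

def meeting_sentiment(notes_dir: str) -> list[tuple[str, float]]:
    """Return (meeting_date, compound_score) pairs, oldest first.

    VADER's compound score runs from -1 (very negative) to +1 (very positive),
    so a recurring meeting drifting from, say, -0.3 toward 0 suggests the
    conflict trend line is moving the right way.
    """
    results = []
    for path in sorted(Path(notes_dir).glob("*.txt")):
        text = path.read_text(encoding="utf-8")
        score = analyzer.polarity_scores(text)["compound"]
        results.append((path.stem, score))
    return results

if __name__ == "__main__":
    for date, score in meeting_sentiment("meeting_notes"):
        print(f"{date}: {score:+.2f}")
```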

Speaker 1 (17:45):
You've actually just reminded me of something that came up in the webinar I attended with you, which was this idea that perhaps you need to gather some baseline data, or data just to know what your current situation is like, what your present state of things is, because that helps you know if you've shifted the dial.

(18:06):
It doesn't necessarily imply cause and effect, but it can help you say, well, we started here and we're starting to see something different, which is actually an improvement on what we had before.
What's your experience of that with working with your clients?

Speaker 2 (18:25):
Yeah, your point is so important, and I hate to say this, Gerry, but I forget this because I've worked in data-rich environments throughout my career.
Being a data nerd, I know intuitively that we need a baseline, and so oftentimes the baseline data is already there

(18:46):
and we just have to pull it out in conversation.
So, going back to the same example: productivity is down, teams have high conflict and leadership teams aren't communicating well. Well, why did we pick those things?
We go back to the questions I asked in the beginning.
Okay, well, we're doing this leadership program.
When people go through it, what becomes possible on the other

(19:08):
side? Better communication.
Well, why did you answer that way? You could have said any number of things.
You could have said that we want people to be able to delegate better, or we want people to hire better, we want people to have better performance evaluation conversations.
You could answer the question in an infinite number of ways:

(19:30):
what becomes possible when people participate in your leadership development initiative? Your answer to the question probably comes from some baseline evidence that people are poor communicators.
So I would ask you, Gerry, if you were the one that said, people aren't communicating well, we want them to improve their communication, I'd say: Gerry, what leads you to believe that our leadership

(19:55):
doesn't communicate very well?
Well, I've heard some anecdotal evidence from people.
They submit anonymous feedback to our HR team that this particular leadership group isn't clear, they're not giving clear directives to their team. The team feels lost, they don't know what to do.
Okay, that's baseline data, that's baseline evidence.

(20:18):
We could go validate that and do some informal surveys, or we could do some focus group interviews, or put out an anonymous survey that asks for a little bit more data on this.
But baseline data already exists.
It is in the intuitive gut-check feeling, it's in anecdotal conversations, it's in our key performance indicators.

(20:41):
So take that equation all the way out to productivity.
I would ask you the same question, Gerry.
Well, what becomes possible all the way at the end of this chain of evidence is that the teams are more productive. Well, what leads you to believe that productivity is a challenge right now?
Well, because people aren't hitting their deadlines.
Okay, what data are you looking at to know that they're not hitting their deadlines?

(21:02):
Well, we use Trello to manage our projects, and I'm seeing this one team is just not hitting their deadlines consistently.
Okay, great.
So if we're successful, I should look at that same Trello management system and be able to see that people are hitting their deadlines, if they participate in this program and they become better communicators and there's less conflict.

(21:24):
But that's the thing, Gerry.
We can't just look at people participating in the training and then, let's look at productivity.
I know we want to be efficient, we want to use our resources wisely, but this is one of the challenges that I see everywhere in leadership initiatives, within some of the biggest and best leadership development companies.
They're doing this and it's awful.

(21:45):
They say, people take our programs and there's more psychological safety in organizations as a result.
I'm like, BS. There's not enough data that connects the dots between people participating in your programs and psychological safety.
We've got a few more things that we have to show that are

(22:06):
correlated, which is: what are people doing differently to influence psychological safety? And did you focus on those things? Did you teach people? Did you give them practice opportunities to be better communicators, or to be more inclusive, or whatever it is?
So we need all of the evidence in that impact hypothesis.
Do not ever cut corners by trying just to say people

(22:30):
completed training and productivity is up.
That's not enough data, it's not sufficient, and any smart stakeholder is going to question you and say, well, how do I know that it was the training that contributed to the change in productivity?
That's where those data points in between come in. Well, people are better communicators. Well, how do you know? Because we did some baseline data and now we can see that

(22:50):
communication changed.
Also, conflict changed.

Speaker 1 (22:54):
Yeah, right, so that's a really important connection that we don't want to skip. Yeah, and from my experience working in finance and financial planning in my early career, one thing that finance people will look at when they look at data outcomes, or the patterns in data, is they want to know what's driving the data.
So you need some level of granularity.

(23:14):
That's right, because many things could be contributing to psychological safety.
That's right.
And you want to make sure you're targeting the things that have the most impact, right, and not spending your money on something that makes a 1% difference when you could be spending your money on something that makes a 20% difference.
That's right.
So if you don't have some data points along the way, you're probably going to get misguided information at best, right?

(23:38):
So you talk about measurement not having to be perfect; it just has to be useful, and I'm sure that's quite liberating for many listeners. What sort of mindset shifts do L&D professionals need to make to start small and measure meaningfully?
You're listening to Leading People with me, Gerry Murray.

(24:02):
My guest this week is Dr Alaina Szlachta, an expert in practical evaluation and author of Measurement and Evaluation on a Shoestring. Coming up, we explore how to connect data to decision-making, why engagement scores are just a starting point, and how leaders can build a culture of learning that drives results.

(24:25):
Now back to our conversation.

Speaker 2 (24:29):
So I had a really awesome one of those light-bulb conversations with a good friend of mine, and it was about causation versus correlation.
She had believed something her entire career (she's got about the same number of years in L&D as I do), and we were just having an informal

(24:50):
conversation over a beer, and I said, Liz, why is it that you believe we have to show that learning caused this thing, this change, this outcome?
And she said, well, if it didn't cause it, then it wasn't effective.
And I said, no, think about any kind of initiative that's out

(25:14):
in the world.
It's never going to be the sole cause of the outcome.
Maybe in a randomized controlled trial with a drug study, where somebody takes a prescription pharmaceutical and they get some kind of outcome, and we have a control group and it's a very controlled environment, then, yes, we can talk about causation. But in the real world we are never talking

(25:38):
about causation.
It is just fundamentally inaccurate.
And so what we're looking for is simply trend lines.
Are things trending up or down? Whatever you're trying to accomplish, we just want to see trend lines changing, because there are so many other confounding influences in working with people.

(26:01):
How does someone feel that day?
Did their mom die?
Did their dog die?
Is their child going from puberty into becoming a teenager, and they're just really difficult to deal with, and it's hard for them to show up at work, right, and be as productive as normal?
So there are so many other things that contribute to how people perform that we can never prove causation, and we shouldn't try.

(26:25):
What we can do is just use that chain of evidence and see if we can see trend lines.
Going back to the example I mentioned earlier: people participate in a leadership program. It's designed to make them better communicators and better at dealing with conflict on teams.
If we can see that communication has improved, that conflict has gone down and productivity has gone up, and we

(26:47):
can see all of those trend lines have changed as people engage in a program, that's a pretty good indicator that our program was effective.
Yes, we can drill down into more data. We could calculate the ROI if we really wanted to, but we don't need that to be able to know that the initiative was a good investment of time and money.
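
As a small illustration of looking for trend lines rather than causation, here is a sketch that fits a simple least-squares line to a weekly metric with NumPy and reports the direction. The metric names and numbers below are made up for illustration, not data from the episode.

```python
# Fit a simple least-squares line to a weekly metric (conflict score, missed
# deadlines, completion rate...) and report which way it is trending.
import numpy as np

def trend(values: list[float]) -> float:
    """Slope of a least-squares line through equally spaced weekly observations."""
    weeks = np.arange(len(values))
    slope, _intercept = np.polyfit(weeks, values, 1)
    return float(slope)

# Hypothetical weekly averages while the leadership program runs.
conflict_score = [0.62, 0.58, 0.55, 0.49, 0.44, 0.40]   # lower is better
deadlines_missed = [9, 8, 8, 6, 5, 4]                   # lower is better

for name, series in [("conflict", conflict_score), ("missed deadlines", deadlines_missed)]:
    direction = "falling" if trend(series) < 0 else "rising"
    print(f"{name}: slope {trend(series):+.3f} per week ({direction})")
```

Seeing several of these slopes move the right way together, alongside completion data, is the kind of correlated evidence described above; no single slope proves the program caused the change.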

Speaker 1 (27:09):
Yeah, it's probably this tendency in society to misinterpret or misunderstand that a lot of things are systemic, and people are looking for very black and white, linear answers.
I mean, it's OK: if I put an egg into a saucepan of boiling water, after several minutes it will boil, so I have a certain cause and effect that's easy to observe. But organizations, with the

(27:31):
complexity of people and the dynamics and everything else, are going to be hard put to find the cause and effect relationship in the same time and space.
It could be that there are cause and effect relationships, but they might not be easy to see.
So, as you say, the pattern is going to be incredibly important there.
Now let's talk about the role of leaders, because many of our

(27:52):
listeners here are leaders themselves, and some may not be sitting in L&D but they rely on it, perhaps, to train their people and develop and grow their workforce.
Why should they care about measurement, and what can they do to foster a culture where learning and evaluation are seen as strategic rather than some sort of optional thing you do?

Speaker 2 (28:12):
So there's this really interesting paradox that I think would be helpful for leaders to just reflect on for a moment.
So data from a variety of sources going back for decades says that leaders want to know some kind of return on investment.
It doesn't have to be financial, but they want to know that

(28:33):
their investments in developing people are giving them some kind of returns.
And they wish that the learning function would be better at that.
And we've seen data that says, you know, 90 percent and above of leaders want their learning function and their HR functions to be better at telling the impact story, the outcomes of

(28:54):
their programs, versus just outputs.
And this is an important distinction, I think, for listeners, leaders listening to this. This is going to be intuitive for you.
But if you're in the learning function or you're in the HR function, remember that an outcome is the thing that becomes possible because of an output.
So if somebody picks up the phone and dials 10 prospective

(29:18):
clients and has 10 sales conversations, that's an output.
The outcome of that could ultimately be conversion rates and more sales and more profitability.
But the output is just the activities that people are doing.
People complete programs, they do workshops, they have

(29:41):
conversations: those are all outputs.
We really want to focus on the outcome of all those activities, because that's what helps us to know what kind of return on investment we're getting.
And so, back to the leaders component.
So leaders want to know the outcomes of all their investments and all the outputs that are happening inside of the organization. Well, what's coming about?
Here's the paradox.

(30:03):
Well, our HR and our learning professionals, they also care about the returns of all the activities that they're doing, but they say, we aren't able to do those calculations, or tell those stories, or even to be confident that we're working with the right data, because we're not

(30:23):
given the resources to do so.
So that's the paradox, Gerry: our leaders want some kind of return on investment.
They want to know the outcomes that are coming about because of our programming. Our learning and HR people want that too, but they say the reason that we're not focused on those things is we don't have enough people power,

(30:44):
we don't have enough time, we don't have enough technology or resources to do this work well.
So, leaders, I would say, if you want to support getting more outcome data, well then we have to think about how we can provide the right resources to make sure that all of our departments have that kind of data-driven capability,

(31:08):
because it's being data-driven that gives us those outcomes and gives us the returns on what we're investing in.
And helping people be data-driven and data literate is the thing that's going to make that great change.

Speaker 1 (31:20):
Yeah, and that probably segues nicely into another theme in your book, where you propose a build, borrow or buy strategy for anybody working with limited resources.
Maybe you can unpack that a little bit, because that probably relates to the L&D person going to the leadership team, asking for things, getting pushback on what they want and then having to rethink.

(31:42):
Either they give up or they rethink: how could I maybe do this even though I'm not given everything I want?
So please maybe talk a little bit about that.

Speaker 2 (31:52):
I think it's an important conversation.
Going back to the paradox that I mentioned, with our learning leaders and the people in a learning function: I think one important perspective, as you're talking with stakeholders and asking for resources, whether that's time or money or tools, is that our leaders

(32:13):
want to know if the investments that we're making in these resources are worthy of the organization's resources or not.
So we have to be prepared to make that case: yes, this is an investment, this is a good investment of the organization's resources, and here's why.

(32:34):
And so, to your question about the build, borrow, buy framework, one of the things I talk about, and I have an exercise in the book, is: do we build it internally or do we buy it?
So here's a really good, simple example.
A lot of people in the learning and HR function, they're not great at creating reports and they're not great at

(33:07):
data analysis. It's a unique skill set. But think about our vendors: so many organizations are going to have a customer relationship management system, an administration management system, an HR system, an LMS, right?
There are all these vendors that sell tools that collect data, and they can do reporting and they have the capability of doing

(33:28):
analysis.
So one of the best things that we could do, and this is a great way to use the organization's resources wisely, is to figure it out: use the impact hypothesis and say, we want to be able to test this chain of evidence: completion rates, then increased communication, then reductions in conflict on the

(33:51):
team, and increases in productivity.
This is the impact hypothesis.
Take that impact hypothesis either to your vendor and say, hey, what data could we use in our learning management system or other data systems? Sometimes even Power BI is an incredible tool to lean on, and

(34:12):
you could go to whoever in your organization is overseeing your business intelligence.
Bring this impact hypothesis and say, we're going to need data from a few different places in the organization.
What can I do to make this chain of evidence possible, and how do I create a report that can help me to track and monitor

(34:33):
if these changes are comingabout?
The impact hypothesis says: here's what I want to do, here's the data that I'd like to use. And go talk to your vendors, talk to the business intelligence people in your organization (that might even just be your CEO) and say, hey, how can I test this hypothesis? That way it uses the organization's

(34:54):
resources wisely, without having to go take expensive courses and learn how to do data analysis all on your own. Leverage the assets that the organization already has: the people, the vendors, the knowledge.
You might have a colleague who's in marketing that's a super data nerd like me, and you could ask them, hey, how would I go

(35:14):
about making a report? How are you guys doing reports in marketing, and how can I learn from you?
So the idea is, once we know what we're trying to accomplish, we can lean on the knowledge and resources and supports that are already in the organization.
But it's a lot harder to do that in reverse.
I can't go to my marketing colleague and say, hey, I want

(35:36):
a report, without knowing what's going to be in that report.
My marketing colleague is not going to be able to tell me the impact hypothesis. I have to come up with that on my own.
But once I've got it, then it makes it so much easier to ask for resources, or to ask, how do I accomplish this with the resources the organization already has?
And that's how we can use that build, borrow, buy perspective

(36:01):
to use the resources we have to make accomplishments in our work.

Speaker 1 (36:05):
So that's even a tip for people on our side of the fence who provide services to organizations.
It's something that in my company, Wide Circle, we offer to clients, if we can give them data that they can use, and in some cases we can.
I'm a big fan of that because I come from the world of business, outcomes and strategies, and, you know, do

(36:29):
they actually deliver what we wanted? Not, like you say, the output per se, but do they produce an outcome for the organization?
You're talking about the difference between outputs and outcomes, and too often the conversations are about inputs: like, how many hours would it take to do that, and how

(36:49):
many people can go on the training, and how many books or course materials will I get, rather than thinking and working your way back from the outcome. Because maybe a course is not what you need to achieve what you wanted, and if you're not starting from the outcome, you'll never get there.
Now, just one last thing.
So, basically, you've covered the data.

(37:09):
You don't need to be massively data literate to be able to get data and to use it to make informed decisions.
What about getting stakeholder buy-in, especially when, in some organizations, measurement and evaluation isn't a top priority?

Speaker 2 (37:26):
Yeah, I would say that in the organizations where measurement and evaluation is not a top priority, it's probably because organizational leaders themselves aren't incredibly data-driven, and I experienced this in one of the organizations I worked for.
I have a very data-driven mindset and I believe in the power of data, and I want to know how effective my programs

(37:48):
are.
How do I improve them?
How do I use the resources that we have wisely?
We only have so many hours to train people. Maybe I should cut this and invest more in that. Well, how do I know how to navigate those investments and resources without data?
So I think it's these kinds of questions that we can bring to our stakeholders, and we can say things like, hey, for

(38:11):
example, we've had this onboarding program.
It was one of the programs I was accountable for many, many years ago.
I was the one that trained all the new employees to do their jobs, and I wanted to know, how can I improve? How do I know that our program is leading to the efficiency and effectiveness of employees when they get on the job?

(38:33):
I need some data on that, which simply means my test scores on how people's knowledge has changed. I had data on that, but that only told a fraction of the story.
I want to know how people are performing against the expectations of performance once they leave training and they

(38:53):
get on the floor, and what are the things that they struggle with the most.
I needed managers to be bought into working with me to give me that data. And so it's just approaching your work from a sense of curiosity, like, I want to be better.
I want to use the organization's resources wisely, and training

(39:15):
and onboarding and coaching, those are expensive things, whether you're building it internally or you're leveraging an external resource.
So we want to be asking those questions, and approach our leaders and managers with: I want to make sure we're using our resources wisely.
Especially today, when economies are uncertain and God

(39:37):
knows what the future looks like, the best thing that we can do is use our resources wisely.
And so, leveraging that perspective and saying to leaders: I want to make sure we're using our resources wisely, but I need better data, and I need managers to get on board and give me some of the feedback that they see when people go into their jobs in the first 30 days.

(39:58):
I need to know where they struggle, because then I can change my training to maybe help address some of those struggles, so that they're struggling less and they're being more effective on the job.
So that kind of perspective, and that healthy sense of curiosity to get people on board, works really well.

Speaker 1 (40:15):
Okay, so there's so much great wisdom and advice there.
So, coming to the end, Alaina, if we could get everything synthesized, what are a few key insights, or the big idea, you'd like my listeners to take away from this conversation?

Speaker 2 (40:31):
Whatever it is that you're trying to accomplish, getting stakeholder buy-in, getting your managers to give you feedback and data that helps you understand how employees are performing after they leave training, whatever it is you're trying to accomplish, it's so much easier when you know what you're doing and why. And you can organize it in the impact hypothesis, and you could use my tool, you could use a logic

(40:53):
model, you could use the five whys framework.
Why are we doing this? Why does it matter?
So long as you have something that you can present to a stakeholder, a manager, even a learner.
I love using the impact hypothesis at the beginning of my learning programs so that every single participant knows why we're doing this.
You're doing this program so that this becomes possible and

(41:15):
this becomes possible, and we want you to be part of this journey, giving us feedback and helping us to make this impact possible.
So, yeah, know what you're doing and why, and use some kind of organizing tool to organize your thinking and give you that kind of clarity.
We're doing this so that this becomes possible and this becomes possible: string it out, show the details, and when you

(41:39):
do that, you get a lot more buy-in, resources and support than you could imagine.

Speaker 1 (41:45):
One of the reasons why I do this is because I always learn something, and now you've given me a very interesting idea to explore: how to take your hypothesis framework and maybe explore that with participants at the beginning of a particular training. So that's got me thinking.
And finally, I'm sure there are lots of listeners out there

(42:06):
thinking now, how can people get in touch with you? Because they might want to reach out to you to find out more. And do you have anything special to offer them?

Speaker 2 (42:15):
In fact I do, Gerry. I appreciate the question.
So the easiest way to get in touch with me is on LinkedIn.
I am Dr Alaina Szlachta, just like you see here. You can find me by putting that name into the search tool.
I am giving away a free chapter of my book.
So we talked a lot about data literacy and the impact

(42:36):
hypothesis.
I will give away those chapters. Chapters, actually; we'll make this plural.
So if you say to me on LinkedIn, I listened to the Leading People podcast, loved your conversation, can I please have two free chapters of your book? I will happily share them with you right on LinkedIn, and that's how you can stay in touch with me and get that free offer.

Speaker 1 (42:56):
Well, that's fantastic.
I think I'll apply for it myself.
Okay, as always, thanks, Alaina, for sharing your insights, tips and wisdom with me and my listeners here today.

Speaker 2 (43:09):
You are very welcome. Thanks for having me.

Speaker 1 (43:10):
Coming up on Leading People.

Speaker 3 (43:15):
Lots of people go into management without any kind of training.
Lots of people are selected for positions as managers on the basis of being good at something else.
I'm sure that, you know, a lot of your conversations kind of reflect that truth.
So there's a whole universe of people who are in positions of authority and don't quite know what to do, or are being managed

(43:36):
by people who plainly don't know what to do, and that felt like a very big opportunity.

Speaker 1 (43:40):
My next guest is Andrew Palmer, senior editor at The Economist, Bartleby columnist and host of the Boss Class podcast.
In a fast-paced conversation we talk jazz, power and delegation, and what Andrew has learned from interviewing some of the world's top managers.
It's a witty and insightful episode that you won't want to
(44:04):
miss. And remember, before our next full episode, there's another One Simple Thing episode waiting for you: a quick and actionable tip to help you lead and live better.
Keep an eye out for it wherever you listen to this podcast. Until next time.