Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Michael Hartmann (00:01):
Hello everyone, welcome to another episode of OpsCast, brought to you by MarketingOps.com, powered by all the MoPros out there. I'm your host, Michael Hartmann, flying solo today. Joining me today to talk about what he calls Moneyball for lead scoring is Lucas Winter. Lucas is a lead scoring consultant joining us today from Oxfordshire, and I'm sure I'm mispronouncing that, in the UK.
(00:22):
He started his MoPro career in 2014 as a technical customer support rep for MarTech Professionals. Since then, he's managed his own team of marketing ops professionals and has even flown solo as the whole marketing ops department for a company of 3,000 employees. He is now a lead scoring consultant at the Inspired Marketing Group. So, Lucas, thanks for joining me today.
Lucas Winter (00:45):
Cheers, Michael.
Yeah, this is a lovely way tospend a Wednesday night.
Michael Hartmann (00:48):
Yeah, oh yeah, it's evening for you too. So how badly did I butcher the name of the city where you live?
Lucas Winter (00:57):
I don't think it really matters. But yeah, it's Oxfordshire, but I don't think anyone's going to be testing you on that again, Michael.
Michael Hartmann (01:06):
Yeah, I think I'm going back to The Hobbit, you know, and the Shire, right, and I can't break that habit. Well, good, you'll just make fun of us, you know, as typical American mispronunciation. So before we get into this. Like I said, we're going to be talking about Moneyball, and Moneyball for lead
(01:28):
scoring. That was a term you came up with when we were talking before. What do you mean by that?
Lucas Winter (01:37):
Yeah, I mean, you just slipped up on the pronunciation of Oxfordshire, and now here I am, the Brit, talking to you about baseball, so the shoe's immediately gone onto the other foot. In terms of me, I've never been to a baseball game, I've never watched baseball on TV, I've never played the game myself, but I am
(02:00):
familiar with the concept of Moneyball, and this is something that really got me inspired about lead scoring. For those of you who don't know, the story of Moneyball is about the Oakland A's, who were a baseball team with the second smallest budget in the league, and there were other teams that had a budget three times the size. But the Oakland A's consistently made the playoffs,
(02:21):
consistently topped their division, and the reason they managed to do that was because they were evaluating players differently to all the other teams in the league. The question everyone was asking was why, but they weren't really liking the answer. So that's kind of the approach that I've taken, which is to look at the leads in companies' databases and find out not which
(02:45):
contacts the sales team wants to speak to, but which customers are actually ready to be sold to, kind of flipping that conventional wisdom on its head.
Michael Hartmann (02:59):
Yeah, well, I am familiar with Moneyball. I have played baseball, but very briefly as a young boy. I wasn't very good. I still love the game, and I do not understand cricket.
Lucas Winter (03:14):
I won't pretend to
understand every facet of
baseball.
Michael Hartmann (03:19):
That's all right. Well, let's jump right in. So when we talked before, one of the things you talked about was how, in many ways, lead scoring is transitioning to solutions that are AI-driven, and, if I understand, your position on this is that you don't believe that AI-based
(03:41):
solutions are quite ready, that there are some issues that they have today. So, first off, did I understand that correctly? Second, if so, what kinds of issues have you seen with some of the AI-enabled solutions for lead scoring?
Lucas Winter (04:01):
Yeah, one thing I'd say to the listener: if you had four minutes on your sweepstake about how long it was going to take us to talk about AI, you can go to the pay window now. In terms of where we've come from, which was kind of a feels-right lead score, to going to a data-driven lead score, you're absolutely right.
(04:21):
The next stage in this transition would be moving towards an AI-based scoring model. As it stands, there are some potential pitfalls to fall into. One of the examples that I really like to give is when looking at job title, and what I was finding is that AIs were picking out that the best job title to contact is retired. And
(04:45):
basically what was happening was, a company was selling to someone who was in work, then they'd retire, then the AI would run the lead score model and go, right, everyone who has retired has purchased. Therefore, going forward, we should be targeting retired people. Which, if you just apply an ounce of common sense, you're going to go, well, that's nonsense. Yeah, but the AI wasn't able to do
(05:08):
that, and there are countless other examples of AI kind of just missing the point on this. There's the product returns form: the first thing a customer does is purchase a product. The second thing they do is fill in a form that says, I'd like to return that product. Therefore, the only people who can fill in the form are those that have already purchased. The AI goes, I found you this really great demographic of
(05:32):
leads, they're people that have returned your product. And it's like, no, we already knew about those. They've already purchased. So there are certain pitfalls to fall into. So, whilst AI can speed you up in a lot of circumstances, you do need someone who knows what they're doing, kind of babysitting, holding hands and checking the AI's homework.
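What Lucas is describing is what data people call target leakage: attributes that only ever appear after the purchase (retiring, filling in a returns form) look perfectly predictive but are useless for targeting. A minimal sketch of screening for them, with made-up field names, not code from the episode:

```python
from datetime import datetime

# Hypothetical lead records: each observed attribute carries the date it was
# first seen, and closed-won leads carry a purchase date. Names are invented.
leads = [
    {"id": 1, "purchased_at": datetime(2024, 3, 1),
     "activities": {"job_title_retired": datetime(2024, 6, 1),
                    "visited_pricing": datetime(2024, 2, 10)}},
    {"id": 2, "purchased_at": datetime(2024, 4, 5),
     "activities": {"returns_form": datetime(2024, 4, 20),
                    "visited_pricing": datetime(2024, 3, 30)}},
]

def leaky_features(leads):
    """Flag attributes that only ever appear AFTER purchase.

    Such features correlate perfectly with buying but have no predictive
    value, so they should be excluded before training or scoring.
    """
    after_only = {}
    for lead in leads:
        bought = lead["purchased_at"]
        for feature, seen_at in lead["activities"].items():
            came_after = seen_at > bought
            # A feature stays suspect only if it is post-purchase on
            # every lead we have observed it on.
            after_only[feature] = after_only.get(feature, True) and came_after
    return sorted(f for f, suspect in after_only.items() if suspect)

print(leaky_features(leads))  # ['job_title_retired', 'returns_form']
```

Anything this flags is the "everyone who has retired has purchased" trap, and is a candidate for exclusion before the model is allowed to score leads.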
Michael Hartmann (05:52):
Yeah, it's interesting, because I've thought for a while, like, one of the things I think people get concerned about with AI is, can it replace everything that we do as humans? And one of the scenarios where I think AI has a lot of potential is similar to that, like doing a deep
(06:13):
analysis on data, looking for patterns that would otherwise not be obvious to us as humans, or would take too much time and effort to do. At the same time, it might identify patterns, like you said, that actually are either kind of obvious and already known, so not really that helpful, or just simply not
(06:36):
something that would make sense to take action on, and you still need the human to evaluate that, even if you have an AI platform that can generate that and do a lot of heavy lifting on the analysis. I mean, I know I have personally, and I've heard many people also say, and I think this is particular to the LLMs, that they're really terrible at
(06:57):
even some basic math.
Lucas Winter (07:02):
Yeah, I've seen that too, where they've kind of failed to work out basic multiplication, where you'll be scoring in units of 10 but you're setting your thresholds in units of 33. Well, you're going to be points behind where
(07:30):
you're actually at. So sometimes the left hand's not talking to the right hand within the AI. But the bit I find strange is that people are often kinder to AIs than they are to humans. When a human makes a mistake, they're like, oh well, you shouldn't have made that mistake, you should have been able to
(07:51):
foresee this, you should have been able to document, you should have been able to troubleshoot ahead of time. And then, when the AI makes a mistake, we go, well, the model's learning. And it's like, why are we kinder to the AI than we are to the human?
Michael Hartmann (08:05):
That's a really interesting thought, and as soon as you said that, I was thinking to myself, like, when I use, mostly I use ChatGPT, but I will say please and thank you, right, I'm doing all that as if it's a human. And maybe, to your point, sometimes I'm kinder to it than I am to actual humans.
Lucas Winter (08:24):
Can I just pick up on something that I found fascinating? When you start looking at contact-us messages, what people are typing in and then sending out to sales, something that I wish wasn't true, but I've seen it to be true with multiple customers, is that the people who are being polite and, as
(08:46):
you mentioned, saying please and thank you, are often less likely to buy than those who choose not to. I don't know if that's something to do with people more likely to progress in their careers and therefore more likely to be decision makers. I don't know the why behind that, but that's just a fascinating observation I've found in lead scoring, that the ruder someone is, the more
(09:08):
likely they might be to sort of get to that sale.
Michael Hartmann (09:11):
That's interesting. Maybe it's something about being decisive. That's a fascinating one, too. And that psychology, which we've talked about before, is something that people in marketing ops probably need to understand better, right, both in terms of customers, but also how to work more effectively with other people within their organizations. But that's a
(09:35):
whole topic in and of itself. Okay, so there are some potential challenges with AI. Maybe it'll get better. How quickly, we don't know. Probably faster than many people expect, but still some work to be done there. So is it safe to say that your preference or your
(09:56):
recommendation is to use a non-AI-based approach or solution, and have you seen those outperform other models? What's your take on all that?
Lucas Winter (10:08):
Yeah, I mean, the ChatGPT boom, was that Christmas '22? And in terms of how far we've come, talking now in 2025, the leaps-and-bounds improvement isn't where I would have expected it to be. So, in terms of being able to do everything entirely AI
(10:30):
first, that's a tricky problem. So, yes, I definitely would suggest that you go with a data-driven solution, but that's not synonymous with an AI solution. Will it get better? It's got to. It's not going to get worse. But if you're coming from a position of, I've got no scoring model, to jump straight to AI, you're probably going to find
(10:51):
yourself with a model that's not performing as well as it can. And in terms of what you can do just by doing some best-practice scoring, you can find yourself in the position that I find myself in quite regularly, where I'm able to get a lead in front of a salesperson which is four times
(11:12):
more likely to turn into a paying customer.
Michael Hartmann (11:15):
Okay, yeah. So, it's almost not a metric, but are those the kinds of metrics that you look at in terms of the overall effectiveness and quality of a lead scoring model? Is it, like, leads that turn into an opportunity, or speed to close and win? What are some of the metrics you look at?
Lucas Winter (11:37):
That's a really kind of meta question. You've got how you report, and whether your reporting is any good, and, similarly, how you score, and whether your lead scoring is any good. Ultimately, the dollars that it puts back into the business is going to be a great metric. It's about improving salespeople's performance, and
(11:59):
you touched on psychology, and it is a really difficult thing to ask a salesperson to acknowledge that they've managed to increase their sales because they've been given better leads, as opposed to because they've been performing well as a salesperson. When it comes to sort of that Moneyball mentality, one of the mistakes the scouts were making was they were going to
(12:21):
see a player once, they'd watch them once, and then they'd go, right, this player's good or this player's bad, and it's not representative of the entirety of their college career. And when a salesperson is failing at their job somewhere between 80% and 99% of the time, I mean, that's a terrible failure rate,
(12:42):
right? You wouldn't accept that from any other job. If there was a bus driver who failed 99% of the time, you'd be like, okay, you're not a bus driver anymore. But sales only has to have a very low success rate to actually be very good at their job. So they can find themselves going, well, I've had a success here, and I now know what
(13:05):
success looks like. But it's the same problem the scout had. They're doing it from sort of one viewing of a player, one viewing of a lead, as opposed to being able to have the sort of zoomed-out view of the entirety of every customer that's ever existed, and that's something that you can do with scoring. And then, when it comes to that reporting and those metrics, you've got this very difficult thing where you put something on
(13:28):
a bar chart that goes from 0 to 100. And in the old world the salesperson has a success rate of 1%, and then you move them to a success rate of 2%. On that bar chart it doesn't look like they've really increased by that much. What you've actually managed to do is double that salesperson's ability to sell to customers.
(13:48):
So it's a really difficult thing to sort of prove that value when it doesn't look like a massive spike, even if you've managed to double the sales they're making.
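The bar-chart point can be put in numbers. A hedged sketch with illustrative figures (500 leads worked, 100k per closed-won deal, not numbers from the episode) showing how a one-point jump is really a 2x lift:

```python
def lift(old_rate, new_rate, leads_worked, value_per_win):
    """Compare the absolute vs relative improvement of a close rate."""
    old_wins = leads_worked * old_rate
    new_wins = leads_worked * new_rate
    return {
        # Looks tiny on a 0-100 bar chart:
        "abs_points": round((new_rate - old_rate) * 100, 2),
        # The multiplier that actually matters:
        "relative_lift": round(new_rate / old_rate, 2),
        "extra_revenue": round((new_wins - old_wins) * value_per_win, 2),
    }

# Illustrative numbers: 500 leads worked, 100k value per closed-won deal.
result = lift(0.01, 0.02, 500, 100_000)
print(result)  # {'abs_points': 1.0, 'relative_lift': 2.0, 'extra_revenue': 500000.0}
```

One percentage point on the chart, but double the wins and half a million in extra revenue at these assumed values.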
Michael Hartmann (13:59):
Yeah, what you're describing is part of why I regularly encourage our listeners, our audience, anybody, to learn the basics of statistics and understand them. Because that difference between one percent and two percent is one percentage point, but, as you're
(14:21):
pointing out, it's a hundred percent increase on that rate, which is significant. And then, if you add in what the value of a win is, say a hundred thousand dollars or a hundred thousand pounds, and a one percent close rate, or close-won rate, is five
(14:43):
wins, and now you're getting ten wins, you're going from half a million to a million, and so that's significant. The other part of this is understanding that there's more to any one of these metrics. Sometimes there's a basket of metrics that
(15:04):
matter, and they are interrelated in some ways, and understanding how that interrelationship works, and understanding that sometimes you can't get all of them to move in the ideal direction you would for any one metric as a basket of those, but you can try to optimize for them overall. So, long-winded way of saying, it's
(15:28):
worth understanding that if you don't already.
Lucas Winter (15:35):
Yeah, again, using the Moneyball philosophy, it's not about one single metric. It's about understanding what you want to do and what your objectives are. And the example I would give, when it comes to, I nearly said bowling, when it comes to pitching, it's not all about miles per hour.
(15:55):
It's not just about how fast you can throw the ball. It's about, well, do they get players out? And this is one of the things that the Oakland A's struggled with. They were saying, look, we need to get the best bowlers. And then they'd put a player up on the board and say, I recommend this guy, and they'd say, well, he doesn't throw that fast. They'd be like, well, he gets people out.
(16:15):
And this is something that can be sort of looked at. I have a really common objection, and I reckon this one's going to resonate with a lot of people listening right now: I don't want to look at any Gmails. I don't want anyone who's a gmail.com. I don't want all these Gmails, because they're students. And I'm
(16:37):
sure, as everyone can agree, once you graduate from university, you have to hand in your hat, your gown and your gmail.com email address. There's no such thing as somebody with a Gmail that isn't a student. I hope it's clear that I'm being sarcastic.
Now, what I then do is, when I put Gmails in front of sales, I
(17:03):
don't just say, okay, these are good. I make sure that there's enough behind it that they're better than good. So I might put a business email address in front of someone and say, okay, this is a certain level. Now the Gmail has to work even harder. They have to have a great job title. We have to know their company name. They have to have been to our contact-us form.
(17:24):
They have to have visited the price list. They have to have requested a quote. Then, when I put that Gmail in front of a salesperson, they're able to say, okay, well, this one looks all right. Now, the data-driven thing to do would be to look at things purely and say, this lead is equal to that Gmail because they have the
(17:48):
same criteria. With Gmail, I make them work even harder than they needed to. I make them have even more criteria. And that way, despite it not being perfectly data-driven, it's that psychology and that perspective, that the Gmail has to work even harder, or otherwise people are going to lose faith in your model when they start looking at the leads that are coming in and going,
(18:09):
all these Gmails are rubbish. And it's about changing that narrative, because people are coming from a position of, Gmails are bad, so they have to be better than everyone else.
Michael Hartmann (18:19):
That's interesting. Yeah, I've worked at companies where that's been something that we were asked to do, to not allow personal email addresses, Gmail, Yahoo, MSN, whatever. And in some cases I would push back, because I had a little more insight into the
(18:40):
customer base. Very often, if one of your target audiences is relatively small businesses, early-stage startups, I mean, it is a cheap way to get an email address for your fledgling business, to just use a gmail.com email address, right? And it doesn't necessarily mean that you're not a qualified or potentially qualified, you know, prospect or
(19:01):
customer. It's just that you're making decisions about where to spend your cash, because it's usually limited. So I'm with you that that's the case. It's interesting that you bring up that the perception is that they aren't as valuable, period, and then there's a higher bar to go over to get the
(19:24):
sales team to pay attention to them. So it makes sense. We've talked a little bit about, I just want to make sure we cover this off, so you've even said it, like, a feels-right score, basically kind of what we think are the right things to be scoring on, versus
(19:47):
something that's more like what you ascribe to, a data-driven, Moneyball kind of model. And then we've talked a little about AI. Are there any other types of lead scoring models that you've seen that would be competitors to these different ideas?
Lucas Winter (20:05):
Can I tell you about my favorite scoring system on the planet?
Michael Hartmann (20:09):
Can I say no? All right, so let's go to the next question. No, no, no, I'm being smart. It's all right.
Lucas Winter (20:17):
So you go into a restaurant, you get a menu, and you have a chili indicator on the menu. If it's got no chili, you know it's got no spice. If it's got one chili, you know there's a bit of a kick to it. Two chilies, I know that's going to be quite hot for me. And three chilies, I'm just not going to order what's on the menu. For
(20:37):
me, this is the best scoring system that's ever been devised, because I can go into any restaurant and I understand it immediately. Other restaurants might use something which is just way too detailed, and you'll read a description and it will say something like, sour, hot to taste. It's like, I don't know what that means. But just having the visuals of chili peppers. And I'm sure I've lost some listeners
(20:59):
at this point when they go, you wouldn't order something with three chilies?
Michael Hartmann (21:05):
Oh, I'm afraid not, that's beyond my personal taste. Well, I like Indian food, and sometimes you go to an Indian restaurant and it's like, there's the one chili, two chili, three chili, and then there's Indian hot, Indian spicy. And I'm sure that's true in other cuisines. But yeah, anyway, I'm with you, it's a pretty
(21:25):
universal, easy-to-understand model.
Lucas Winter (21:28):
Compare that with these numbers which don't really correspond to any meaning for the salesperson. You end up
(21:52):
with these questions where they look at two leads and they go, what's the difference between a 53 and a 54? And the best answer you can give them is, oh, nothing, that's one point. It's not really a satisfying answer for anyone, just going, yeah, it's one point. But if you could change that to exactly the same thing, a chili pepper thing: what's the difference between a 53 and a 54? They're both two chili peppers hot. This is how hot your lead is. It's two chili peppers hot.
(22:12):
I take learnings from that, and my favorite way to go about scoring is to say gold, silver, bronze, junk. And one of the hardest things is driving adoption. If you have
(22:33):
a scoring system that says these are the gold leads, everyone's immediately going to grasp that concept straight away. And if they run out of golds, silver still sounds pretty good. And when driving adoption is so hard, someone's going to turn around and say, yeah, I don't know, I'm not feeling it. I know what a good lead looks like. I just want to take the good ones, based off of my own definitions. At which point you can say, all right, fair enough, do you
(22:55):
still want the junk? At which point they'll probably go, well, I don't really want the junk. And then you can just send them gold, silver, bronze. Now, in an ideal world, you'll just send them the gold leads and they'll be able to increase their ability to sell by four times. If they don't want to use your scoring model, you can at least
(23:16):
get them away from the junk. Now, even if they just increase their sales by avoiding the junk, by 1.5 or two times, that's still a huge increase. So, yeah, open up your menu. There's a lot that can be learned from there.
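The chili-pepper idea translates to a trivial bucketing of the numeric score. The cut-offs below are made-up; the point is that a 53 and a 54 stop being "one point apart" and become the same named thing:

```python
# Illustrative tier cut-offs (lower bound inclusive); real values would
# come from your own model, not from the episode.
TIERS = [
    (75, "gold"),
    (50, "silver"),
    (25, "bronze"),
]

def tier(score: int) -> str:
    """Map an opaque numeric lead score onto a named tier."""
    for cutoff, name in TIERS:
        if score >= cutoff:
            return name
    return "junk"

print(tier(53), tier(54))  # silver silver
print(tier(80), tier(10))  # gold junk
```

Now the answer to "what's the difference between a 53 and a 54?" is: nothing, they're both two-chili-peppers hot.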
Michael Hartmann (23:33):
Yeah, so I'm with you on simplifying the way you communicate those tranches of leads. I think what I would expect, and maybe you've had this, is a little bit of pushback: okay, but that means there's some sort
(23:53):
of range for each of those tranches. Like, how do I know which ones are at the top of the range or at the bottom of the range, right? So do you run into that? Because I know I've run into things, maybe not that exact scenario, but something like that before.
Lucas Winter (24:09):
Yeah. So there are many different ways to play this. The easiest thing to do is just to put the numeric score back on and sort numerically. Another thing you can do is put a filter over it where you then have meaningful contacts. So now you're saying, let's have the gold leads that have a
(24:33):
meaningful interaction on top of it. Have they gone and submitted a contact-us form? Have they said, actually, yeah, I do fancy being spoken to? Have they gone and requested that quote? Maybe you've got an e-commerce element. Have they abandoned their cart recently? You can then put that filter on top to make sure that, once
(24:53):
they're gold, they have other things going on about them. And I was going to say something else that was going to be really fascinating, and it's gone.
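That filter, gold plus at least one hand-raising behaviour, can be sketched like this. Tier values and event names are hypothetical:

```python
# Hedged sketch: layer a "meaningful interaction" filter on top of the
# gold tier. Event names are invented for illustration.
MEANINGFUL = {"contact_us_form", "requested_quote", "abandoned_cart"}

def priority_golds(leads):
    """Golds that also show at least one meaningful interaction."""
    return [l for l in leads
            if l["tier"] == "gold" and MEANINGFUL & set(l["events"])]

leads = [
    {"id": 1, "tier": "gold",   "events": ["requested_quote"]},
    {"id": 2, "tier": "gold",   "events": ["page_view"]},
    {"id": 3, "tier": "silver", "events": ["contact_us_form"]},
]
print([l["id"] for l in priority_golds(leads)])  # [1]
```

Only the gold lead that has actually raised a hand makes the priority list.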
Michael Hartmann (25:04):
It'll come back to you, I'm sure. Yeah, I think that makes sense. I mean, there's a part of me, and maybe this is where AI kind of models would need to go, that thinks maybe there's also some level of a confidence score that goes with it. Right, we believe this is a gold, and the
(25:27):
confidence level is 90, versus a gold that we're 50% confident in, or something like that. That may make it more, I'm talking myself out of it, maybe, but this feels like now I'm adding another layer of complexity that may just get in the way of the salesperson
(25:48):
moving quickly to the next lead and moving on.
Lucas Winter (25:50):
Yeah, it's technically a good idea. What I like to do is just have one score, and not a score of the score. That way, you can say, look, these are golds, contact them first. These are silvers, contact those second. In a world of infinite capacity,
(26:11):
then, yeah, I'd love to say, and this is how confident we are in this gold, and this is how confident we are in this silver. But there's a stereotype that salespeople are lazy, and I don't think it's a helpful stereotype, and it doesn't really matter if it's true or not. What I think is more important is that I can tell you that a hundred percent of salespeople are people, and people
(26:32):
like a good user experience. So if they know that when they log in they can just find out where the gold leads are, then that's really good. And then, on top of that, what I like to do is sort of show my workings, and by that I mean tell people what's going on. So I was working at an organization where someone was really struggling to drive adoption with their scoring
(26:52):
model, and they had this element to it whereby it would look at the website of the lead, then look that website up against a list of 100. And if it matched any of those 100 websites, then it was one of the top 100 targets for that year. And salespeople were getting
(27:14):
these leads through and going, I've got no idea why this is gold. And then they weren't working them. And it was a case of, what you need to do is show your workings: say it's a gold lead, and then have the box for why
(27:34):
underneath, and say, look, this lead is gold because it's part of this website. This website means it's this company name, which is one of your top 100 targets for the year. And there's nothing wrong with showing your workings. I think it's actually a really good way to work. Like, Michael, you're quite open and honest with this podcast. This isn't the first time we've had a chat. You've done some research, and people aren't folding their arms going, I can't believe he's done some research before he's gone on
(27:55):
this podcast.
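"Show your workings" amounts to returning the reasons alongside the score, so the UI can display why a lead is gold instead of sending salespeople on a treasure hunt. A sketch with made-up rules and point values:

```python
# Hypothetical scoring rules: (description shown to sales, points,
# predicate on the lead dict). Names and values are illustrative.
RULES = [
    ("Website matches a top-100 target account", 40,
     lambda l: l.get("website") in l.get("top_100_sites", set())),
    ("Requested a quote", 30, lambda l: l.get("requested_quote", False)),
    ("Visited the price list", 15, lambda l: l.get("viewed_pricing", False)),
]

def score_with_workings(lead):
    """Return (total, reasons) so the UI can show WHY a lead scored."""
    total, reasons = 0, []
    for description, points, predicate in RULES:
        if predicate(lead):
            total += points
            reasons.append(f"{description} (+{points})")
    return total, reasons

lead = {"website": "acme.example",
        "top_100_sites": {"acme.example"},
        "requested_quote": True}
total, reasons = score_with_workings(lead)
print(total)    # 70
print(reasons)
```

The reasons list is exactly the "box for why" Lucas describes: "gold, because this website is one of your top 100 targets for the year."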
Michael Hartmann (27:56):
You're going, oh, right, they've put some effort in, they're explaining what's going on, they're being truthful about things. And then they kind of go, oh, it kind of helps make a better experience for everybody. Yeah, no, I think that's good. And, I mean, for a period of my career I've had an inbound SDR team, and, you know, luckily I understood the inner workings of
(28:18):
some of this stuff, so I knew where to look for those signals that could come in, show up, in this case in Salesforce. So it was kind of natural for me to do that. I'm not sure that's always the case. When I had opportunities to talk to actual sales teams, I tried to take those opportunities to just kind of do
(28:38):
one-on-one, training's not the right word, but to help them understand how they could find this stuff themselves. Because I don't think their typical training and enablement included that, because it wasn't something that was top of mind for them. So I think there's a change
(29:00):
management component to this too, and a communication component that needs to be a part of it and needs to be repeated.
Lucas Winter (29:05):
That's a natural part, yeah, and I'm with you. I was thinking that nobody wants to be going on a treasure hunt for every lead that they get in. Okay, this one's a gold, is it? Now let me check three different databases to find out why. Just get to the point.
Michael Hartmann (29:25):
Yeah. So you mentioned at some point earlier the idea that there are some best practices for lead scoring, and I know you've also said there are some opinion-based models, often based on a sales leader's opinion, or somebody else with a big title, right? So, you know, I assume that's something you want to avoid.
(29:48):
What are some of the do's and don'ts that you have, as maybe sort of principles that you lean on when you're building out models? You don't have to go through everything, because I'm sure, with your experience, you've got an extensive list, but maybe a couple of do's, a couple of don'ts, whatever you think is most appropriate.
Lucas Winter (30:12):
Sure. I like to be really unpopular in the configuration and really popular in the results. So how do you get there? You look at what the existing mechanism is, and you might find that a company isn't actually in the industry that they think they're in.
(30:32):
And I'm going to steal a story from someone in the MoPros community, so this isn't my story, it's theirs, but it illustrates it really well. There was an organization, and they sold horse X-rays, and an airport security company looked at their X-rays and went, it's big enough for a horse, is it big enough for a car?
(30:54):
And they went, if we can X-ray cars, we'll be able to do that in the airport and find out what's going on inside and if there are any security concerns. And then it turned out that the company was selling much better to airport security than they were to vets. Now, what you can do with that is, when you're looking at
(31:17):
what defines a good lead, and they go, okay, so we know what industries we're in, you actually have a look to see who you're really closing, and it might turn out that you're not selling to vets, you're selling to security companies. And then, with that information, you'll be really unpopular in the configuration, but then you'll be really popular in the results. And there are going to be these gems out there that people
(31:39):
haven't considered, and they might not necessarily want to consider, because it might mean that their whole marketing, their whole brand identity, is completely wrong if they're targeting the wrong people, so there'll be some resistance to even think of that as being true.
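The horse-X-ray story suggests a simple sanity check: compute close rates per industry from the CRM rather than trusting the assumed target market. An illustrative sketch with made-up records:

```python
from collections import Counter

# Hypothetical closed-deal records pulled from a CRM; field names and
# figures are invented to mirror the horse-X-ray story.
deals = [
    {"industry": "veterinary", "won": False},
    {"industry": "veterinary", "won": False},
    {"industry": "veterinary", "won": True},
    {"industry": "airport_security", "won": True},
    {"industry": "airport_security", "won": True},
    {"industry": "airport_security", "won": False},
]

def close_rate_by_industry(deals):
    """Win rate per industry: who are you REALLY closing?"""
    total, won = Counter(), Counter()
    for d in deals:
        total[d["industry"]] += 1
        won[d["industry"]] += d["won"]
    return {ind: won[ind] / total[ind] for ind in total}

rates = close_rate_by_industry(deals)
print(rates)
```

If airport security closes at twice the rate of vets, the data is telling you which industry you're actually in, however unpopular that is in the configuration meeting.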
Michael Hartmann (31:52):
So could I just, I want to make sure I'm getting this right. This being unpopular in the analysis, to kind of tie it to the other side of this, it feels like you're not saying you want to be intentionally antagonistic about this approach, but you want to make sure that, as simple as it is,
(32:15):
you want to be data-driven, you want to let the data tell you kind of where you should go with this. Am I understanding that right?
Lucas Winter (32:25):
Exactly. It's not a case of being confrontational for the sake of being confrontational. And what you'll find a lot of the time is, who you think your customers are, they are who your customers are. Sure, a lot of the time there'll be some things that aren't surprises. I had this issue recently where the link between contacts and
(32:47):
opportunities and closed-won deals broke, and I kind of had to do everything as a feels-right score. So I kind of did it in an isolated hole, waited until the bug was fixed, and then went and checked retrospectively. And it's like, oh, okay, a lot of my assumptions were correct, a lot of them were wrong, and I need to fix those. But you will find that, a lot of the time, you're not really surprised. I've worked at one organization where online events were better
(33:11):
than in-person events, and that is the exception to the rule. I've only ever seen that once, and most of the time, yeah, in-person events, of course they outperform online events. So it's about being ready and open to say, this is unexpected, but unfortunately, this is the truth.
Michael Hartmann (33:34):
Okay, that
makes sense. So are there any... so, you know, make sure you're driven by the data, so that's good. Anything that you'd say don't do, like, avoid doing this?
Lucas Winter (33:51):
Any don'ts on
there? Yeah, don't overreact. When you get feedback on your model, don't do these sort of wild swings. What you can fall into the trap of is getting feedback such as... I'm trying to pick a different example than Gmail's.
(34:16):
Now, you might get some feedback of: this is a bad lead, because they're not a decision maker, and you'll have a certain job title, or they'll say, right, so anyone of this level or below, they should be going straight to junk. Now don't overreact and say, okay, now anyone who's part of
(34:42):
this job title, they go to junk. If anyone's part of one of these bad industries, they go directly to junk. Make sure that you're reacting appropriately so you can adjust the model. It can be iterative, and it can get a bit of a knock on the head and a bump downwards, but make sure that you're not taking them straight from gold down to junk, because there were other attributes along the way that are of value, and you don't want
(35:03):
to end up in a situation where you're having these sort of wild swings based on sort of one or two things that end up going the wrong way. And that's true for the other side of things, which is where someone will say: my marketing tactic is the most important marketing tactic, therefore anyone who goes to one of my events must be a gold
(35:26):
lead by default. Yeah, just because they've attended your event doesn't necessarily mean that they're the single most important thing. So they don't go from a wild swing, from being junk straight to gold.
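The "knock on the head, not a wild swing" idea can be expressed as a bounded weight adjustment. A minimal sketch, assuming a simple integer points model; the step size and feedback values are illustrative, not from the episode:

```python
def adjust_weight(current: int, proposed: int, max_step: int = 5) -> int:
    """Move a scoring weight toward the proposed value, but never by
    more than max_step per review cycle: a bump downwards, not a
    wild swing from gold straight to junk."""
    delta = proposed - current
    if delta > max_step:
        delta = max_step
    elif delta < -max_step:
        delta = -max_step
    return current + delta

# Sales feedback says junior job titles should score 0, not 20.
# Instead of zeroing it out in one go, step it down and re-evaluate.
weight = 20
weight = adjust_weight(weight, proposed=0)  # steps down to 15
weight = adjust_weight(weight, proposed=0)  # then to 10, and so on
```

Because each cycle is capped, a lead that scored well on several other attributes keeps that credit while the contested attribute is dialled down iteratively.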
Michael Hartmann (35:36):
Yeah, so I
think, like, a single data point does not mean it's something you should react to is kind of the way I interpret that, and I agree with that. The other point that I just want to hone in a little bit on, what you talked about, is like there's this iteration idea, right? And this is one of the things I've run into when I've done lead scoring at different companies: when you get the
(35:59):
people in the room you think need to be involved with providing input on the scoring model, there's often this... I sense often that those people think, like, this is my one and only chance to have input. And what I've learned to do in those scenarios is, like, you know, I want us to get to, at the end of this session,
(36:21):
something that we can all feel comfortable with. We don't believe it's 100%, but also something that we know we can give a little bit of time, to monitor and see what the results are, and then make changes over time based on the results. And I think that helps not only with the immediate building of it but also the ongoing, as long as
(36:42):
they continue to bring them in when we're evolving it. That way they know about the changes.
Lucas Winter (36:48):
I think that's
true, and there's some advice out there which might sound contradictory, but I think it's also true. Two things can be true at the same time, absolutely. One thing that you might discover if you start to look up lead scoring is that you should just get started, just start making it. Now,
(37:09):
that doesn't mean that you take your first draft and put it in front of sales leaders and then say, here's what I've started with, how do we go from here? It's a bit like, I don't know, scripting anything, a Hollywood movie. You wouldn't take your first draft to a producer. You'd make sure that it's draft five or six, and then, when that producer then starts making your movie, the one that ends up in
(37:30):
the cinema might have little to no reflection at all of draft five or six that you originally sent to them. So, yes, it's true that it's iterative and it can be changed, but what's also true is what you said about having a model that's good, that you can then sit down and say: this is what we've got, what do you like, what don't you like?
(37:51):
One thing that I like to do is shadow sales and watch what they do on a normal basis. Which fields are they looking at first? If they're looking at sort of three fields, then one, you know what's important, and two, you know exactly where their blind spots are.
Michael Hartmann (38:08):
Yeah, I will
tell you, when I've done stuff like that, sitting with somebody, the hardest part is to keep my mouth shut, honestly, because I don't want to influence what they do. I really want to observe, and that's actually really, really hard, if you know it well. So we are kind of running close to a time when we're going
(38:28):
to have to wrap up, but I do want to hit on one thing, because this is something that in our earlier conversation sort of surprised me, and it did surprise me a little bit. You told me that you like to use spreadsheets to build your lead scoring models, and I think you have a template
(38:50):
too, or at least maybe a good structure for doing it. But so, first of all, I think that surprised me. I suspect it will surprise many of our listeners, because most marketing automation platforms have some sort of way of doing it. I'm going to assume Salesforce has it; I'm less familiar with that. But, like, why the
(39:14):
spreadsheet? Excel or Google Sheets, whatever you use. And then I'm curious, like, how do your clients react when you start doing that?
Lucas Winter (39:28):
I've, uh, I've lost
you on the technical side there. I'm going to guess what the end of that question was. And essentially, when it comes to building the scoring model, the marketing automation platforms have all the technical stuff in there that you need to make things. Functionally, you can say 10 points for this, 5 points for
(39:49):
that. What you can't do is actually evaluate whether you should be doing that or not, and that's kind of the element that you have to sort of take down. If you can do it in Python, fantastic. If you have a data visualization tool, fantastic. What I want to do is crunch as many numbers as I can in Microsoft Excel, because
(40:12):
I can just get there a lot quicker.
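The kind of evaluation described here, checking whether an attribute actually separates closed-won deals from lost ones before you hand it points, might look like this outside a spreadsheet. The leads and field names are invented for illustration:

```python
# Per-attribute win rates: does this field actually predict closed-won?
# The lead records below are invented for illustration.
leads = [
    {"job_title": "Director", "won": True},
    {"job_title": "Director", "won": True},
    {"job_title": "Director", "won": False},
    {"job_title": "Intern", "won": False},
    {"job_title": "Intern", "won": False},
]

def win_rate_by(field: str, rows: list[dict]) -> dict[str, float]:
    """Group rows by a field's value and compute the share that closed won."""
    totals: dict[str, list[int]] = {}
    for row in rows:
        won, total = totals.setdefault(row[field], [0, 0])
        totals[row[field]] = [won + row["won"], total + 1]
    return {value: won / total for value, (won, total) in totals.items()}

rates = win_rate_by("job_title", leads)
```

This is the same pivot-table arithmetic you would do in Excel: if one value of the field closes at a much higher rate than another, the attribute has earned its points; if the rates are flat, it hasn't.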
Michael Hartmann (40:14):
Yeah, yeah.
And so, like, full transparency to our audience: in the middle of me asking my question, I think my connection was wonky, and so Lucas didn't get all of it, so hopefully that will still work out. But yeah, I mean, I think the reason it surprised me is, I
(40:34):
think, because I think a lot of people who are listening, you know, we have these other platforms and we kind of think in that model. But at the same time, Excel continues to be one of the most common tools in the MarTech stack, if you will, right?
(40:54):
So it makes sense, and it's easy to share it, it's easy to communicate it, it's easy for other people who are familiar with Excel to understand how even a complicated Excel model might work.
Lucas Winter (41:06):
Whilst we're on
the topic of technology, I think it would be rude for me to wear an Insycle t-shirt and not mention Insycle. So if you've got dirty data, I've found Insycle really helpful to sort of clean that up. In terms of when you're building that lead score model, one of the first products that people will want to go to is a data enrichment tool. But if your data isn't clean, if you don't have good data
(41:31):
quality, then you're going to have to cleanse that data first. If you want to have a good scoring model, make sure that you're cleansing your data, and then make sure that you're enriching and cleansing again.
Michael Hartmann (41:43):
Yeah.
Yeah, I'm torn on that, because I tend to believe that you can't wait for your data to be, quote, clean, end quote, right, before you start on some of these things. Sometimes just the process of doing that will reveal where there's data issues, and then you can go address them as they appear. But in general, ideally, the better the data, the better the
(42:07):
outcomes. I tend to agree with that.
Lucas Winter (42:10):
Yeah, the better
your data, the better you can have an impact on your scoring. If you're staring at your database and everything's blank... I've been talking about industry; if it's blank, that doesn't help you. I've been talking about job title; if your job title is blank, that doesn't help you. If you're looking at number of employees, etc., etc. I wouldn't recommend just going out and getting a data
(42:31):
enrichment provider if you've got a fully fledged database. If you're just staring at a bunch of blank fields, backfilling that is going to put you in such a stronger position, so you don't have a model that basically says: what's your job title? Well, it's good if we know it. Like, that's such a low-level way of determining quality
(42:52):
versus not.
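Measuring how blank the database actually is, before deciding whether enrichment is worth buying, takes only a few lines. The contact records and field names below are invented for illustration:

```python
# Field fill rates: how much of the database is actually populated?
# The contact records below are invented for illustration.
contacts = [
    {"job_title": "CMO", "industry": "", "employees": None},
    {"job_title": "", "industry": "", "employees": 500},
    {"job_title": "", "industry": "Security", "employees": None},
]

def fill_rate(field: str, rows: list[dict]) -> float:
    """Share of records where the field is neither empty nor missing."""
    filled = sum(1 for r in rows if r.get(field) not in ("", None))
    return filled / len(rows)

for field in ("job_title", "industry", "employees"):
    print(f"{field}: {fill_rate(field, contacts):.0%} filled")
```

Low fill rates on the fields your model leans on (job title, industry, employee count) are the signal that backfilling comes before scoring.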
Michael Hartmann (42:54):
Right,
interesting. Well, Lucas, we are kind of coming up to the end of our time. We covered a lot here. Are there any big nuggets that we didn't cover that you want to make sure the audience hears about?
Lucas Winter (43:08):
Well, that's flown
by for me. If you want me to chat more about lead scoring, I'd ask the listener to put a comment down and just say, bring him back, and I'll come and do this all over again.
Michael Hartmann (43:20):
That'd be
great. Well, so in that vein. So, first off, Lucas, thank you. This has been a fun conversation, and I think it'll be interesting for our audience to get this, because it's a little bit contrary to the common narrative, right? So it's a good example of what I would say is, you know, why I say often that there's this fallacy of best practices, right?
(43:40):
Because if everybody was doing things the same way at the same time, no one would be differentiated. So, after saying thank you: if people do want to, you know, don't want to wait and put in comments and ask for your input, what's a good way for them to connect with you or learn more about what you're doing?
Lucas Winter (44:04):
Nice. I'm not on
social media. No, that is a lie. I am on social media. I'm on the MoPro Slack channel. You can find me there as Lucas Winter, and if you find yourself where you're liking what I'm saying and you want to build yourself a first lead score model and you're still not sure where to start, I'm lucas at scoremyleads.com.
(44:26):
And if you're in a situation where you've had a lead score model for years but nobody wants to use it, again, lucas at scoremyleads.com. Give me a shout there. I'd love to connect.
Michael Hartmann (44:37):
Terrific. Well,
again, thank you, Lucas. Thank you to our long-time and first-time listeners and audience. We always appreciate that. As always, we are open to ideas for topics and guests, and so if you have an idea for a topic or a guest, or want to be a guest, like Lucas did, reach out to us and let us know. We'd be happy to talk to you. Till next time,
(44:59):
bye, everybody.