Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hello everyone,
welcome to another episode of
OpsCast, brought to you by MarketingOps.com, powered by all
the MO Pros out there.
I am your host, flying solo again, Michael Hartman, and once
again I will say I look forward to having Naomi and Mike rejoin,
probably as we get closer into 2025.
This may be one of the last ones we're recording in 2024.
(00:21):
So looking forward to celebrating our four-year anniversary in early January 2025.
So joining me today is Clemens Diamond, an AI growth consultant at Algo Marketing.
Prior to his role at Algo Marketing, Clemens has held various marketing leadership and demand generation roles at several companies, and he started his career as a foreign
(00:42):
language teacher and performance coach, which I don't know that we'll have time to get into, but I might have to follow up with him one-on-one about that.
So, Clemens, thank you for joining me today.
Speaker 2 (00:52):
Thank you, Michael,
thank you for having me.
Speaker 1 (00:54):
And so, for our listeners, he is staying up late on a Friday to do this, so I appreciate that, Clemens.
All right. So you have worked in marketing and demand gen for many years, and you told me that repeating what worked in prior years usually does not work long term, which is very much
(01:16):
aligned with what I've said before: the term best practices, well, I just hate it, because it implies that there is a way you could just, you know, cookie-cutter it, copy and paste and repeat what you did somewhere else.
But when you say that, what does it mean to you?
Speaker 2 (01:35):
Yeah, that's a great point, Michael.
And yeah, just a bit about my background: I worked in smaller scale-ups, starting from 100 people, and also large enterprises. I had the opportunity to work for LinkedIn and Google, and what I noticed is that the marketing planning
(01:57):
process is very similar no matter if you work for a small scale-up or a large enterprise. The way it works is that, normally, marketers look at the performance from last year, and there is a knock-on effect from how marketers plan onto operations
(02:18):
, which I'll get to in a minute.
So marketers look at the past year and see what worked, leveraging historical analysis, and then they go ahead and plan their campaigns and channels for the following year. The challenge they very normally face is that the
(02:41):
targets tend to increase, not decrease, year on year.
Of course, what worked last year, even if you repeat it and double down on it, might not get you to the targets in the upcoming year.
So there always tends to be that gap in the planning process that marketers need to make up by placing strategic bets
(03:05):
and hoping that they will pay off, and that these strategic bets, or the campaigns that are strategic bets, get them to their targets.
So there is an element of hope involved, of intuition involved, and intuition is great, but if you're relying on hope, then that can be a business risk as well.
(03:25):
And what that means over the course of the year is that if the targets are not attained, or the marketing pipeline is falling short of expectations, marketers just keep adding new campaigns to make up for the gap in revenue or pipeline
(03:46):
performance, and that really puts marketing operations teams under pressure, because marketing operations teams start scrambling with all the additional ad hoc campaigns.
Speaker 1 (03:58):
Yeah. And I mean, my experience is, when you get that kind of high-volume pressure, hey, we need to generate a massive amount more leads, or whatever your goal is, that increased volume of activity is when things start to break down.
So the discipline about tracking
(04:18):
what works and where it doesn't work kind of goes by the wayside and often gets missed as well.
The one thing you said is that these marketers tend to look at past performance, and I think that's true to some degree, but I don't think in a lot of places that actually
(04:39):
happens.
Here's one of the challenges I see: performance is really sort of a basket of metrics, right? I don't think there's a single metric.
And so if you're not looking at that basket of metrics to see what's performing in terms of each different, sort of,
(05:03):
I don't even want to say dimension, but maybe different components of a customer journey, then I think you're potentially going to make decisions based on incomplete data. I've said many times, I don't believe it's, quote, right or wrong data, it's just incomplete, but I think it's worth recognizing that.
(05:24):
So what I found is that a lot of people actually don't rely on past data.
Have you seen that too?
Or is it just my bad choices of career stops?
Speaker 2 (05:37):
Yeah, it's definitely true.
I think the historic challenge really is that if you want to make data-driven decisions, which of course is better than not, or, let's say, you increase your chances of success if you make data-driven decisions, historically speaking, and I'm speaking about the time before the possibilities that AI
(06:04):
provides to us, which we'll go into in a second, I believe all you really have is historical data to make assumptions on and to inform your decision-making.
Of course, there is also the other model of skipping that
(06:28):
step, not looking at the data, and just trying to do more and more, faster.
And I think, yeah, what you mentioned is the question of what data points to look at.
Of course, if you're going to chase pipeline, which tends to happen under pressure, you forget about the other metrics.
So what happens is, if you keep adding a lot of
(06:49):
ad hoc campaigns, then the execution gets scrappier.
You expand your target audience within your database, you get less engagement on your campaigns, and the database becomes less responsive, which of course hurts you, because you accelerate database decay, and
(07:10):
that creates a longer-term problem for your marketing organization as a whole.
So, yes, the challenge is very much that if you used to rely on historic data, that's probably better than not relying on data at all, sure, yes. But even that never enabled you,
(07:35):
or it made it very difficult for marketers and marketing operations leaders, to close that gap in terms of revenue required.
Speaker 1 (07:47):
Yeah, and you know, you used the word intuition, I think, talking about what happens either in the planning stages or in reaction to underperformance: we'll tend to lean on our intuition, or, in air quotes, what has worked in the past, right?
And chasing some metric that we've committed
(08:07):
to.
I mean, I feel like it's a pattern a lot of teams get into, and probably more so in the last few years, with the challenges in the economy and everything else.
I'm curious, because I think there's a place for some
(08:33):
intuition, right?
At the same time, I see this pattern play out over and over where we're not making our mark, and I think there's also this assumption, you mentioned pipeline, but I would back it up.
Even if the marketing team's goal is based on, say, MQLs or sales-accepted leads, or something like that, it's relatively easy to generate more MQLs or sales-accepted
(08:55):
leads, but the quality and the conversion rates downstream then tend to not be as good.
So, yeah, have you seen this pattern play out too, either in your work in some of these organizations, or in your client work with Algo Marketing?
And then, have you seen anybody sort of recognize that they were
(09:19):
doing that and then be able to change course to be a little more, I don't like data-driven, but data-informed, in their decision-making?
Speaker 2 (09:28):
Yeah. What I found, or how I personally try to address that challenge, because I completely agree with you, it exists in every organization, and, to your point, the tension always comes in at the MQL-to-SAL conversion rate or MQL-to-opportunity conversion rate, where the handover to sales
(09:49):
happens.
And the very common friction there is that if sales doesn't generate the pipeline, then of course the challenge posed to the marketing organization, or the conversation, can easily be framed as the MQLs not being of high
(10:12):
enough quality.
The way I personally always try to address that challenge is by looking at the data and what the data tells me.
So, for example, one way I addressed it was when there was a drop in conversion rate because the sales organization scaled very quickly.
(10:33):
In one of the organizations where I was leading the growth marketing side of things, what I noticed is that the conversion across channels was dipping.
It didn't matter which channel the MQLs came from, whether they were organic or paid or direct or referrals,
(10:56):
irrespective of the source or the channel, the conversion rate from MQL dropped month on month as the sales team quickly expanded.
In other words, I used data in the conversation.
Those conversations
(11:17):
can get emotional sometimes, right, and data can be a good
(11:39):
way to objectively hint at where the challenge actually is, which in this case seemed to be more in the quickly scaling sales team than in the quality of the MQLs, because it would be a big coincidence if, across all channels, MQLs lost a similar amount of quality over a couple of months.
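The cross-channel diagnostic Clemens describes, checking whether MQL-to-SAL conversion dipped uniformly across channels, can be sketched roughly like this; the records, channel names, and numbers are entirely hypothetical:

```python
from collections import defaultdict

# Hypothetical MQL records: (month, channel, converted_to_SAL)
mqls = [
    ("2024-01", "organic", True), ("2024-01", "organic", True),
    ("2024-01", "paid", True), ("2024-01", "paid", False),
    ("2024-02", "organic", True), ("2024-02", "organic", False),
    ("2024-02", "paid", False), ("2024-02", "paid", False),
]

def conversion_by_channel_month(records):
    """MQL -> SAL conversion rate per (month, channel) pair."""
    counts = defaultdict(lambda: [0, 0])  # key -> [converted, total]
    for month, channel, converted in records:
        counts[(month, channel)][1] += 1
        if converted:
            counts[(month, channel)][0] += 1
    return {key: conv / total for key, (conv, total) in counts.items()}

rates = conversion_by_channel_month(mqls)
# If every channel dips in the same months, the cause is likely
# downstream (e.g. a fast-scaling sales team), not MQL quality.
```

The point of the breakdown is the pattern, not the absolute numbers: a uniform dip across independent sources argues against a channel-specific quality problem.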
Speaker 1 (12:00):
Yeah, it's interesting.
It brings to mind, I remember working in an organization where, in this case, the SDR/BDR function was under the marketing leadership, and the organization had a pretty solid structure for capturing sourcing of what we'll call pipeline.
But it was very much built on this linear model from MQL
(12:22):
to SAL to qualified pipeline to opportunities and so on.
And, to your point, goals didn't go down year over year, they went up in the aggregate.
When we were getting into planning for the upcoming
(12:46):
year, we already knew that, under that pressure, we had done a lot of volume stuff, and I was trying to convince our leadership team that what we needed to push back on was not that.
I was like, focus on the end, right? We want revenue to grow, right?
Ultimately, if you work backwards from historical conversion rates, it's going to drive to a larger volume, simply a larger
(13:08):
volume of MQLs that we'd have to get.
And we could totally get that, but the quality overall is going to go down.
So then the conversion rates won't match, and we're going to get into this sort of disappointment cycle, and that was my concern there.
(13:29):
So I guess this is it, right? You get into this cycle where you're chasing something that is not really going to ultimately benefit the organization, potentially. And you could argue it's marketing versus sales, like, where's the process, or where's the challenge? But yeah,
(13:54):
to me it also adds an operational risk.
We touched on this a little bit, right? You make mistakes and things like that, which exacerbates the problem.
What I just described, that scenario, that company: is that a common thing that you've seen with your clients, or in the space, where the planning process is based on historical performance, but you know that going forward it's not going to hold?
Speaker 2 (14:15):
Yeah, exactly. So the challenges that you outline, I believe every marketer or marketing operations person has seen. That's just generating more volume of MQLs, and if you know how the system
(14:36):
works, you can always hack your way to more MQLs.
I believe it might be a tactic that works short term, but it creates challenges in the long term, which we already went into.
So I do believe the opportunity is to look at the full funnel, or
(14:56):
definitely up until the pipeline, so deeper into the sales cycle, and from that really understand how to generate the right MQLs, the ones that generate opportunities, so you don't end up with vanity
(15:19):
MQLs, which effectively deteriorate the quality of the marketing.
So I believe there is also an opportunity for anybody in marketing operations to step up and take more of a revenue operations lens, and not necessarily try to defend marketing, but really objectively look at
(15:40):
what's going on across the funnel and where improvements can be made, without a bias, because the goal is not to assign blame; it's more to really understand what the challenge is and how to resolve it.
So marketing operations teams or individuals often have the opportunity to take on that neutral perspective and also
(16:03):
elevate their profile, quite frankly, by bringing in those data-driven analyses and conversations.
Speaker 1 (16:12):
Yeah, I think this hits on two points that I try to encourage people in marketing ops on. One: understand the context of how marketing fits into how your company goes to market and how it makes money.
If you can understand that better, it will give you a different lens into prioritization and how
(16:34):
you make decisions about how you, you know, support or don't support things that you're asked to do in marketing operations, or come up with better alternatives.
The second is this idea of understanding how other teams work.
So, building relationships, and maybe even doing shadow days or whatever with your counterparts in, say, sales operations or customer success, or sales even.
(16:56):
I think there's a lot of value in doing that.
If you don't understand that other stuff, it's really easy to discount the challenges that they're going through and think that you're the only one who has challenges.
So, off my high horse for a minute here.
(17:20):
So I guess, really, what we've talked about, right, is that this reliance on historical data and analytics is part of the challenge that we have.
And then, if I understand right, you and I have talked before about your belief that we could now, and maybe this is because of AI or other tools,
(17:41):
incorporate predictive analytics into what we're doing in terms of campaigns and tactics, with a goal of improving whatever key metric, or set of metrics, we want.
So maybe first, for our audience who may not be familiar with the term predictive analytics, could you
(18:01):
provide a definition, and then maybe explain how it can help teams that are planning, or actually executing on campaign projects or tactics, to get the right performance? It's a long-winded question, but go ahead and take a shot.
Speaker 2 (18:23):
Yeah, sure. So predictive analytics is really a technology, or a way of analyzing data, that is forward-looking in nature instead of backward-looking: making predictions on the outcomes of your campaigns based on historical data. And, of course, with the integration of
(18:48):
AI, or with the foundational model for this being AI-based, there are two sides to why it works, or why it's being adopted very quickly by different organizations at the moment. One: with AI, you can integrate data
(19:10):
much faster.
All the manual integration that normally makes it very challenging and complex to come up with holistic reporting can be done much faster now.
And then, by looking at what has worked in the past, and by aggregating more data than before, you can look at
(19:31):
opportunities, or closed-won opportunities from the past that were generated through marketing, and create lookalike audiences, and have those lookalike audiences be based on the criteria that your database segmentation captures.
But even beyond your database segmentation, let's call it
(19:53):
dynamic segmentation informed by AI, it would allow you to identify, in a much more targeted and better way, audiences that tend to convert really well, and the messaging that speaks to those audiences in a way that effectively enhances their
(20:17):
funnel conversions.
And with that resonating messaging, and by identifying the audiences, you can then build your campaign plan around that.
So, yes, there is an element of knowing what worked in the past, but you can also become much more hyper-focused and
(20:40):
personalized by specifically addressing those audiences and building campaigns and channels out that speak to them.
So, again to the previous point, you don't necessarily generate more MQLs, but you generate the right MQLs that have a high likelihood of converting, which is then very likely to get the organization to their revenue target as a whole.
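A loose illustration of the lookalike idea Clemens outlines; this is not Algo Marketing's actual method, just a minimal sketch that scores leads by firmographic similarity to hypothetical past closed-won accounts:

```python
# Hypothetical closed-won profiles; attributes and values are invented.
closed_won = [
    {"industry": "saas", "size": "mid", "region": "emea"},
    {"industry": "saas", "size": "mid", "region": "na"},
    {"industry": "fintech", "size": "mid", "region": "emea"},
]

def lookalike_score(lead, wins):
    """Average fraction of attributes the lead shares with each past win."""
    per_win = [sum(lead[k] == w[k] for k in w) / len(w) for w in wins]
    return sum(per_win) / len(per_win)

leads = [
    {"industry": "saas", "size": "mid", "region": "emea"},
    {"industry": "retail", "size": "ent", "region": "apac"},
]

# Rank leads by resemblance to past wins; a real system would use a
# trained model and far richer features, but the shape is the same.
ranked = sorted(leads, key=lambda l: lookalike_score(l, closed_won),
                reverse=True)
```

The design point is that scoring against won deals, rather than raw MQL volume, biases the funnel toward "the right MQLs" rather than more MQLs.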
Speaker 1 (20:59):
Yeah.
So, at the risk of using an analogy that might not come across well: it's like going from using a shotgun to a sniper rifle, right?
All right. So I'm familiar with the term predictive analytics, going all the way back to my first start in marketing, database marketing
(21:19):
at a large organization where we had what today would be called data scientists doing predictive modeling.
So I think the challenge people would run into, and maybe you've seen this, is that any predictive model is going to have assumptions built into it, right? And if the
(21:41):
assumptions are flawed, the model itself will be flawed in its predictive capability.
What I would tend to want to do if I was doing a predictive model, say I was building it on my own, is at least, say, three scenarios with different assumptions built in, so you can kind of see
(22:04):
, are they consistent or inconsistent?
What's the range of outcomes that we might expect, as opposed to: this is exactly what we expect if we do this thing?
So do you have that same kind of viewpoint? And I'm thinking really more of human-based predictive modeling, as opposed to, we can get into AI here.
(22:24):
I'm curious to hear that.
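Michael's three-scenario suggestion can be sketched as working backwards from a pipeline target under several assumed conversion rates and reporting the range rather than a single prediction; all figures here are invented for illustration:

```python
def mqls_needed(pipeline_target, avg_deal_size, mql_to_opp_rate):
    """Work backwards: how many MQLs does this pipeline target imply?"""
    opportunities = pipeline_target / avg_deal_size
    return opportunities / mql_to_opp_rate

# Three assumed MQL -> opportunity conversion rates (hypothetical).
scenarios = {"optimistic": 0.12, "base": 0.08, "pessimistic": 0.05}

target, deal_size = 2_000_000, 50_000  # illustrative figures
estimates = {name: mqls_needed(target, deal_size, rate)
             for name, rate in scenarios.items()}

# The spread between scenarios, rather than one point estimate, is
# what tells you how big a planning gap your assumptions are hiding.
```

Planning to the pessimistic end of the range, as Clemens suggests below with "the worst outcome prediction," is the conservative reading of such a spread.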
Speaker 2 (22:26):
Which is exactly why the way to go is with a human-led and AI-assisted approach.
Because the overarching knowledge that a marketer or a marketing operations professional has aggregated over the years
(22:47):
is critical to success and may not be taken into account by AI, or there might be some false assumptions that the model needs to be trained out of.
And what I think the near future will be is that, instead of necessarily expanding our teams
(23:07):
exponentially all the time, which is difficult anyway because the budgets don't support it and people are very overstretched, the realistic scenario will probably be that there will be different AI models that act as the direct reports of marketing and marketing operations professionals. And, just as with any new person on your team, you would need to train those
(23:29):
models as well and make them better at what they do.
So the feedback there is very important, and I definitely agree that human expertise is critical, especially when it is about outcome projections.
(23:50):
For example: are we actually looking at the right metrics here? What really constitutes success in the long run, beyond our annual KPIs, let's say, or OKRs?
Those are important questions that the seasoned professional will be able to answer, with a thorough enough understanding to inform the
(24:14):
AI model to behave in line with that, right?
So that's where the general responsibility comes in.
So, in short, as professionals we need to take into account that it might take some training time.
I would always go with the worst outcome prediction.
Speaker 1 (24:37):
Yeah.
Speaker 2 (24:38):
And not with the best
outcome prediction, to be on
the safe side.
Speaker 1 (24:42):
No, I agree.
It's interesting that you drew this analogy of training the model being not all that different from training a human; I'd never thought of it that way. But it seems to me the risk is that there'd be a fine line between training the model and ignoring
(25:03):
the model, or totally discounting the model's output because we didn't like it, through our own biases or our own fear or whatever.
That literally just occurred to me, so it's bouncing around in my head right now, but it seems like that would be a challenge as well.
So, I think I've heard this from folks at Algo
(25:30):
Marketing before: the term next best action.
It seems like that's the outcome you're striving for from predictive analytics, whether it's AI-driven or not.
I have in my head what I think next best action means, but what is it? How do you think about it?
(25:51):
And if you have any examples of where that's come into play, I'd be curious to hear about them.
Speaker 2 (25:58):
Yeah, sure. I think there is a head start when it comes to next best actions on the sales side at the moment, because it's just very easily applicable
(26:20):
to the sales side.
The way I would explain it is that sales have to follow up with a lot of MQLs, or have to do a lot of outbound prospecting, and they practically don't have the time to look at every lead individually, see what industry they're from, what decision-making level, and all the other criteria that are important, and do a custom follow-up for that lead.
(26:43):
There's just not the time to do personalized follow-up, and so there tend to be these standard outreach cadences that MQLs are put into.
Speaker 1:
Oh yeah, I get them all the time.
Speaker 2:
Yeah. And also, there's often a disconnect from marketing; it's not always clear what the leads experienced on the marketing side, so the sales follow-up can be disconnected from the marketing
(27:06):
side and what the actual experience was for that person, and that creates friction, of course, in the customer experience, or in the user journey of the prospect.
And that's where a next best action model comes in. What it can effectively do, and there's again the predictive analytics aspect
(27:30):
to it, is understand who this person is, what that person experienced, what that person is looking for, based on lookalike audiences that have become customers in the past, and then create a path for them that is personalized to a much higher degree than the standard outreach cadence.
So what that would look like is: instead of an SDR getting a lead and putting it into a standard
(27:53):
Salesloft cadence, it would pre-populate a cadence, outreach email by email, and effectively tell the SDR, okay, based on lookalike criteria, what has worked in the past, and personalized
(28:13):
messaging, you should follow up with these 10 or 20 leads today, or in the next hours or so. Then they would go into those MQLs, open them, and open the Salesloft window for them, or any kind of interface that allows them to directly message them, and that would already be
(28:36):
pre-populated.
So, instead of coming up with a personalized message from scratch, it would take loads of information into account, even search online about their company, what they're going through at the moment, and what their priorities are, and then pre-populate a message that seems relevant. The sales rep can, of course, edit or change that message.
(28:56):
But that makes it very quick and repeatable to reach out to prospects in a way that is way more personalized than the standard Salesloft outreach cadences.
So of course, that use case is very sales-centric, on that
(29:18):
one-to-one communication, but it kind of shows the potential for making personalization scalable.
On the marketing side, if we translate that concept, it probably won't be to the degree that every person has their own personalized journey.
But through AI generation
(29:41):
of assets and copy, you can generate many more variations of one campaign or one email than you are able to with people resources, or just on your own.
So what that would mean is that, instead of one email, you can maybe have 20 or
(30:05):
30 emails, which are all auto-segmented.
The first draft is robo-built, from a campaign perspective but also from a copy and asset perspective, and then, and I would refer again to the manager analogy, instead of being the doer who has to provide all the copy or
(30:25):
build all the campaigns, the person managing that process would become more the manager of the process, rather than having to build everything from scratch: reviewing the copy, QAing the campaign before it goes live.
That would enable the next best action model to be translated to the marketing side, where the timing is also adjusted:
(30:47):
which leads are the priority leads, and what do they need to experience and see next, in order to make the conversation very relevant and convert them down the funnel?
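As a rough sketch of the next-best-action queue described above (not the actual Cisco or Salesloft integration): rank leads by a blended score, surface the top few for today, and pre-populate a draft the rep can edit. All names, weights, and data are hypothetical:

```python
# Hypothetical lead records; "score" stands in for a predictive
# model's conversion likelihood.
leads = [
    {"name": "Ada", "company": "Acme", "score": 0.9, "days_since_touch": 40},
    {"name": "Ben", "company": "Birch", "score": 0.6, "days_since_touch": 3},
    {"name": "Cy", "company": "Cobalt", "score": 0.8, "days_since_touch": 21},
]

def priority(lead, staleness_weight=0.005):
    # Blend model score with how long the lead has gone untouched,
    # so high-fit but neglected leads float to the top.
    return lead["score"] + staleness_weight * lead["days_since_touch"]

def next_best_actions(leads, n=2):
    """Return pre-populated draft openers for today's top-n leads."""
    queue = sorted(leads, key=priority, reverse=True)[:n]
    return [
        f"Hi {l['name']}, following up on {l['company']}'s recent activity..."
        for l in queue
    ]

drafts = next_best_actions(leads)
```

The rep stays in the loop, the human-led part: the drafts are starting points to edit, not messages sent automatically.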
Speaker 1 (31:04):
So I want to get back into this sort of robo-build stuff, but I have a follow-up question first.
I love the idea of what you're talking about on the sales side. My experience has been mostly with
(31:24):
sort of large-ticket items that have a relatively long sales cycle.
One of the things I've found is, whether it's through churn and turnover in the sales team, or just because there's more of a focus on things that are near term to close, the ongoing, I won't use the word nurture, but from a sales standpoint, attention to opportunities that are in early stages doesn't seem to get a
(31:46):
lot of focus or activity, and so I could see a lot of value in something like that.
So do you think that same model applies well both for sales processes, or customer buying cycles, that are really long, because you have an expensive product and it's a long-term kind of thing, and for those that have a shorter one?
Speaker 2 (32:08):
Yes, that is a great point and question.
So definitely, I would say a difference is, when you have a kind of high-velocity, lower sales price or annual contract value, then it tends to be more
(32:32):
of a volume game, and therefore of course you have a lot of data to inform your decisions and to hyper-personalize and so on.
When we go more into the high-ACV, or annual contract value, products, with longer sales cycles, and probably also larger
(32:52):
enterprises as the target customers, I think it probably moves a little bit, in reality, from next best actions, or, the next best action can still be there, but I think it would be more on a recommendation basis.
Because often the salespeople involved
(33:15):
in that are very senior sales executives who have a lot of experience in that particular industry, maybe even long-standing relationships with the prospects they're working with.
You probably want to give less direction; you want to give more relevant insights.
So what that means is you aggregate a lot of information,
(33:37):
and I think that's what account-based marketing is often trying to do, but it's very difficult with traditional methods, because it's very easy to aggregate information, but the quality of the information might not always be there.
Again, I would apply a similar analogy: a sales executive
(33:59):
training assistants, so to speak, on aggregating information, and on what good information looks like and what bad information looks like.
So they effectively personalize, or specialize, the product, focused directly on their client base and prospect base, over time,
(34:20):
or train models, to provide them with the relevant information that really allows them to move conversations forward with relevant insights.
Speaker 1 (34:32):
I think I like that.
So it's a little less prescriptive and more of a, the word that's coming to my head is a nudge, right?
Yeah, you've got this opportunity that you don't think is going to close for another 12 months, but you haven't interacted with them in six weeks, so you should probably go reach out.
Something like that, is that kind of what you mean?
Speaker 2 (34:53):
Yeah, I mean, that could be.
I would imagine this could even be possible with static reporting, sure, yes. That would be one component.
I think the other component would be to come up with a compelling message. For example, you get an insight that a decision maker has changed at the company, or there
(35:15):
was a reorg, or there was a shift in their go-to-market, or anything that might be important there.
Without having to go through everybody's LinkedIn profile and so on, right, or reading the news yourself, you can get relevant talking points for why you're re-engaging, instead of just re-engaging.
Speaker 1 (35:37):
Got it.
Okay, that makes a lot of sense. I get it.
So, going back to this idea of, I think you used the term robo-build, of multiple variations of stuff, can you talk a little more about that? Are you actually doing this with any of your clients today, and what does that look like?
Speaker 2 (35:58):
Yeah, so yes, the
next best action on the sales
side is something we recentlyimplemented with Cisco and, yeah
, that was oh, yeah, or is goingvery well.
I'm not sure how much I'mallowed to reveal about the
numbers, but uh, yeah, like that, that um is a successful pilot
(36:20):
program for them that they're looking to scale now.
The use case is an inside sales team where each sales rep has thousands of accounts, and the challenge is: how do you see the individual trees in a forest that large, right?
And so for them,
(36:44):
it basically is a way to prioritize which leads or which accounts to follow up with or reach out to, and with what message, and to streamline that process, almost coaching and mentoring them through the sales cycle.
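One way to picture that prioritization across thousands of accounts is a simple composite score per account. This is a sketch under invented assumptions: the signal names (intent, engagement, fit) and the weights are placeholders, not Cisco's or Algo Marketing's actual model.

```python
# Score each account on a few signals and surface the top handful per rep.
# Signal names and weights are illustrative placeholders.
WEIGHTS = {"intent": 0.5, "engagement": 0.3, "fit": 0.2}

def rank_accounts(accounts, top_n=3):
    """Return the top_n accounts by weighted signal score, best first."""
    def score(acct):
        return sum(w * acct[name] for name, w in WEIGHTS.items())
    return sorted(accounts, key=score, reverse=True)[:top_n]

accounts = [
    {"name": "A", "intent": 0.9, "engagement": 0.2, "fit": 0.8},
    {"name": "B", "intent": 0.1, "engagement": 0.9, "fit": 0.4},
    {"name": "C", "intent": 0.7, "engagement": 0.7, "fit": 0.7},
    {"name": "D", "intent": 0.2, "engagement": 0.1, "fit": 0.3},
]
print([a["name"] for a in rank_accounts(accounts, top_n=2)])
```

A rep with thousands of accounts then works down the ranked list instead of scanning the whole forest.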
That is one key example of how we, specifically at Algo Marketing, implement next best actions.
(37:10):
On the marketing side, I think you kind of start with insights: actually look at the data, and aggregate all the marketing data that sits
(37:34):
across the organization, and not only have the performance results as outputs, but also an AI-generated narrative to them, so that you not only know that MQLs are going up or down, but why they're
(37:56):
going up or down.
Or you can look at a specific region and get nuanced explanations, and drill further down for additional explanations, and so on.
So that gives executives and regional marketers a lot of insight into what's happening and why it's happening
(38:17):
in real time, instead of having to do manual analysis.
The analysis is much more custom and generated on the spot, and there is also the data storytelling aspect to it.
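A toy version of that "not only what, but why" narrative: aggregate MQLs by region, find the region driving the change, and emit a templated sentence where the real system would generate a fuller AI narrative. The region names and numbers here are invented.

```python
def mql_narrative(prev, curr):
    """Explain an overall MQL move by attributing it to the largest per-region change.

    prev and curr map region -> MQL count for two periods; the templated
    sentence stands in for the AI-generated data storytelling layer.
    """
    total_prev, total_curr = sum(prev.values()), sum(curr.values())
    delta = total_curr - total_prev
    direction = "up" if delta >= 0 else "down"
    # The "why": which region moved the most between the two periods.
    driver = max(curr, key=lambda r: abs(curr[r] - prev.get(r, 0)))
    driver_delta = curr[driver] - prev.get(driver, 0)
    return (
        f"MQLs are {direction} {abs(delta)} period-over-period "
        f"({total_prev} -> {total_curr}); the biggest driver is {driver} "
        f"({driver_delta:+d})."
    )

print(mql_narrative(
    prev={"AMER": 120, "EMEA": 90, "APAC": 40},
    curr={"AMER": 100, "EMEA": 95, "APAC": 45},
))
```

Drilling down would repeat the same decomposition one level lower, by channel or campaign within a region, for the nuanced regional explanations Clemens mentions.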
And then you are again more in the role of verifying: does that sound accurate, or is there something
(38:38):
that the model doesn't take into account?
Of course.
But starting with insights is a common approach, and then you move from insights to what we call direction: based on what you're seeing, and what's working or not working, what should you be doing next?
So this would be very helpful for any marketing managers who are in the position of planning their programs, planning their
(38:59):
campaigns, to get recommendations of what they should be doing next.
So that sits mostly on the analysis side.
The opportunity, or the way you can take it full circle, is to integrate it into marketing automation, so that from insights and direction you actually get into action,
(39:25):
where recommendations are actually implemented for you, where the audiences are pulled in, where the emails are generated for you, where the campaigns or programs are generated for you, and you then review them.
That is probably the deepest level of integration, which is why we see organizations often start out with the insights and
(39:46):
direction first, before they move into the action element of next best action.
Speaker 1 (39:50):
And going back to the analysis and insights piece: something that you talked about with me is differentiating between process and funnel-type metrics.
I'm not sure I totally understand the difference between those two and why that's an important distinction, so can you elaborate on that?
Speaker 2 (40:10):
Yes, I feel
like, especially in marketing operations, that distinction can be quite helpful.
Imagine a quadrant where the x-axis, the horizontal line, would be high impact
(40:36):
or low impact on the business.
So let's say you get your operations requests or tickets in.
You can sort them by whether the effect on revenue, or on any MQLs or pipeline that would be generated,
(40:58):
that would be a big impact or a low impact, right?
And then the other side of the quadrant, the y-axis, would be: is it easy to do or hard to do?
So ideally, if easy is on top and high impact is on the right, what you probably want to prioritize is requests that are easy to do and have a high
(41:21):
impact on the organization.
And I think that's probably how I would outline the difference between process and metrics: from a process perspective, something can be easy to do but not have a high impact on the organization, and the metrics side is about how what you're doing impacts the metrics, right?
(41:41):
And when I was more on the marketing operations side, managing that part of the business as well, that's at least how I tried to manage the volume of requests: by acknowledging that, yes, these are things that should be done, but they just don't have the impact, or they're just very hard to do, and that's why they should probably be deprioritized.
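The quadrant Clemens describes translates directly into a tiny triage function. A minimal sketch, assuming 1-to-5 scores for effort and impact and arbitrary cutoffs; real teams would calibrate these themselves.

```python
# Impact/effort quadrant for ops request triage: easy + high impact first,
# hard + low impact last. Thresholds and example tickets are illustrative.
def quadrant(ticket, easy_max=2, high_min=4):
    """Place a ticket with 1-5 effort/impact scores into one of four buckets."""
    easy = ticket["effort"] <= easy_max
    high = ticket["impact"] >= high_min
    if easy and high:
        return "do first"
    if easy:
        return "quick win, fit it in"
    if high:
        return "plan properly"
    return "deprioritize"

tickets = [
    {"name": "fix lead-scoring sync", "effort": 1, "impact": 5},
    {"name": "rename 400 fields", "effort": 5, "impact": 1},
    {"name": "rebuild attribution", "effort": 5, "impact": 5},
]
for t in tickets:
    print(t["name"], "->", quadrant(t))
```

The "rename 400 fields" ticket is the kind of request that should be done in principle but lands in the deprioritize bucket.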
Speaker 1 (42:04):
Well, or you could take that and say: you want us to do something that has a high cost, or level of effort, but we all agree it's going to have a big benefit.
Maybe there's a way to do it that gets close to the same benefit for a much lower cost, right?
This is where you can get into a sort of consultative approach to how you react to those requests.
(42:24):
Okay, absolutely, yeah, that makes sense.
So let's kind of wrap up here.
One of the things that I think is a challenge for a lot of people, and you and I kind of hit on this at the beginning, is that very often we're asked in marketing ops to do analyses, and
(42:48):
we turn those around fairly quickly.
If we're lucky, there's a data science team, but usually the data science team is tied up with senior executive requests, and we don't really have those kinds of cycles with them.
So, you used the term, I think, democratization of
(43:10):
some of these capabilities.
How is that going to help our audience, who are being asked to do more and more analyses and insights, with larger and larger volumes of data, and sometimes without the time or the skills to do those?
Speaker 2 (43:27):
Yeah, that's exactly a great point.
And I know the word democratize is probably a little bit overused, but the scenario to imagine is this: let's assume your organization is taking steps towards implementing insights, so that through the interface you're prompting
(43:52):
this engine to produce reports for you, with explanations, or data storytelling narratives, around the trends that you're seeing.
That basically allows everybody to have a data analyst in their pocket, or,
(44:13):
as an agent working for them, right?
And that was exactly the challenge that I experienced: access to analysts.
Because data pulling takes so long, and because of the data storytelling element, what naturally happens is that the analyst resources are reserved for the very senior
(44:35):
folks in the organization.
By having access to this AI analyst, everybody from a specialist level upwards can do their own analysis, or impact analysis, that maybe they didn't have access to before.
So, like we were speaking about before, whether you should or shouldn't leverage data to inform your
(44:58):
decisions: a challenge that I had, and that many people have, is they just don't have access to this real-time data, or they don't have the time to analyze it and inform the decision making.
But by having that real-time capability of understanding what's going on and why it's happening, it allows everybody
(45:19):
really to come up with their own analysis, and to articulate it in a way that is understood by your manager, your manager's manager, and their manager.
So what I believe will happen is that it doesn't matter so much anymore how senior you are in the organization.
(45:39):
It enables bottom-up communication: if you have a really good insight, you are able to communicate it upwards very easily, and it doesn't get lost.
There isn't this disconnect anymore, where senior management is looking at data and everybody else underneath is just
(46:00):
executing and focusing on process only.
Speaker 1 (46:04):
Yeah, I'm very hopeful for AI tools enabling faster, more repeatable, more, I don't want to use the word elaborate, but maybe more complete, I'm not sure what the right word is, analysis, so that people can focus on the storytelling piece of it.
(46:27):
And then also, naturally, what happens with a lot of these things when you start reporting this stuff out is that there are follow-up questions, right, to either drill a little deeper or go somewhere tangential to that.
That's just another thing that adds time drawn away from, say, other priority items, and so that's what I
(46:48):
really hope comes out of this.
I still believe we need to encourage people in our listening audience who are in ops: if they have not gotten familiar with analytics terminology, data management, statistics, they still should, because I think that's still going to be a critical skill even with these tools.
So again, off my soapbox, whatever, but it's a message
(47:14):
that I feel is really, really important for our audience to hear on a regular basis.
All right, so we covered a lot of ground, Clemens.
I know we're kind of up against the clock here, but any final thoughts about all this that we haven't covered yet?
Speaker 2 (47:41):
Yeah, maybe leading on from what you were just saying: I definitely see that the ability to have real-time data insights, and storytelling tools around those data insights,
(47:49):
can be a massive accelerator to
everybody's career, because what you're suddenly doing is speaking business language instead of process language: familiarizing yourself with those business metrics and the terminology around them, and with how you communicate success on a business level rather than, let's say, on an operational
(48:12):
process level.
That, I think, is the opportunity, or the shift, that allows any marketing professional or marketing operations professional to accelerate their career most quickly, because executives will never know how Marketo works, or how a marketing operations platform works.
What they care about is outcomes and outputs and,
(48:36):
of course, the quality of those outputs as well.
That, I think, is the opportunity to elevate one's profile inside the organization: by speaking the same language that business executives are speaking.
Speaker 1 (48:52):
Yeah, it becomes an and decision, not an or decision, right?
I could either learn more about process and marketing technology, or I can learn more about data and analytics and storytelling.
I think now it becomes: I can do both.
Yes, exactly.
Which is great.
I love that.
(49:14):
So thank you, Clemens, for sharing and for staying up late on a Friday.
I appreciate that.
If folks want to connect with you, or learn more about what you're doing at Algo Marketing, what's the best way for them to do that?
Speaker 2 (49:24):
Yeah, please send me an invite request on LinkedIn.
You can, of course, contact Algo Marketing as well if you're interested in predictive analytics or next best action, or to hear anything more about that.
Please contact us through algorithmmarketingcom if you want to have a chat with me.
(49:45):
Yeah, I welcome anybody that would like to have a chat with me.
Please send me an invite through LinkedIn, quickly mention that this is how you heard about me, and I'll look forward to chatting with you.
Speaker 1 (50:02):
Fantastic.
Thank you for that.
Thank you again to our audience for continuing to support us and provide your feedback and ideas.
As always, if you do have a suggestion for a topic or a guest, or want to be a guest, reach out to Naomi, Mike, or me, and we would be happy to talk through that with you.
Until next time.
Speaker 2 (50:21):
Bye, everybody.
Thank you, bye-bye.