Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Hello everyone.
Welcome to another episode of OpsCast, brought to you by MarketingOps.com, powered by all the MO Pros out here.
I am your host, Michael Hartman, again flying solo.
One of these days we'll get Mike and Naomi here.
As I mentioned before, Spring Fling is around the corner, in May of 2025.
We're recording this in early April.
(00:23):
So if you haven't checked that out and you're interested in joining, you should go take a look.
But joining me today is Eric Westerkamp.
Eric has 20 years of executive leadership spanning sales and entrepreneurship, and he understands what it takes to drive B2B growth.
His foundational strength lies in building world-class enterprise sales teams: recruiting, developing, and
(00:43):
motivating them to succeed.
As CEO of CaliberMind, the leading B2B marketing data analytics platform, for over six years, Eric now applies his deep go-to-market knowledge to empower marketers with the data clarity needed to partner effectively with sales and prove their impact.
Building on prior executive roles at FrontSteps, OpenText, and EasyLink, Eric is also a compelling public speaker, so
(01:05):
we're glad to have you, Eric.
Thanks for joining us.
Speaker 2 (01:07):
Yeah, thanks, glad to be here.
Speaker 1 (01:09):
Yeah.
I think what we're calling this, the title of this episode, is going to be something like Taming the Data Dumpster Fire, so this ought to be a fun one and will probably resonate with most of our listeners.
Most of our listeners
(01:31):
are operations professionals, and they're out there supporting their CMO or a marketing leader of some sort. They're trying to help with day-to-day operational stuff, getting stuff out to market, but also with helping to tell the story internally about what impact marketing is having on their business.
So just from your perspective, both as one of those
(01:54):
executives marketers are trying to convince, and as someone selling to CMOs and the like: what are you seeing as the major challenges facing CMOs, and maybe the ops teams that support them, in this area?
Speaker 2 (02:13):
Yeah, I'd say there are two challenges.
One is that CMOs are being asked harder and harder questions to justify what they and their teams are doing: I'm giving you X millions of dollars to spend on ads and lead generation; help me understand, as a leader, how that is really having an impact and affecting my business.
(02:36):
And as organizations scale up bigger and bigger, there's a bigger gap between the C-level and the individual business units and what they're doing, so it's more and more incumbent upon the CMO to figure out that story, and not just justify, but really show the organization that marketing is having this major impact, and here's how.
(02:57):
The challenge is that a lot of the other business units are very data driven.
Speaker 1 (03:03):
Yeah.
Speaker 2 (03:04):
Sales is actually a very data-driven organization, typically because it's very easy to get data on what's going on: I made so many calls, I booked so many meetings, I had so many X, Y, and Z, and at the end of the day, I booked so much new business. Very discrete, very easy to see.
Marketing is harder, but marketing is being asked by those
(03:28):
other leaders to tell the story about how their impact is grounded in data, and then to bring that back into the organization in a credible way.
Now, the challenge the CMO has is that the data they get is very messy.
Unlike sales, which has very good, very discrete data, and a lot of other functions, operations, finance, things like that, which have very good data, marketing has huge amounts of data, but it's often very messy.
(03:52):
The second challenge is that they've been asked to tell this story.
That's a challenge in itself, and now they look at the data and information they're trying to tell the story from and, to some extent, it's often just a mess.
That's why we were talking about it being like a dumpster fire.
It's like, here's all this stuff, and I've got to extract a story out of it. It often falls on MOps to be the one that
(04:16):
bridges between that information and data and the CMO, and it's really their job to try to tame that, figure it out, get a story that's credible, and teach the CMO how to tell that story, how to translate.
Speaker 1 (04:25):
It just occurred to me, as you were describing this, that part of the challenge for CMOs might be that because the data is a mess, incomplete, or inconsistent, for a variety of valid reasons, they lack the
(04:49):
confidence that the data can be trusted to tell the story.
I think there's something in there as well that makes it so hard for them to go in and say with confidence, this is what we're doing, knowing that maybe one of the legs on the three-legged stool is going to collapse at any given time.
Speaker 2 (05:07):
You know, a classic example that we've seen is the CMO walks into a meeting, hopefully not a board meeting, and they say, we generated this much pipeline, because that's what the reporting said.
Then the sales leader stands up and says, yeah, but that's 50% more than the actual pipeline we started the quarter with, or built, that we showed in Salesforce.
(05:28):
And then the CMO is sitting there kind of like, well, why is that?
It's probably because of double counting. They had some algorithm that was building things up, and it was double counting revenue, right? And it's often really hard.
That data trust issue is, from our perspective, the core of what you need to start to set up with these organizations
(05:51):
if you want any of that reporting to be believable and for them to then be able to build those stories on top of it.
Speaker 1 (05:58):
Right, it's super important. So one of my beliefs is, yeah, I agree with you, the trust in the data needs to be there.
I think one of the things I've tried to do as a marketing ops leader, whether it's talking to marketing leaders or sales leaders or sales ops or whoever, is lower those expectations,
(06:21):
too. Data quality is not great, and there are a lot of reasons why, again, some valid, some not. And I think what's important is starting to do this reporting anyway. I hate to use the words right or wrong;
(06:41):
it's not that the data is right or wrong. Maybe you put a confidence level on it: I feel 75% confident that this is valid, or directionally correct, whatever phrasing you want to use. But I do think it's really important to start reporting, even when you think the data is not great.
Speaker 2 (07:03):
Yeah, well, the way I would phrase it is: if you don't start to address the data and start to try to figure out the reporting, you may not understand where you need to fix things and where you need to work on things. So starting the project of doing reporting is super important.
(07:25):
We often find organizations that are like, hey, we want to go do something on reporting, but we know our data is a mess, so we want to go fix that first and then we'll come back.
My response to that is often that it almost never works, because unless you couple the reporting with the data fixing, you often don't get it right, because the reporting is
(07:46):
sort of the barometer you use to understand if you're getting this stuff right.
So we say, don't go and try to fix all your data and then come back to this, because you're going to come back in a year, your data is going to be 20% fixed, and you're still going to be in the same spot.
Instead, start small.
Think of a set of reports, of
(08:07):
information, that you can start to generate from this data that are meaningful, but that aren't trying to boil the ocean.
Let's start there, and then let's get the data right. Let's fix the data to support that report.
It may be as simple as, I'm moving my data into a data warehouse now, and I want to rebuild some accurate pipeline
(08:27):
and opportunity reporting over here. Now, this may duplicate what you're getting out of Dynamics or Salesforce or something like that, but you start there, you build that, and you're like, okay, these reports look right.
So now you know a few things.
Now I've got some reporting that marketing can use. Often, depending on the organization, they struggle to pull their reports right out of Salesforce; you can do it over here in your marketing
(08:50):
data warehouse.
But you also know that the base numbers you're working from reflect the numbers that the rest of the organization may be using as a source of truth out of Salesforce or some CRM.
And then you start to work out from there.
Maybe you build a first-touch attribution model.
Again, in my opinion, first-touch models have their uses, but they're not super impactful.
(09:10):
They do tell you some interesting stories about how things are coming in, though, and it's easy to debug them and see if they're right, and you do that before you move to a sophisticated MTA or something like that.
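To make that concrete, here is a minimal first-touch attribution sketch in Python. It is illustrative only, not CaliberMind's model; the touchpoint fields (contact, ts, channel, amount) are invented for the example.

```python
# Minimal first-touch attribution sketch: credit each contact's earliest
# touchpoint, then roll credited revenue up by channel.
# The field names (contact, ts, channel, amount) are illustrative only.
from collections import defaultdict
from datetime import datetime

touchpoints = [
    {"contact": "a@x.com", "ts": "2025-01-05", "channel": "Paid Search", "amount": 10000},
    {"contact": "a@x.com", "ts": "2025-02-10", "channel": "Webinar", "amount": 10000},
    {"contact": "b@x.com", "ts": "2025-01-20", "channel": "Organic", "amount": 5000},
]

# Find each contact's first touch by timestamp.
first_touch = {}
for tp in touchpoints:
    when = datetime.fromisoformat(tp["ts"])
    if tp["contact"] not in first_touch or when < first_touch[tp["contact"]][0]:
        first_touch[tp["contact"]] = (when, tp)

# 100% of the credited amount goes to the first-touch channel.
credit = defaultdict(float)
for _, tp in first_touch.values():
    credit[tp["channel"]] += tp["amount"]

for channel, amount in sorted(credit.items(), key=lambda kv: -kv[1]):
    print(f"{channel}: ${amount:,.0f}")
```

Part of why first touch is easy to debug, as Eric notes, is that every credited dollar traces back to exactly one row you can inspect.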
Speaker 1 (09:21):
Yeah, it's interesting, because I'm with you. Maybe I even shared this when we talked before: as a marketing ops leader, I often would be asked, I need a marketing dashboard. And I'm always like, do you really?
I think what you need is a handful of reports, because if we go spend a bunch of time building a dashboard,
(09:45):
it's never going to answer all the questions.
I'm with you: do a little bit, learn from it, get that right, then go on to the next thing, and so on.
The other thing that I find, and I don't know if you see this too, is very often two things happen when you start to present this data, even if it's a single report.
One is, again, especially at the beginning when maybe the data is
(10:14):
not as clean, there'll be some sort of outlier, and instead of people focusing on the general trend, they see this one thing that just stands out, above or below or whatever, and they start to question it.
That's when I coach people: you need to be prepared for those questions if they come up, being able to explain it the best you can.
(10:35):
The second, if that's not the case, is typically people want to go, I see this, help me understand why. A drill-down kind of thing.
Are you seeing those two phenomena, or any others, when people start doing this reporting?
Speaker 2 (10:48):
One of the things we do when we're working with marketers is really try to help them understand what's going to happen when they bring these reports out to the rest of the organization.
Certain areas of the business, CEOs, finance, are inundated with data and
(11:08):
reporting, and what they figure out how to do very quickly is look for anomalies in reporting and data, because that's where they go focus, right?
My own team will tell you the worst thing is me sitting here looking at a report going, well, how come that number and that number don't sum to that number? Because they're supposed to.
Speaker 1 (11:26):
Right.
Speaker 2 (11:26):
Right.
And then the teams are like, I don't know. And again, you've got to be prepared for that.
Sometimes the answer from a marketer is simple: listen, these numbers are not supposed to equal that number. But they need to be able to tell that story and be credible about it, and they need to know that that question is going to come.
Because the worst thing is, well, I don't know, we'll go look into that.
(11:47):
Then everything else you say about the report is suspect, because now people don't trust your data.
Speaker 1 (11:54):
Yeah, your credibility.
So sometimes this data, especially in marketing, just won't sum, and you want to call that out.
Speaker 2 (11:59):
These numbers are indicative and they're directional.
They're not going to sum, because they're coming from different locations and different data sources, and they're not meant to always equal the number sitting over here in Salesforce.
But here's what they do tell us, and here's a story about what this information is telling you.
Speaker 1 (12:18):
Yeah, and maybe this is another thing people could learn from: I used to take that kind of pushback or challenge personally, and I think people need to realize these questions are reasonable when people are asking them. I mean, there are some people who are assholes, but in
(12:41):
general, they're just trying to understand.
Speaker 2 (12:43):
That's really it.
Yeah, they're trying to understand what's going on.
Speaker 1 (12:49):
Yeah, okay. So one of the things I think this does is start to expose issues.
Sometimes the issue is with the data. In fact, I have a number of stories where I started doing reporting, whether it was in Salesforce or a database or the marketing automation system,
(13:10):
that showed there was some weirdness in the way the sales team was operating, right?
I don't want to get into the details, but the reports in this case, the ones I'm thinking of, were mostly attribution ones, and I'd go, this doesn't make sense to me. It led me not to question the data; I wanted to understand why.
To me,
(13:32):
exposing this data through reporting gives you an opportunity to go fix upstream problems.
If it's marketing, are you always putting UTM codes on the links in your ads?
Do you understand when sales is updating opportunities,
(13:53):
things like that, and why?
Do you see that as something you help with? With CaliberMind and your customers, it sounds like you help them go through some of this process. Is that something you do as well?
Speaker 2 (14:05):
Yeah. When we started building out our application, we were coming at it from an assumption that the customer's data needed help.
I've seen a lot of other products out there in the market, and what they ask is for their customers to fix the data, then put it into the system, and then they'll have
(14:28):
accurate reporting. We decided to take a different approach.
We said, listen, your data is probably going to be kind of messy, so how do we get you to good, accurate reporting when that's the case?
So we built rules engines and data manipulation, deduplication, all this stuff right into our system, so that we
(14:52):
can actually build the rules that really help our marketers get to good reporting. And it may not be that their data is messy because of a flaw in their process or anything.
You can imagine a situation where you were on one MAP, Eloqua, and you migrated to Marketo.
Yeah, it happens all the time, or vice versa, right?
(15:12):
Sure, the data from Eloqua looks different, and maybe you had a different way of naming channels that just changed.
You've migrated, you've matured, and you've named channels differently.
But you now want to run reporting that looks at how things have changed over time, and all these names have changed.
So you need a way of being able to say, I'm looking at that data,
(15:34):
but I want to map all these channels to this new naming so I can run reporting that looks like a continuum.
The data doesn't actually match, but you want to look at this continuum: these channels have gotten better over time, but back here we named them like this, and it's changed here and changed here. If your system can't handle that
(15:54):
type of transformation of data, you're going to end up with gaps, and what's going to happen is those data problems, those mismatches, are going to surface up into your reporting, because suddenly your channels aren't going to be the channels you're reporting on today. You'll see all these other channels from before in a report, and it's just going to look like
(16:15):
a jumble of data. Yeah, it's going to look like a bunch of noise, when actually, if it was refactored to be consistent, you would see something more useful. That's a great way to put it: how do you refactor on the fly, right?
So your system should be able to handle that if you want to get
(16:36):
to some accurate level of reporting and information.
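A minimal sketch of the kind of "refactor on the fly" channel mapping Eric describes. The legacy and canonical channel names here are made up; a real rules engine would make this mapping configurable rather than hard-coded.

```python
# Remap legacy channel names (e.g., from an old Eloqua instance) onto
# current canonical names so reporting reads as one continuum.
# All names below are invented for illustration.
CHANNEL_MAP = {
    "PPC - Google": "Paid Search",
    "AdWords": "Paid Search",
    "Tradeshow": "Events",
    "Webinar-Live": "Webinars",
}

def canonical_channel(raw: str) -> str:
    # Fall back to the raw value so unmapped channels surface visibly
    # instead of silently disappearing from reports.
    return CHANNEL_MAP.get(raw.strip(), raw.strip())

rows = [("AdWords", 120), ("Paid Search", 80), ("Tradeshow", 15)]
totals: dict[str, int] = {}
for raw, leads in rows:
    channel = canonical_channel(raw)
    totals[channel] = totals.get(channel, 0) + leads

print(totals)  # {'Paid Search': 200, 'Events': 15}
```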
Speaker 1 (16:40):
I mean, this sounds like, I hate to call it lightweight, but a type of ETL kind of capability that's built into CaliberMind.
Is that what you're saying?
Speaker 2 (16:51):
Yeah, ELT, right.
So we extract, we load it, and then we transform the data.
Speaker 1 (16:56):
Oh okay, ELT. Okay.
Speaker 2 (16:58):
That's kind of the difference between ETL and ELT.
Yeah, we're basically taking all that raw data, putting it into our system for our customers, and then we move it through these processes to actually help refine and fix it, so that now the data is all stitched together in a way you can report on.
But you need to have rules engines in there, right?
And we take it a step further, where we'll actually allow our
(17:20):
system to take some of those rules we put in, deduplication, data normalization, things like that, and push that back out to these systems.
A great example is titles.
We actually did an analysis where we took 1,800 titles, threw them in, and asked, how many different titles are in
(17:40):
this batch of 1,800?
Something like 80% of those titles were distinct. Whether it was "VP" or "Vice President" or "V.P.", it was just different.
Speaker 1 (17:53):
Yeah.
Speaker 2 (17:54):
So how do you take that and then create a segment where I'm like, I want all VPs of this, I want all executives of that?
We have a rules engine that does that inside of our system, and then I could report on it.
I could have a dropdown of, show me VPs of marketing.
The rules engine is extracting the level and the role
(18:15):
in the department.
Speaker 1 (18:16):
Normalizing, right.
Speaker 2 (18:18):
And everyone was like, can we put that back into Salesforce so that I can do my Salesforce reporting on these same things?
Speaker 1 (18:25):
Yeah.
Speaker 2 (18:25):
And that's where a tool that has these normalizations needs to be able to potentially write this data back out to the source systems: doing lead-to-account matching, doing account deduplication, things like that.
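For a rough idea of what a title-normalization rule can look like, here is a sketch that extracts a level and a department from free-text titles. The patterns are illustrative assumptions, far simpler than a production rules engine.

```python
# Extract a coarse level and department from messy free-text job titles
# so segments like "all VPs of Marketing" become possible.
# The patterns below are illustrative, not exhaustive.
import re

LEVELS = [
    (re.compile(r"\bchief\b|\bc[emot]o\b", re.I), "C-Level"),
    (re.compile(r"\bvice\s*president\b|\bvp\b|\bv\.p\.", re.I), "VP"),
    (re.compile(r"\bdirector\b", re.I), "Director"),
    (re.compile(r"\bmanager\b", re.I), "Manager"),
]
DEPARTMENTS = [
    (re.compile(r"\bmarketing\b|\bmktg\b|\bdemand\b", re.I), "Marketing"),
    (re.compile(r"\bsales\b|\brevenue\b", re.I), "Sales"),
    (re.compile(r"\bengineering\b|\btechnology\b|\bit\b", re.I), "Technology"),
]

def normalize_title(title: str) -> tuple[str, str]:
    level = next((name for pat, name in LEVELS if pat.search(title)), "Other")
    dept = next((name for pat, name in DEPARTMENTS if pat.search(title)), "Other")
    return level, dept

for t in ["VP Marketing", "Vice President, Mktg", "Director of Sales", "V.P. Demand Gen"]:
    print(t, "->", normalize_title(t))
```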
Speaker 1 (18:37):
So, going back a little bit, we talked about how exposing data gives you the opportunity to fix it. Is this part of that, built into the tech ecosystem?
Is that kind of what you're describing, then?
Or at least optionally?
Speaker 2 (18:57):
Optionally, right.
We found a lot of customers had a variety of tools to do this.
They may have purchased a tool to do X fixing this one way over here, but the challenge was that not everybody had it.
The level of maturity in using those tools was highly variable across customers, and so we said we really need to
(19:20):
build this into our model directly.
Our architecture lent itself to being able to create rules and do things to get to this data, you know.
(19:42):
And then, after the fact, we found everybody saying, we want you to take what you've done here and send it out, fix these other systems too.
That was sort of a follow-on that came after they saw what we were doing internally with all the data.
Speaker 1 (19:53):
All right, this is maybe a little bit of a tangent.
When you do that push-back, do you typically overwrite what was there, or do you typically find that they say, push the cleaned-up data into another field?
Speaker 2 (20:07):
It depends on the customer, right?
Usually what we do is we'll actually create reports. Say, for instance, we're going to do lead-to-account matching. We'll actually create a report that says, here's all the leads and here's all the accounts we're going to match; validate that.
So we'll do these steps where we don't actually run
(20:27):
the final step. We give them, here's what our system thinks it's going to do, and we run through a bunch of steps to validate that with customers.
And then once it's validated, we'll actually turn it on.
Other things we'll actually write into new fields. Our data normalization, we write into probably new fields for
(20:47):
customers.
So here's the CaliberMind title: we take the titles, and the first thing we do is normalize them into a standardized format, so everything that says Vice President or VP all gets the same value. Then we run through processes to extract the level and the department, and we'll write those into new fields, potentially.
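Here is a minimal sketch of that dry-run step: propose lead-to-account matches, naively by email domain here, and emit a review report instead of writing anything back. Real matching logic is much richer than this, but the pattern of "show what the system thinks it will do, validate, then turn it on" is the point.

```python
# Dry-run lead-to-account matching: print proposed matches for human
# review; nothing is written back until the mapping is validated.
# Matching by email domain is a deliberate oversimplification.
leads = [
    {"id": "L1", "email": "ann@acme.com"},
    {"id": "L2", "email": "bob@initech.io"},
    {"id": "L3", "email": "carol@gmail.com"},
]
accounts = [
    {"id": "A1", "name": "Acme Corp", "domain": "acme.com"},
    {"id": "A2", "name": "Initech", "domain": "initech.io"},
]

by_domain = {a["domain"]: a for a in accounts}

print("lead_id,email,proposed_account,action")
for lead in leads:
    domain = lead["email"].split("@", 1)[1]
    account = by_domain.get(domain)
    if account:
        print(f"{lead['id']},{lead['email']},{account['name']},match")
    else:
        print(f"{lead['id']},{lead['email']},,no match - review")
```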
Speaker 1 (21:07):
Right, yeah, okay.
That would be my preference if I had the decision on that.
I just don't like overwriting what's there.
Speaker 2 (21:16):
Particularly for titles, we do that. But for lead-to-account matching and deduplication, you'd actually modify the record, yeah.
Speaker 1 (21:23):
Yeah, that makes sense. Those are two pretty significantly different things.
But I can imagine with title: a salesperson got a business card, if anybody gets those anymore, and they put in the person's title. That's actually what the person says they are, and then we go in and overwrite it with something different, and then we use that
(21:44):
to send...
Speaker 2 (21:44):
To personalize email, usually.
Speaker 1 (21:47):
Yeah, that makes sense to me.
Okay, so the rules stuff, maybe we come back to that.
There's a piece of this, though, that I still think we haven't really touched on. Longtime listeners and viewers will know this about me: I'm a big believer in storytelling. With this quest to be
(22:08):
data driven, to use that term, like some of the other teams, finance or even sales, I feel like many marketing teams, marketers, CMOs, in their quest to be data driven, to be seen as data driven, have given up on the idea of storytelling.
(22:30):
Maybe it's because I believe the data is just not going to be complete, accurate, high quality, whatever, but I think you need something else to stitch it together, something that helps bridge that, to make it believable.
Are you seeing the same thing? And are you seeing it changing now? Is there sort of a shift back to that? Just curious.
Speaker 2 (22:52):
Yeah, no, we're seeing a lot going on in that space.
I'll say, what we saw first was that marketers, years and years ago, didn't report on a lot of data. Then, as things moved toward digital, websites and things, they started to expose a lot more of that sort of raw data.
(23:14):
Here's how many website hits we got.
Speaker 1 (23:16):
Yes.
Speaker 2 (23:16):
Yep. And at one point in time there was a relatively strong correlation between website hits and downstream leads, but that correlation really fell away over time.
So then they said, look, okay, I've got these. These now became MQLs, which are people who filled out form fields and things.
Speaker 1 (23:34):
Or their lead scoring
became an MQL.
Speaker 2 (23:37):
Or their lead scoring. Form lead versus hot lead.
Speaker 1 (23:39):
Yeah, okay.
Speaker 2 (23:41):
And they kind of
tended to move, gravitate or
just kind of reporting those rawnumbers right.
And I think what's happenedover the last X years is that
the rest of the C-suite, therest of the organization, has
started to believe less and lessof those raw numbers.
And how does that impact thebusiness?
Right, and that's kind of whereI would say the state of the
(24:05):
business is right now.
And then you have compoundingthat.
More and more systems pumpingin more.
I'm not just using the website,I've also got outreach firing
and I've got emails going for mymarketing.
I I met some more and more dataright, it's crazy to get lost
in it.
A lot of these teams brought in, you know, to be honest, like
data science, marketing analysts, people like that.
Those people are really good atanalyzing the data, yes, and
(24:28):
they would say this data meansboom.
But that translation from thedata to that, what the analyst
is saying, to a story that themarketer, the CMO or can tell to
the rest of the organization,I'd say that's where the gap
really is.
And it's hard right, becauseyou can see, like you know, I've
got a data analyst softwarebackground, but I've been in
(24:51):
sales and marketing almost mywhole career, right, and so even
for me it's really hard oftento bridge the gap between the
data and the story.
And we've seen customers nowstart to really migrate to where
they're really more interestedin telling the story than the
data, because they feel thatresonates better.
I'll give an example we have acustomer and we were providing
them these buyer journeys.
(25:12):
So here's accounts that you wonand then we were showing them
here's this that you won.
And then we were showing themhere's a buyer journey of all
the things that happened forthis account and why you won it.
They started turning that intoliterally a.
They would pick out selectaccounts.
They won each quarter.
And they would create thisgraphic, an infographic of the
whole buyer journey, literallyon this thing.
Speaker 1 (25:33):
It's like Candyland.
Speaker 2 (25:35):
Almost exactly like Candyland. On it, they would put, at this point in time, here are the roles, and they interacted with this content, and then these things happened, and these things happened.
Then they would give that to sales, and sales teams would use it as sort of a playbook.
Speaker 1 (25:51):
Where is my?
Speaker 2 (25:52):
account on this pathway?
What have they done? What have they not done, right?
Yeah, I love that. It made us step back and think, how can I do that kind of thing in an automated fashion?
How do I take data into a story?
And I'll be honest, that's when ChatGPT and Gemini and Claude
(26:13):
all started to hit the market, and we realized, along with others, that those systems can't do math. Don't even try; we've tried.
I'm not going to run an attribution model through Gemini any time in the near future. But can it take a buyer journey with a thousand touchpoints and turn it into a credible story that a marketer can then pass to an SDR or BDR?
That's where it's really good.
Speaker 1 (26:36):
What I'm seeing now is that these systems are helping bridge the gap between the data and the story, and that's where they're really powerful. Yeah, what you just described, I think I shared this with you, is really where my view comes from. I'm like you: I'm an engineer by training, though I never really practiced it.
So when I come into this, I think about data, and I understand it sometimes too deeply, and it's easy to
(26:59):
go down a rabbit hole.
But what hit me was, and this was when I had marketing ops and an inbound team: I started having my inbound team track leads that were handed off.
I don't even really know what triggered me to do that; it just seemed intuitively like it would be a good thing to know how that resonated.
And it wasn't that I stopped doing other reporting, but
(27:21):
what I found was, if I just watched the body language in the rooms when I presented back to sales on what we'd been doing, the numbers were fine. Nobody really challenged me on them; this was a very new thing for that particular organization.
But I saw people leaning in when we'd start doing the stories: we got this deal, and this is
(27:42):
what happened, and we quickly got it to the right person.
And I realized that piece of it was building more credibility than the raw numbers.
It's not that there's a problem with the raw
(28:03):
numbers; it's that they have no context.
Speaker 2 (28:06):
Yeah, is it good?
Speaker 1 (28:08):
Is it bad?
How do we know, right? Like, what's happening with our competitors?
I went up 100% or 200%?
Speaker 2 (28:15):
Is that because I went from one to two, or because I went from a thousand to, you know, whatever, right?
Speaker 1 (28:22):
Or is it a technology thing, is it a bug? I mean, that would be the more common question, I think, because they weren't trusting it.
But yeah, website visits: it's been the blessing and the curse of all this digital stuff that you can quickly get insights, but the downside is you're drowning in data. How do you
(28:45):
figure out which ones are important and can inform what you do, the same or different?
Speaker 2 (28:55):
That's the part: are you going to change tactics?
That actually brings up an interesting example we just had from another one of our customers, where they were seeing a drop in their organic search traffic, and they saw this because of attribution.
And again, attribution plays a good role.
(29:18):
It's not the end-all be-all, but it does give you indications of directionality: stuff that's going up, stuff that's going down. They saw this, and they came back and asked, well, why? They started investigating that data, and what they found was that traffic was shifting to direct search from LLMs.
(29:39):
So then what they wanted to know was, which LLMs? And we actually ran a report for them.
It's super interesting: we had a report for that customer showing the increase in traffic coming from LLMs, by LLM. And probably no surprise, it was primarily ChatGPT and Google for
(30:02):
this type of company.
So the story is, I saw some change in the data, I went and investigated, and I found that something really had happened; we're getting a change.
Then what they did was start looking at, how do we optimize content on our websites to help those engines do a better job of getting our message out?
(30:24):
So at the end of the day, they changed tactics, and because they have good reporting, they can come back and see, is that working? Is it continuing to increase?
And by understanding which engines, they could go, okay, I really need to optimize for these two.
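As a sketch of how you might split LLM-driven referrals out in your own web analytics, so a drop in organic search can be checked against a rise in LLM traffic: the referrer hostnames below are assumptions for illustration, not an exhaustive or verified list.

```python
# Classify web sessions by referrer so LLM-driven visits show up as their
# own bucket instead of hiding inside direct or organic traffic.
# The hostname list is illustrative and would need maintaining.
from urllib.parse import urlparse

LLM_REFERRERS = {
    "chat.openai.com": "ChatGPT",
    "chatgpt.com": "ChatGPT",
    "gemini.google.com": "Google Gemini",
    "perplexity.ai": "Perplexity",
}

def classify(referrer: str) -> str:
    host = urlparse(referrer).netloc.lower().removeprefix("www.")
    return LLM_REFERRERS.get(host, "Other")

sessions = [
    "https://chatgpt.com/",
    "https://www.google.com/search?q=b2b+attribution",
    "https://gemini.google.com/app",
]
counts: dict[str, int] = {}
for ref in sessions:
    label = classify(ref)
    counts[label] = counts.get(label, 0) + 1

print(counts)  # {'ChatGPT': 1, 'Other': 1, 'Google Gemini': 1}
```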
Speaker 1 (30:39):
I don't really need to worry about these others. It's so interesting; I was sitting here smiling, because this is not the first time I've heard somebody give that same story of a drop in organic search and a pickup in LLM-based traffic. Actually, earlier today I was working on
(31:01):
something, and I've definitely not totally gone over to just using LLMs for search, but, total tangent, I have found that probably more often than not I am using them. Typically, when I'm going out and searching, unless it's a pretty clear fact
(31:23):
kind of thing, it's like I've got a longer question to ask, or I need to give it more context, and I'm finding that I get much better results.
You're right, though.
Speaker 2 (31:36):
Not so good on math.
In the background, we're constantly running experiments with these systems, and what we found is that when we need the LLM to do math, what we do is tell it what the data is and what we want the output to be like, and have it write a Python program that does that, and then run the program.
Speaker 1 (31:55):
It does that pretty well.
Speaker 2 (31:56):
So it's like, I want to do a projection, I want to do quick predictive analysis of X, Y, and Z. You can't just ask it and give it the data to do that, but you can say, here's the type of data; write a program in Python that does that. Then just run the program and see the output, and that actually works pretty well.
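Here is a sketch of that "have the LLM write the math" pattern. call_llm() is a stand-in for whatever model API you use; it returns a canned script here so the example runs end to end. The point is that the generated program, not the model, does the arithmetic.

```python
# Ask the model for a Python program, then execute the program and use
# its output, rather than trusting the model to do arithmetic directly.
import subprocess
import sys
import tempfile

def call_llm(prompt: str) -> str:
    # Stand-in for a real model call that returns only Python source.
    return (
        "values = [100, 120, 145]\n"
        "growth = (values[-1] / values[0]) ** (1 / (len(values) - 1)) - 1\n"
        "print(f'projected next month: {values[-1] * (1 + growth):.1f}')\n"
    )

prompt = (
    "Here are monthly pipeline totals: 100, 120, 145. "
    "Write a complete Python program (code only) that projects next month "
    "assuming a constant growth rate, and prints the result."
)

generated = call_llm(prompt)

# Write the generated program to a temp file and run it; in a real
# workflow you would review the code before executing it.
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write(generated)
    path = f.name

result = subprocess.run([sys.executable, path], capture_output=True, text=True)
print(result.stdout.strip())
```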
Speaker 1 (32:14):
That's really interesting.
Yeah, and I've heard it does a really good job of generating code.
Speaker 2 (32:21):
We just did an internal training here, for what we call hands-on Thursdays, by business unit, on how they're using LLMs in different ways, and we found some really interesting use cases.
Sales is, of course, using it to do research on companies and things like that before they do outreach.
Very typical.
(32:42):
But our customer success organization is using it to write responses to customers as they come in.
They're also using it a lot for data debugging.
We've got a whole managed services team, and they'll have a customer come in and say, I'm seeing something weird, blah, blah, blah.
They'll actually have the LLM look at the data, compare
(33:03):
it to other things, and highlight anomalies.
So it helps them figure out where to go faster.
Speaker 1 (33:09):
Oh, okay, that's actually really good.
It just occurred to me, because somewhere along in my career I helped a company come up with an approach to enable the organization to do more customer support without growing their staff at the same rate.
(33:29):
Typically that had grown in parallel with revenue growth, and it came down to a methodology that involved a knowledge base and getting that exposed quickly. I can imagine an LLM would be a really good thing to put on top of that. Amazing tools for some of this stuff.
Speaker 2 (33:45):
Right. We're actually building it directly into our configuration stuff.
So a customer can come in and say, how is my configuration different than standard?
Speaker 1 (33:57):
What have I done to...
Speaker 2 (33:58):
...this? How do I set up rules in your system? Because maybe I'm seeing something weird, and you know, these systems get complicated.
What have we done to it? What does it look like?
By the way, I'm having this type of issue. Oh well, we're noticing that in this section you're mapping your account object from this field, but it probably should be
(34:18):
this field. So we're seeing really big usage in that area.
Speaker 1 (34:24):
So, exposing the butterfly effect?
Right, yeah, interesting.
So this AI-based customer journey mapping stuff that's built in, it actually generates a visual out of it?
Is that what it does?
Speaker 2 (34:44):
We can generate a visual right out of reporting.
What we're actually doing for a lot of customers now is, we'll go in and set up a rule in our system that says, every time a lead MQLs, or every time an account gets to this level of engagement, go look at their buyer journey. And that buyer journey
(35:06):
can be 10, 100, 1,000 touchpoints, depending on your organization. Now figure out who the most impactful people are in there, who the core buyers are, who the core people you're interacting with are, what core types of content and marketing events they've been interested in, and then give me a quick summary of the timeline of things that happened. And we'll
(35:26):
actually push that right into, for instance, a field on the lead in Salesforce.
We're also using it with sales engagement data. A lot of our customers have Apollo or Outreach or Salesloft or something like that plugged into their sales system, so all this email content is in there.
We'll look at all that and say, by the way:
(35:47):
are there any next steps I should be aware of?
It'll actually look at that stuff and say, oh, by the way, they've asked for a demo, make sure you do this. This person asked for a security review, blah, blah, blah.
And we actually push all that into Salesforce, or we make it available right now for customers.
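A sketch of pushing an LLM-generated journey summary onto a CRM record. summarize() stands in for the model call, the credentials are placeholders, and Journey_Summary__c is a hypothetical custom field; this assumes the simple-salesforce Python package, not CaliberMind's actual integration.

```python
# Condense a long buyer journey into a short narrative and write it to a
# field on the lead so sales sees the story in context.
from simple_salesforce import Salesforce  # assumes pip install simple-salesforce

def summarize(touchpoints: list[dict]) -> str:
    # Stand-in for an LLM call that turns raw touchpoints into a story:
    # key people, key content, and a timeline of what happened.
    lines = [f"{t['ts']}: {t['who']} {t['event']}" for t in touchpoints]
    return "Buyer journey highlights:\n" + "\n".join(lines)

touchpoints = [
    {"ts": "2025-01-05", "who": "VP Marketing", "event": "downloaded pricing guide"},
    {"ts": "2025-02-02", "who": "Marketing Ops Manager", "event": "attended webinar"},
    {"ts": "2025-02-20", "who": "VP Marketing", "event": "requested a demo"},
]

summary = summarize(touchpoints)

# Placeholder credentials; Journey_Summary__c is a hypothetical custom field.
sf = Salesforce(username="...", password="...", security_token="...")
sf.Lead.update("00Q000000000000AAA", {"Journey_Summary__c": summary})
```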
Speaker 1 (36:02):
That's awesome. Okay, I love that.
Okay, so I don't want our listeners to walk away from this thinking we should stop trying to be data driven, or data informed, as I like to think of it, or that marketing shouldn't do any more reporting. I'm just curious: I think marketing still should be held
(36:24):
accountable and have metrics of some sort. Do you have any thoughts, especially in your role as a CEO, on what you look for from a marketing team? Or what are you hearing your customers and clients say they want marketing to be reporting on?
Speaker 2 (36:39):
Yeah, I'd say that more and more, marketing is being held to a pipeline number. Now, whether it's closed-won or created differs, but I'd say things are shifting to where the organization is really holding marketing accountable for metrics that tie directly to the business, that are measurable.
Speaker 1 (36:59):
Okay, like marketing sourced versus influenced? Marketing sourced revenue?
Okay.
Okay.
Speaker 2 (37:05):
Marketing sourced revenue or marketing influenced, right, depending on where you're looking. We're seeing that a lot.
Now, the challenge for marketers is that that may be the metric they're being held to by the organization, but they really do need to look at a lot of other metrics and indicators to be able to influence that number.
(37:25):
Right. So, beyond just knowing how much you do: website hits may not be as impactful as they once were, but if your website hits are dropping, you want to know that. If they're going up, where are they coming from? What does my organic search volume look like?
So you really do need to, I would say, instrument
(37:46):
your funnel and your different stages.
You're really looking at different metrics that impact each stage. At the top of the funnel, it's really, am I gaining awareness? Well, how do I know that?
If you're a software company, am I showing up on G2? How many website hits am I getting? How many MQLs or MQAs am I creating?
(38:06):
Usually those are more engagement metrics: are people downloading content? Are they looking at things on the website? Are they responding to outbound emails?
You look at those different metrics, but it's really driven by stage, to the point where the metric you're probably going to try to tie your story to is the pipeline metric for the organization.
Speaker 1 (38:24):
Yeah. The way I think about it is, I'll skip the lowest level. The lowest level to me is, is the system working as expected? Is the data consistent, complete, yada yada. That really doesn't go to anybody, typically, unless it identifies a sync problem with another system or something.
But I think what a lot of marketers miss, in this goal to
(38:49):
go to storytelling or even attribution or contribution, is that where they really have impact is on the day-to-day activities they're doing.
Historically they would have reported, we published X number of emails, we did this many blog posts, we did this many ads. And
(39:09):
I think people have gone, oh, those are seen as vanity metrics. But I still think there's value.
You brought up website visits. I agree, you should be watching that. Now, do you need to report that to the C-suite?
Probably not. But do you need to use it as a barometer? Hey, there's a problem we need to get in front of, because
(39:31):
three months, six months, nine months, whatever it is down the road, this could have an impact.
And if you're doing a marketing tactic, an email, an ad, maybe using multiple channels, you should be monitoring that in the early stages like a hawk: is it working?
(39:52):
Because if it's not, you need to cut it, adjust, whatever.
And I see a lot of teams that are moving so fast they're not doing that on the tactical stuff, because I think they've been told it's just vanity stuff.
To me, who the recipient of
(40:13):
that reporting is matters, and the gap is that, yes, you should be doing that, and you should have expectations about it.
Now, again, do you share that with everybody?
No. Do you need to pay attention if you're the demand gen person?
Absolutely.
Speaker 2 (40:29):
Yeah, absolutely.
If you're the CMO, you're probably reporting on, here are my core channels that we work with, their efficiency, whether they're working, and how we're investing in these channels to increase things.
Because usually the CMO talks a little bit about what happened in the quarter
(40:51):
and a lot about what they're doing for the next quarter.
How am I going to help get to X? Well, here are the things we're doing. I've got these channels, I'm making some changes, and we've noticed this, so we're adding here.
But that's a super high-level report, right?
You don't run your business off that, because underneath it, inside each channel, are all these tactics and things.
Speaker 1 (41:10):
You need to know what's happening there so you can actually have an effect on that channel. Yeah, right. And to me, when you start getting into the attribution or contribution thing, that's where you start exposing stuff to others.
I used to be a really big, huge fan of attribution, and I've really cooled on that.
(41:31):
I don't think it should stop, because it's useful for identifying what channels, what audiences, what tactics are most effective or least effective, so it can inform what bets you place.
Speaker 2 (41:48):
Yeah, absolutely, to some degree.
It tells you interesting things from the data that can really help you with your business.
But as a marketer, that shouldn't be the only thing you're doing.
You should have some level of engagement reporting, because engagement tells you something different, and engagement can often give you signals earlier
(42:10):
than attribution.
Attribution is great, but the problem is that it's kind of a retroactive look at things.
Still, without attribution it's hard to create that final story that goes to the CMO, because that's how you tie this stuff back to revenue. That's something you need to do.
But prior to that, when you're kicking off all these tactics...
Speaker 1 (42:31):
They may not, depending on your sales cycle, end up showing up in attribution for 30, 45, 60 days, right?
Well, I have an example: a company I worked for had a deal that closed two years after the first interaction, which was at an event.
Speaker 2 (42:48):
Yeah, right.
But you want to know if those tactics are generating those early metrics, right?
I'm running an event. Did I get enough meetings, like, did we get real meetings
Speaker 1 (43:02):
out of it? Yep.
Speaker 2 (43:04):
Right. And then, if I have meetings, do those meetings have follow-ons?
Because if those early things aren't there, if I'm not seeing any of that, you're never going to see it in attribution.
You want to know that early, because you're like, okay, this isn't working, I want to kill this and start something else. Get these early metrics right, and then they'll start to show up in attribution
(43:24):
later on.
Speaker 1 (43:26):
Well, an event's a good one, right?
If the goal is to get meetings, that should be what you're focused on. Rather than attribution coming back and saying we shouldn't do that event again, what it may be telling you is that the way you were present at that event was wrong.
You didn't have the right people in the booth, you didn't
(43:48):
have, yeah, that kind of stuff.
Speaker 2 (43:50):
And so, prep for it, and have people try to set meetings before they got there, or whatever it was.
Yeah, yeah.
Speaker 1 (43:54):
So I think that's why you need not just one, right? There's a basket of things.
Anyway, okay, we've covered a ton of ground in a short period of time, but is there anything else we haven't covered that you think we should share with our audience?
Speaker 2 (44:12):
No, other than I think a lot of what's happening with AI is causing a huge amount of, I wouldn't say concern, but confusion in the market.
What's real, what's not real, what can I do with it, what can't I do with it, right?
(44:32):
And then, and this is almost a completely different topic, you also have marketing teams that want to leverage these technologies and are running smack into their own legal departments: you can't touch this stuff.
So I think the thing I'd like to say is that the AI stuff is real.
It has some very strong power in certain areas.
You need to make sure that you're focusing in the right
(44:52):
areas.
What we're seeing is that the organizations that are starting to leverage it early are starting to gain advantage over organizations that don't.
Their teams are getting more efficient. They're getting faster.
If I have an SDR who can cover 20% more accounts per week,
(45:13):
that's an advantage.
I either have to spend less on SDRs or I get more out, whatever that is.
So companies really need to start thinking about that stuff now, but they also need to keep a very critical eye on what it can do, where it's good, and really pick projects that nail where it can have the most impact on the business.
Speaker 1 (45:33):
Yeah, I think we've got somebody potentially coming on in the near future to talk about the difference between AI, machine learning, and automation, all terms that are out there with different meanings but that get conflated a lot.
And I know, from my own journey, I was a relatively slow adopter, but now I'm
(45:54):
finding daily uses for it in different ways.
So, Eric, lots of fun.
If folks want to hear more about you, what you're doing, what's going on at CaliberMind, what's the way for them to do that?
Speaker 2 (46:11):
Well, check out our website, www.calibermind.com.
You can reach out to me directly; I'm on LinkedIn. I think I'm /EricWesterkamp on LinkedIn.
Those are probably the two best ways to get information on what we're doing.
Speaker 1 (46:27):
I appreciate it. Well, Eric, again, thank you, it's been a lot of fun.
I get excited about this stuff, so it's hard for me to control myself.
Thanks again to our audience for always continuing to support us.
If you have suggestions for topics or guests, or want to be a guest, feel free to reach out to Naomi, Mike, or