Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Anthony Karls (00:00):
All right, here we go. This is Revenue Roadmap, where we talk about sales and marketing for local entrepreneurs. My name is Anthony Karls, president of RocketClicks, and I am with Mr. James Patterson again. So thanks for joining me again.
James Patterson (00:13):
Yeah.
Anthony Karls (00:17):
Paid media topic today: we're going to talk about testing versus optimization. So, James, before we get into that, hit me with something interesting about what you do outside of work.
James Patterson (00:34):
Outside of work, I'm a pretty avid runner. Back in the COVID time period, I think a lot of people were looking for new ways to spend their time, and I really found myself missing the gym. So I wanted to find a way to stay fit and be active, and I started running outdoors. It
(00:54):
turned into a passion of mine. So I've been...
Anthony Karls (00:58):
You do marathons, half marathons, or do you just run for fun?
James Patterson (01:01):
Mostly for fun right now, but I have aspirations to do something more serious one day. An Ironman is in my ten-year picture, I think, as something I want to get to, but there are definitely a lot of milestones leading up to there. So I'll probably do a marathon in the next year or two and then go from there.
Anthony Karls (01:18):
Okay, time for honesty. When you and Lisa run, who wins the belt most often?
James Patterson (01:27):
So I'll give Lisa credit on the short distances. She's very speedy in those quicker ones. I definitely have a little bit more of the endurance, so sometimes on the longer runs I'll beat her out, but she definitely has my number when it comes to the shorter distances, for sure.
Anthony Karls (01:43):
I mean, at least you can claim one of them. When I do anything with Chelsea, if it's over like four or five hundred meters, not a chance. She was a swimmer in college. She just kicks my butt the whole time. Embarrassment at the CrossFit gym for me.
James Patterson (02:01):
Nice.
Anthony Karls (02:03):
All right, so let's talk about what we mean here. What's the difference? What are we talking about when we say testing versus optimization? Tell us generally what we mean by that.
James Patterson (02:16):
Yeah. So the biggest distinction between testing and optimization, since both terms get thrown around a lot in our industry, is really understanding the impact and the degree of the change you're evaluating. In both testing and optimization, you're effectively trying to understand whether some type of change is performing better or
(02:38):
worse than the original condition of whatever you're looking at. So when we weigh the two different elements between testing and optimization, we think about...
Anthony Karls (02:49):
Yeah. Let's draw up a specific use case so that we can create some clarity on the difference here.
James Patterson (02:59):
Yeah. So for an optimization, an example might be something that's a little lower in terms of degree of change. If you think about a search campaign, maybe you're tweaking one headline in a search ad and testing that against what you've had in market for a previous period of time. Or maybe something on the landing page: you're changing
(03:19):
just the color of a button, right? So it's these more minor variances from the original version that you're evaluating. They're still meaningful things to look at, and over time they can help you build up momentum with these lower-impact but still beneficial learnings, to ultimately get you to a bigger
(03:41):
improvement long term by doing this on a consistent basis.
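To put numbers on that kind of small tweak, here is a minimal sketch in Python of comparing one tweaked headline against the control ad on click-through rate. The figures are made up for illustration; nothing here is from the episode.

```python
# Minimal sketch: compare click-through rate (CTR) for a control ad
# against a variant with one tweaked headline. All numbers are made up.

def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate as a fraction of impressions."""
    return clicks / impressions

control = {"impressions": 12_000, "clicks": 420}   # original headline
variant = {"impressions": 11_800, "clicks": 472}   # tweaked headline

ctr_control = ctr(control["clicks"], control["impressions"])
ctr_variant = ctr(variant["clicks"], variant["impressions"])
lift = (ctr_variant - ctr_control) / ctr_control

print(f"Control CTR: {ctr_control:.2%}")   # 3.50%
print(f"Variant CTR: {ctr_variant:.2%}")   # 4.00%
print(f"Relative lift: {lift:+.1%}")       # +14.3%
```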
Anthony Karls (03:45):
Yeah. So in our last podcast, we talked about buying data. Is this an example of what we mean by that, with testing and optimization? How does that play in? Talk a little bit more about that.
James Patterson (03:57):
Yep, exactly. Like we talked about in our last podcast, this falls right into buying data. We're putting part of our dollars, our marketing spend, toward trying to understand: can we get some increase in whatever performance metric we're looking at? That's why it's also important, between testing
(04:19):
and optimization, to really understand what metrics you want to look at. We've talked a lot on the podcast about evaluating the waterfall and things like that. Generally speaking, on the optimization side, since those are smaller tests, you're going to be a little more focused on some of the platform metrics, or things like Microsoft Clarity if it's a landing page test. But that's ultimately what's going to build you up
(04:41):
toward an end. And then hopefully, over time, as you're reviewing things with more impactful business metrics, like your waterfall, you're able to draw a connection between the improvements you've made and the numbers improving in your business data as well.
Anthony Karls (04:55):
Got it. So the concept here is: tests are large changes to a campaign, or potentially an ad or a landing page or something. And optimizations are small tweaks. Maybe we're going to tweak the headline a little bit, or make a small adjustment on the landing page, like changing the color of a button.
(05:19):
So when you think about how to do this in the paid media landscape, what percentage of the time are you running tests versus what percentage of the time are you running optimizations? And how much should we think about allocation, in terms of percent of traffic or percent of budget? What's a good way to think about this so that we're
(05:39):
not shooting ourselves in the foot and doing unnecessary damage to ourselves?
James Patterson (05:45):
Yeah, so basically the maturity of your marketing program is going to influence whether you're on the optimization side of things or the testing side of things more frequently. A more mature marketing program, or a business that's been in market for a while, where you're hitting your revenue goals pretty consistently and we have a proven process where we can look back at your
(06:06):
financials and see that the marketing strategies are working to help achieve the business goals, is probably going to fall more into the optimization bucket. It's basically not jolting the boat that's moving smoothly across the ocean toward the goals we want to achieve. Whereas a business that's a little newer, and maybe hasn't invested as many resources and as much
(06:27):
time into looking at the whole marketing strategy, is definitely going to be more in the testing bucket, truly trying to understand whether these more significant changes work. Maybe it's: we've never used a landing page before, we've just been driving traffic to the regular old website. Does driving traffic to this landing page, for the first time ever,
(06:49):
drive significant improvement to the results we're tracking against it? So that's the way to think about it. Both kinds of businesses, mature ones and those more in the startup and growing phase, are going to do a combination of optimization and testing at different times. But the greater the degree of maturity your business has, probably the fewer of these massive tests you're going to be doing on a recurring basis, and the more it's going to be those slight tweaks again.
(07:11):
It's kind of like...
Anthony Karls (07:12):
When you say maturity, is that more about achieving industry benchmarks for top performance? Or are you talking about, I've been in business for 10 years, so I'm a mature business because we've been running for a while? Talk a little bit about that, because it sounds like it's the former, and it may be situational: maybe we've only run on Google, but we
(07:34):
haven't run on Facebook. So would we be testing on Facebook and optimizing on Google, or is it one versus the other? Talk a little bit about that.
James Patterson (07:43):
Yeah, good distinction for sure. Just because you've been a business for 10 years doesn't necessarily mean you're at a point where you'll be mostly in this optimization phase. It really comes down to, like we talked about in our earlier podcasts on building the waterfall, understanding how our marketing program is going to influence our business metrics and KPIs. It's really looking at it from the
(08:05):
perspective of: how long have we had these platforms active? If we've had them active for a number of years, that's generally going to fall more into the optimization category, versus, to your example, Tony, if we just started our Facebook ads account last week. That's more what I'm talking about on the maturity front. Just because you've been in business
(08:27):
for a while, or have been doing different degrees of marketing, doesn't necessarily mean we've got it all figured out. So it's going to depend a little bit that way. Generally speaking, you get to the phase where you're mostly doing the optimization side of things when your vendor, or your in-house team, whoever is overseeing
(08:47):
your marketing strategy, is getting to the point where the conversations are about monitoring: we're evaluating the performance on a particular channel, and we're giving good feedback that we're hitting the benchmarks we've set, and all these types of things. That's generally a good indicator that you're in this more mature period, where you're looking for
(09:09):
the little things that continue to move the needle just a bit further up.
Anthony Karls (09:15):
So if a business that's been in a mature place introduces a new offer on a channel, how would we think about that?
James Patterson (09:25):
Yeah. If a business introduces a new offer, we would definitely consider that to be in the testing bucket. That's a pretty large shift in potential value to your customer, and obviously a key component of your marketing strategy and driving new business. So that's something we would definitely put into the testing bucket, and you'd want to look at it that way. The
(09:46):
difference from optimization to testing is really the length of time over which you're looking at it, and the amount of resources and effort going into the test: making sure you've set down your goals for it and how you're going to measure it, and then ultimately closing the loop and making a determination from there. The best form of this is, you go through a larger test, you set your
(10:07):
goals, and you evaluate over a span of time, usually at least four weeks, if not longer. A lot of times, when you determine the test was successful, it will actually then inspire other, smaller optimization tests. Right? So it's like: we know this offer really works, but have we tried the offer in our email program? Have we tried the offer in social?
(10:28):
So maybe it's just tweaking some of the strategy there to test out new messaging around that offer on different platforms.
Anthony Karls (10:34):
Yeah. One of the things I've used in the past is understanding what the industry benchmarks are for a related offer in a similar industry, and how far off my metrics are. If I'm achieving a 20 percent call rate on a landing page, I'm probably in an optimization place, for law
(10:55):
firms specifically. If I'm at like 5 percent, I probably should do more tests on how that page is laid out, how it's designed, what my messaging is, and all that. Because measured against industry-excellent standards, there's quite a big delta and a missed opportunity there. So that's another way to think about it.
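Here is a minimal sketch of that benchmark-gap check in Python, using the 20 percent benchmark and 5 percent observed rate from the example above. The "how far below benchmark is too far" threshold is an illustrative assumption, not a rule from the episode.

```python
# Minimal sketch: decide "test" vs "optimize" by comparing an observed
# landing-page call rate against an industry benchmark. The 0.5 threshold
# (more than 50% below benchmark) is an illustrative assumption.

def recommend_mode(observed_rate: float, benchmark_rate: float,
                   threshold: float = 0.5) -> str:
    """Suggest big-swing testing when far below benchmark, else optimizing."""
    gap = (benchmark_rate - observed_rate) / benchmark_rate
    return "test (big changes)" if gap > threshold else "optimize (small tweaks)"

benchmark = 0.20  # ~20% call rate cited for law-firm landing pages

print(recommend_mode(observed_rate=0.05, benchmark_rate=benchmark))  # test (big changes)
print(recommend_mode(observed_rate=0.18, benchmark_rate=benchmark))  # optimize (small tweaks)
```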
(11:17):
Again, we talked about a new channel, and a new offer that we're going to go to market with. What else? What are some other examples of when we would want to tweak this?
James Patterson (11:29):
Yeah. To give some examples of the difference between the two: one of the law firms we work with has really built up their marketing program over time. Their paid search strategy is in a really great spot; quite frankly, most of the time we're having conversations about whether we should even leave it on or not, because lead flow is so great. Usually that's because of something
(11:52):
in the sales process that they're working out, and stuff like that. From a channel perspective, we're driving really great lead flow and we're very happy with the program, so this is a channel we're looking at for optimization. One of the things the teams are doing right now is looking for slight tweaks in the ad creative, opportunities to maybe drive a little more click-through rate and ultimately bring down
(12:13):
CPCs. Again, that's where, as I said, we always want to focus on the waterfall reporting to evaluate marketing holistically. But when we're talking about these smaller tests, these optimizations, you're generally going to pick out some of the more upper-funnel metrics, like click-through rate, however you want to go through it.
(12:34):
Whereas a different law firm we're working with is basically net new to a lot of these platforms and hasn't looked at anything. We're actually working with them on the example I mentioned previously: they've been driving traffic right to the site all along. And honestly, from their perspective, they're like, yeah, we feel like our paid search program is going
(12:54):
okay; we're looking for you to go in and find ways to improve it. It's something they're really looking forward to. So we've gone through a whole conversation about what we want the landing page to look like, and what the different elements on there are that will play into our strategy to ultimately influence conversion rate and get folks to
(13:16):
take their concerns or questions and reach out to their intake team, get an eval and consultation, and move through the funnel. So this is a much larger program for them, going from just sending traffic to the regular website to building out a whole new place for their customers to potentially see as their first interaction with the brand. It takes a lot more work, from thinking through hero images to the structure of the page,
(13:38):
even button colors, obviously matching those to the brand look and so on. Ultimately, those are two examples: one law firm that's a bit net new to their marketing program, and one that's high flying. That's probably the best illustration of the difference between the two types of tests that may exist.
Anthony Karls (14:01):
Awesome. So, big picture: when we're in paid media, there are always opportunities to learn. One of the ways we can buy data is through intentionally testing or intentionally optimizing, and making sure we're balanced there. I guess one more question before we wrap up. What's the risk of doing tests?
(14:23):
Because they don't always win.
James Patterson (14:26):
Yeah, so it goes back to buying data, right? It always sucks when you have this great idea: I think this ad creative is going to absolutely smash, we have some data to maybe support why we want to test it, yada yada. Well, sometimes it doesn't work. I think the most important thing to fall back on, again, is that buying-data concept. By
(14:49):
learning from this, even though it didn't go the way we wanted, we can now be more confident in our current strategy and look at what's next. If we're doing our jobs correctly, we're taking note of this test and why we feel it failed, and doing a wrap-up on it. That can really help you take what isn't always the
(15:09):
most fun thing in the world, saying, client, this didn't work out the way we initially expected, and turn it into a more positive conversation moving forward. As for the cons with testing: like I said, it's really crucial that you go into a test and identify up front what the timeline is
(15:29):
going to be, and what the goal is. I think that's where these things often go wrong. Sometimes you'll be having interactions with the clients, or your vendors, and they'll be saying, oh, one week in, we're smashing it, and looking at all these different things. And then by the end, maybe it's not looking that way anymore. It's good to keep a pulse on things, but it's also important that, going into it, we
(15:51):
agreed that the end is when we're going to evaluate it. So even though the start was strong, we have to stay true to the fact that we said: when we've compiled all this data over the course of the next four weeks, that's when we're going to make our final decision. So really the con is that there's a lot of discipline that needs to be ingrained in your
(16:11):
team when you're going through this, to actually do what you said you were going to do. Because I think all humans, if we're being honest, always want our cool ideas to work out, and that's just not always the case. So I think that's the biggest thing. And not testing too much, too; that's the flip side. I think everybody's worked with a person like this, where it's, I want to test nine different things at
(16:32):
once. Or you have this test going, and then an element of that test going in a different area, and that can effectively ruin the results, because now you've blended it with different things. So again, discipline is probably one of the most important concepts in testing. It doesn't always come across when you're going
(16:55):
through the wrap-up phase, but you really need it to make sure your testing program is successful.
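To make that "no verdict before the agreed window closes" discipline concrete, here is a minimal sketch in Python using a standard two-proportion z-test. The test itself is not something prescribed in the episode, just one common way to judge a conversion-rate difference; the dates, counts, and significance level are all illustrative.

```python
# Minimal sketch: evaluate an A/B test only after the agreed window ends,
# using a standard two-proportion z-test. Numbers are illustrative.
import math
from datetime import date

TEST_START = date(2024, 1, 1)   # hypothetical agreed start date
WINDOW_DAYS = 28                # the "at least four weeks" window

def z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for a difference in conversion rates."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_a / n_a - conv_b / n_b) / se
    # Two-sided p-value from the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def verdict(today: date, conv_a: int, n_a: int, conv_b: int, n_b: int,
            alpha: float = 0.05) -> str:
    """Refuse to call the test before the pre-agreed window has elapsed."""
    if (today - TEST_START).days < WINDOW_DAYS:
        return "Window not over yet - keep a pulse, but no verdict."
    p = z_test(conv_a, n_a, conv_b, n_b)
    return f"p = {p:.3f}: " + ("significant" if p < alpha else "no clear winner")

# One week in, even a hot start gets no verdict:
print(verdict(date(2024, 1, 8), conv_a=60, n_a=500, conv_b=35, n_b=500))
# After four weeks, the pre-agreed evaluation happens:
print(verdict(date(2024, 1, 29), conv_a=220, n_a=2000, conv_b=168, n_b=2000))
```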
Anthony Karls (17:01):
Awesome. Cool. So to wrap up: when we're thinking about buying data in the marketplace, obviously we can buy data and understand what's working and what's not. Drilling down deeper into that, some methodologies we use are running tests and running optimizations. Tests are big changes.
(17:23):
When we do those, we're typically isolating them, so we're not throwing everything at a test; we're running a split test, splitting the traffic so we know we're not putting the business at risk. And then there are optimizations, which are smaller changes. Again, we're going to run a split test on those, and then we're going to look at our results over time. We want to be disciplined. We want to be accurate about the data and how we feel about it.
(17:46):
Because there have been plenty of times in my career, and I'm sure in yours as well, where we don't like the results of the test. Even if the test won, we hate the design, or we hate whatever it looks like. And it's like, why do users like this? I don't know. It doesn't matter. Data wins, not my feelings.
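On "splitting the traffic so we're not putting the business at risk," a common way to do that is a deterministic, hash-based split, so each visitor always sees the same variant and only a small slice sees the new treatment. Here is a minimal sketch; the 10 percent share is an illustrative assumption, not a figure from the episode.

```python
# Minimal sketch: deterministic traffic split for a test. Hashing the
# visitor ID keeps assignment stable across visits; the 10% slice for
# the new variant is an illustrative choice to limit business risk.
import hashlib

def assign_variant(visitor_id: str, variant_share: float = 0.10) -> str:
    """Return 'variant' for roughly variant_share of visitors, else 'control'."""
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # stable value in [0, 1]
    return "variant" if bucket < variant_share else "control"

# The same visitor always lands in the same bucket:
print(assign_variant("visitor-123"))
print(assign_variant("visitor-123"))
```

Because the assignment is derived from the visitor ID rather than stored state, the split stays stable across sessions without a database.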
So, anything else to add before we wrap up here?
James Patterson (18:09):
No, I think it was a great discussion. Yeah, my closing comment really is just that when you're going into testing, make sure you're going through the process of identifying how you want to test it and for how long. Keep it clean. Be disciplined; fight the urge to test too many things, or to have tests overlap. Really be intentional with how you're setting it up, and
(18:31):
ultimately expect that a lot of times your tests may not pan out. If you're doing the right things and wrapping up correctly, it should ultimately help lead you to the next test in line, where, fingers crossed, maybe that's the time your golden idea comes across and ends up being a big winner for you.
Anthony Karls (18:47):
Awesome.
Well, thanks, James.
Appreciate it, sir.
James Patterson (18:50):
Yep.
Thanks.