
September 19, 2025 45 mins

Today on the show we have Casey Hill, CMO of DoWhatWorks, a patented growth experiment tracking engine that reveals which website changes actually drive results. Casey brings experience from ActiveCampaign, his work as a Stanford instructor and advisor, and years of research into how leading companies like Slack, Shopify, and Asana run experiments. 

In this episode, Casey breaks down why most A/B tests fail and how to focus on the few elements that truly move the needle. We explore why two CTAs often outperform one, why customer logo bars underdeliver, and why expectation-to-reality alignment is the hidden driver of both conversions and retention. 

Casey also shares how DoWhatWorks blends large-scale data with human research to surface reliable best practices, why expansion revenue has become its biggest growth lever, and how enterprise clients are tackling churn by setting clear expectations from day one. 

We also discuss how onboarding experiments reduce early churn, why traffic sources should shape your CTA strategy, and why simplicity always wins on pricing pages.

As usual, I'm excited to hear what you think of this episode, and if you have any feedback, I would love to hear from you. You can email me directly at andrew@churn.fm. Don't forget to follow us on X.


Churn FM is sponsored by Vitally, the all-in-one Customer Success Platform.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
SPEAKER_02 (00:00):
In your hero section, should you have one call to action or should you have two calls to action?
We can look at thousands and thousands of companies that are testing this exact phenomenon, one versus two.
And then we can assign this level of confidence, this level of certainty, in whether you should have one or two.
And in most cases, the answer is you should have two, by the way.

SPEAKER_01 (00:23):
This is

SPEAKER_03 (00:29):
Churn.FM.
The podcast for subscription economy pros.
Each week, we hear how the world's fastest-growing companies are tackling churn and using retention to fuel

SPEAKER_01 (00:42):
their growth.

SPEAKER_00 (00:45):
Strategies,

SPEAKER_03 (00:55):
tactics, and ideas brought together to help your business thrive in the subscription economy.
I'm your host, Andrew Michael, and here's today's episode.
Hey, Casey, welcome to the show.
Yeah, thanks

SPEAKER_02 (01:08):
for having me back, Andrew.
Glad to be here.

SPEAKER_03 (01:11):
It's great to have you.
For the listeners, Casey is the CMO of Do What Works, a patented growth experiment tracking engine that helps you get more impact faster.
Prior to Do What Works, Casey was the senior growth marketing manager at ActiveCampaign.
He's also a Stanford instructor and advisor, and was a previous guest of the show on episode 235, where we discussed the life

(01:31):
cycle of loyalty and tackling churn-critical stages in the user journey.
My first question for you today, Casey, is: what works?

SPEAKER_02 (01:41):
What works is a broad question.
I think that when we're approaching conversion rate optimization, usually my best guidance is: start by being very selective about the pages that you focus on at all.
Websites, especially for larger B2B SaaS, are typically really broad.

(02:01):
There are tons, or thousands, of small tweaks you can make.
So you typically want to zoom in on the things that are most tied to revenue.
Often it's your homepage, and from a paid standpoint, any landing pages you're driving substantial traffic to.
And then typically the higher up the page, right?
The above-the-fold, the hero sections, are going to be the most impactful, and there's going to be a kind of diminishing return as you

(02:24):
go down.
So it's a very broad, top-level thing, and we can definitely get into the nuance of some of the things where we've seen both the highest certainty and the highest impact.
Because I think, by the way, both of those are really important.
So one thing we'll look at is: how certain are we that a certain thing works on a website?

(02:45):
So for example, in your hero section, should you have one call to action or should you have two calls to action?
We can look at thousands and thousands of companies that are testing this exact phenomenon, one versus two.
And then we can assign this level of confidence, this level of certainty, in whether you should have one or two.
And in most cases, the answer is you should have two, by the way.

(03:07):
We could talk about that. So that's one thing.
But the other question is degree of impact.
So CTAs on your homepage, right at the top, are things that are very tied to your revenue journey.
So that is typically going to have a very high degree of impact.
So we can have a high degree of certainty, a high degree of impact.
There might be other things where we see tons of people test; let's say customer logo bars are an interesting example.

(03:29):
And we see that customer logo bars, maybe surprisingly for some of our listeners here, actually don't perform well.
For the majority of companies that test having or removing their customer logo bar, removing it wins.
And there's a lot of nuance, and we can talk about companies that are doing it better or worse.
But we also find that it doesn't have a massive degree of impact, meaning that if you have it or if you don't have it, it likely

(03:50):
is not going to have a huge impact on your conversions.
So either move it down the page, or remove it, or do what Clay or Hex does by adding links to case studies to it.
But if you mess it up, it probably has less impact than your headers, than your CTA buttons, than other things that might come into play.

SPEAKER_03 (04:08):
Yeah, very interesting.
I think we'll dive into all those details.
I think it'll be good though, just for the listeners, to give us a little bit of an overview of what Do What Works does. Obviously, I think you've probably gathered by now, but I think it'll be great to have a little bit of context as well around that.

SPEAKER_02 (04:22):
Yeah, yeah, that's where we should start.
So Do What Works essentially has an algorithm that allows us to detect A/B tests done by any company on the web.
So it's actually public data, so we can gather it, we track it, and we see: hey, this website, let's say Asana or Slack or Klaviyo or Shopify, is running an A/B test, and they're testing this versus this.

(04:43):
We detect it, and then we have a human research team that hops in.
They start to track that test.
They add tags and validate what is being tested.
And ultimately, they just look at what version the brand keeps.
So they say: okay, Shopify tested A versus B. We see that they ultimately kept B.
We have a whole bunch of things we look at, like they have to keep it for at least three months.

(05:04):
And there are all these other details.
But just to keep it simple: we detect what is being tested. We then track it, and then we add it to a database.
And what it allows us to do is, at scale, someone can come in and say: I'm a B2B SaaS company, and I'm trying to test something on my pricing page.
They can even put in exactly what they're trying to test, like: I'm trying to test my CTAs, I'm trying to test my plan tiers.
And then you can look at thousands of tests and see

(05:28):
essentially what is working in that space.
And so we're kind of trying to redesign how most people approach A/B testing: get people to run fewer tests, but get them to do it more efficiently.
Optimizely, a company that does A/B testing, published a report saying only 11% of A/B tests beat the control.
So that's a huge problem, right?

(05:48):
People are pumping a ton of money, a ton of resources, into stuff that doesn't move the needle.
And I think anyone who's been in conversion rate optimization or worked on websites has had this experience: running tests that aren't statistically significant, or you see a spike, but then it comes down.
And so we're trying to help solve that problem.

SPEAKER_03 (06:06):
Yeah, the peeking problem: when you start looking too early and you see the spike, get excited, and then things turn on you.
Yeah, I definitely think in terms of experimentation, one of the biggest challenges is understanding when you can and can't do experimentation.
And especially, as you say, at an early-stage startup, you typically don't really have the data to be able to get any level of statistical significance.
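As a back-of-the-envelope illustration of that data problem, the sketch below uses the common rule-of-thumb sample size n ≈ 16·p(1−p)/δ² per variant (two-sided α = 0.05, power = 0.8). The 3% baseline conversion rate and 10% relative lift are hypothetical numbers chosen for the example, not figures from the episode.

```python
def visitors_per_variant(base_rate, relative_lift):
    # Rule-of-thumb sample size for a two-variant A/B test
    # (two-sided alpha = 0.05, power = 0.8): n ~= 16 * p(1-p) / delta^2
    delta = base_rate * relative_lift  # absolute difference to detect
    return round(16 * base_rate * (1 - base_rate) / delta**2)

# Hypothetical: 3% baseline conversion, hoping to detect a 10% relative lift
n = visitors_per_variant(0.03, 0.10)
print(n)  # roughly 52,000 visitors per variant -- out of reach for many early-stage sites
```

Numbers like this are why a low-traffic startup can run a test for months and still have nothing statistically significant to show for it.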

(06:27):
So a lot of people end up wasting a lot of time.
And then I think it also takes a good level of sophistication for teams to be able to run a good experimentation program.
I think Optimizely themselves sort of realized that there just weren't that many teams at that level of sophistication to build the business they thought they had. And so I think it's a very interesting perspective, seeing how

(06:47):
you go about doing this. But I'm keen to understand as well: you mentioned, let's say, SaaS and a pricing page, and you gave a couple of examples. Are you focusing on any specific verticals in general? Because from an e-commerce perspective, there are obviously way more e-commerce stores, let's say, than there are SaaS businesses. So how are you approaching this? That's the first question. And then the second question I have is, at least from my experience

(07:08):
running A/B tests across multiple different companies, you can sometimes run the same experiment in two different companies and get two totally different results. And so when you're giving this advice, is there sort of a level of probability of success, I guess? Because it's not always going to be the case for every company. Like you say, the logos may work for one specific industry, one specific case, or

(07:29):
even one specific company, just the way they do it. So keen to hear about those two things.

SPEAKER_02 (07:35):
Yeah, yeah, for sure.
So we work across a lot of different industries, and obviously not every industry out there, but I think we have more than 35 different tracked industries.
So obviously e-commerce, obviously B2B SaaS, finance, fintech, banking, direct-to-consumer stuff.
So we do function across a lot of those.
In terms of what works for one company, will it work for another company? There are a couple of layers to that.

(07:56):
First off, there are obviously never any guarantees.
So a hundred percent, we're always going to be looking at statistics.
I think one of the advantages of what we're doing, though: we can allow you to see a competitor.
Someone comes to us, they say, I want to see my two biggest competitors.
We can show you your two biggest competitors.
We can show you what they're testing.
But really, what's valuable is the data at scale, right?
So when you're trying to understand a phenomenon and

(08:18):
you're like, hey, is this going to work for me?
If we see 850 B2B SaaS brands test a basic logo bar right below the hero, and we see 88% of those tests lose, that's probably a pretty meaningful signal.
Maybe you're in a different situation.
Maybe you're the case where you're in that 12%.
And so, yes, we won't know.

(08:39):
But I think when we look at the amount of data that we have across a very, very similar application... So A/B testing is all contextual.
The size of the company matters, the vertical matters, the specific type of customers matters.
So that's one of the beauties of our system: you apply all those filters.
We have this very advanced dashboard, and you're looking at the exact element, the exact page type, the exact

(09:00):
vertical.
So you put in all those filters, or you can even say: these five companies are most similar to mine, and you can look at those five companies.
But I think really, you want to look at the amount of data.
And then we've developed an algorithm with some professors at Johns Hopkins University, called bet scores.
And bet scores specifically are a probability type of

(09:22):
assessment, right?
Like, our degree of certainty is based on the tests: was it a multivariable test or a single variable (obviously, single-variable tests are weighted much higher), geography, all these different variables. We compile them all.
And then we give people an indication of how confident we are in that specific phenomenon, based on the amount of data we

(09:42):
have.
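As a rough sketch of why an aggregate signal like the logo-bar example carries weight, the snippet below computes an exact binomial tail probability: how often you would see a split that lopsided if keeping or removing the element were really a coin flip. This only illustrates the general statistical idea; it is not DoWhatWorks' patented bet-score algorithm, and the 850/88% figures echo the hypothetical numbers from the conversation.

```python
from math import comb

def binom_tail(n, k, p=0.5):
    # P(X >= k) for X ~ Binomial(n, p): the chance of at least k "losses"
    # if the element truly made no difference (a 50/50 coin flip)
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n_tests = 850                     # brands testing the logo bar (episode's hypothetical)
n_losses = round(0.88 * n_tests)  # 88% of those tests lose -> 748
p_value = binom_tail(n_tests, n_losses)
print(f"{n_losses}/{n_tests} losses; probability of that by chance: {p_value:.1e}")
```

The tail probability is vanishingly small, which is the intuition behind treating a consistent pattern across hundreds of independent tests as a high-certainty signal even though any single company's result can still land in the minority.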

SPEAKER_03 (09:43):
Interesting.
And from your customers, are you injecting any of their first-party data as well for the system to leverage? Or is it purely scanning the web and seeing what they're doing?
Because one question then is: how are you measuring results?
Is it purely about which one gets selected, and it's binary, this one was better than the other? Or is there some sort of way that you're also detecting

(10:06):
the impact that these experiments have?

SPEAKER_02 (10:09):
Yeah, for sure.
Great question.
So first off, we don't cookie, we don't pixel; we don't do anything back-end on any of this.
We're only using public data.
So because we're only using public data, this is purely based on what version they keep.
It's black or white. It's binary.
What version do they keep? What version do they not?
That being said, we work with a lot of the largest organizations in the world, right?
Like, we're working with Fortune 100, Fortune 500

(10:31):
companies.
So they have huge amounts of first-hand data.
So that becomes part of a collaborative back and forth.
We are a SaaS product; we have a platform, but we also have this research team, and we have a pretty expansive services arm.
So those teams are working dynamically with clients. If we're working with a certain client, they're giving us feedback.
They're like: hey, you told us you had a really high confidence

(10:52):
interval on this, this, and this test.
This is what our first-party data says.
And we're working collaboratively with them to get them the best results.
So we absolutely can use first-party data.
Some people, at the most basic level, will just say: I just want to track competitors, show me what they're doing.
Cool. We'll do that.
But especially for all of those large, established clients who have built out conversion rate optimization programs, it's 100%

(11:15):
a collaborative effort with their first-party data as well.

SPEAKER_03 (11:18):
Yeah, because I mean, that could also get very interesting afterwards, if it's not only the scraped data that you're analyzing, but also your customers connecting their optimizer or A/B testing framework into Do What Works.
And then you actually have their results across the board.
I think at some point, at scale, that would be super interesting.

(11:38):
Because then you also have both sides of the coin.

SPEAKER_02 (11:40):
100%, and these partners are vital.
So we always tell people: we do not compete with Amplitude or Optimizely or Webflow or Adobe Target.
They're our partners, because we're just giving you data and recommendations.
You need to then go execute, actually run your tests, and validate those things.
So we work hand in hand with those platforms.
We know those teams really well.
We run webinars with them.

(12:00):
We co-promote with them.
We share insights with their audiences.
So those are all partners of ours.

SPEAKER_03 (12:06):
Amazing.
And so, talking about experimentation, we've mentioned the word CRO a few times as well.
And I think generally, when we think about experimentation, it's generally around: how do we increase conversion rates, how do we drive more sales, more subscriptions. There is obviously experimentation that can be done to improve retention, to improve engagement, activation, and so forth. And my assumption, I think

(12:27):
we discussed it before the show, is that probably the majority of the stuff you're seeing is top of funnel, increasing conversion. But are you seeing anything interesting from companies when it comes to experiments that your team believes will have an impact on retention and activation?

SPEAKER_02 (12:42):
Yeah, yeah, actually we see quite a few of those.
So one of the ones that comes to mind immediately is from Ramp, and actually another company called HockeyStack does the exact same thing.
They essentially have these blocks that set the exact expectation for the first 30 days.
They say: hey, day zero through seven, you need to get us these forms. You need to do these specific steps.

(13:03):
Then at day 15, you should be connected to your ERP, or, you know, blah, blah, blah.
And by doing that, by setting that expectation up front, I think teams are going to address that front-level churn, that zero-to-30, that zero-to-90-day churn, where people don't even get fully set up within the software.
They get turned off by some specific thing.

(13:23):
There's some disconnect.
There are some key stakeholders that aren't involved, and there's a mismatch there.
So I think it's very smart for teams to get ahead of that and to address that.
And we've seen this across companies. There's another company called Linear that has a very specific onboarding guide, but instead of an onboarding guide email, it's actually part of the website, and it has things like: send

(13:46):
this to your CFO, send this. It's very, very directed towards trying to tackle that front-end churn, that activation-layer churn. So I think that kind of stuff is really smart.
And I think, speaking more broadly, websites in general tend to do a decent job of establishing what they do, an okay, kind

(14:08):
of in-the-middle job of how, and often a very poor job of the why us, the space-specific why component.
So I think companies that are implementing things up front that help with that affinity, that identification, are going to get much better traffic.
They're going to pre-qualify out people that aren't a good fit.
Another example, and I know we're kind of spraying examples here,

(14:30):
but another example here is Y Combinator.
So Y Combinator is obviously one of the most well-known accelerators, incubators out there.
And they have things on their site that are dedicated to saying what they do, but also what they don't do, right? Like: we don't do this, we don't do this.
And that's all part of just clearly setting that expectation to reality.
And so I guess where I would encapsulate all of these things,

(14:53):
and it's probably the most important part of conversion rate optimization, but obviously it ties to churn here, is just that simple thing: expectation to reality.
Because, you know, we could keep going down the rabbit hole.
It's like CTA buttons.
There's a reason why very basic text that says see pricing, book a demo, start free trial performs better than text that says something like start

(15:17):
imagining or hit your quota, whatever type of language.
And the reason is because there's a much clearer expectation to reality when it says book a demo.
People understand what book a demo means.
It's very clear.
So I think you want to try to use that as a litmus test on your website.
And I think that will impact churn positively, as well as conversions.

SPEAKER_03 (15:36):
Yeah, I really like the examples you mentioned.
I just pulled up Ramp as well to take a look at this.
And it's very interesting. On the site they have a section; it's a couple of sections below the fold, but not too far down the page. There's not much there, and it says: new software shouldn't take a year to implement, here's what you can get done with Ramp in just 30 days. And then it has a timeline, so it's like, today, and then they

(15:57):
have three items on the checklist: get started, connect your ERP, upload your policy, issue yourself a card in one minute. And it's got day five, day 30, as you said. And at that point you're at: 100% of business spend moved over to Ramp, your intake-to-pay 8.5x more efficient, and book close 75% faster.
So I think that is definitely very interesting. I

(16:17):
think that's also a lot of what we think about in the customer success process: setting those expectations up front of what work is involved and what's needed. And to your point, I think as well, this may even have been an experiment where, when it gets run, you end up seeing conversion rates potentially even drop from visitor to sign-up. But ultimately what you're really doing is you're

(16:38):
pre-qualifying the right candidates to come through the pipeline and saving your team a lot of time and headache.
Because I think that's also one of the negative side effects of conversion rate optimization, I would say, especially when you're dealing with B2B, a high-ticket item: if you're filling the funnel with the wrong types of customers, you also end up having a whole lot of time that you end up wasting on trying to close

(17:01):
these deals, and then chasing them, and then supporting them.
And so setting the right expectations up front may seem, at first glance, to have a negative impact on the vanity metrics, but the real bottom line is where it counts, and that's where it makes an impact.

SPEAKER_02 (17:15):
Yeah, I think you've got to be aware of it.
And we look at a lot of things that hike up conversion rates. For example, embedded email capture hikes up conversion rates; shorter intake forms hike up conversion rates compared to longer ones.
But you do have to be very careful with that, for the exact reasons that you just articulated.
Like, yes, having embedded email capture does increase the total

(17:36):
number of submissions, kind of like progressive profiling in general.
But what ultimately matters is how much revenue comes in the door and how much revenue stays.
And so all of these changes need to be done under that lens.

SPEAKER_03 (17:48):
When you say embedded email capture, is that having the email address form on the landing page, in the hero, and not on the signup form?

SPEAKER_02 (17:55):
Exactly, yeah.
If you go to a website like Buffer, as an example, you'll see there's an email capture where they can type their email and it says: get started.
Or actually, I believe Rippling and a lot of those finance sites do the same thing, where they have an embedded email capture.
Even Ramp actually might; I'm not sure if I remember correctly. But yeah, any time you're saving that click, where there's not just a button

(18:18):
but you're actually adding your email in first.
We've seen tons of data on this, and we actually tested this back at ActiveCampaign ourselves, and a bunch of other cases of that.
It will increase the total number of submissions, but you do just need to be careful and make sure that it's ultimately netting out to more good traffic.

SPEAKER_03 (18:37):
Yeah.
And I think that's also the thing with experimentation: you need to understand the primary metrics you want to be tracking, and also have your secondary metrics that you're monitoring, to ensure that you keep your eye on the ultimate goal, which is new business and retention, as opposed to just increasing signups and increasing noise for the team. 100%. And so, nice. So, Do What Works obviously, it

(19:01):
definitely sounds interesting, what you're trying to accomplish. Now, you mentioned you have quite a few researchers working on the team; I'm keen to dive into a little bit about your business now. How is the team approaching churn and retention from the perspective of a business?
Like you mentioned, you're working with large Fortune 500 companies.

(19:21):
What does your process look like internally when it comes to churn and retention?

SPEAKER_02 (19:26):
Yeah, for sure.
So I think one of the first things that's super important for us when it comes to churn is having an expansion mindset off the bat.
People usually come in with a specific challenge or problem; they want intelligence to help with that problem.
But as people can probably imagine, if people come in

(19:48):
wanting to fix something on their homepage, or wanting to fix a specific landing page they're promoting, or their pricing page, you've got a whole website, right?
So there's a lot more ground for us to give you data, to give you tests, to do analysis, to do wireframes, et cetera, et cetera.
So as soon as we come in, our research team is very trained at trying to understand not only that specific problem, but

(20:08):
the larger apparatus of that team: what they're grappling with, what are your metrics, how are you measured. And then having expansion as part of that.
And we have an awesome group of folks, and an awesome head of our research team who's really, really good at training all the folks to have that eye and lens.
And as a byproduct, we drive a ton of revenue from expansion.

(20:31):
We actually have more revenue coming from expansion right now than we do from new revenue, which is kind of wild.
And so it's a super key part of our business.
And especially with very large organizations, there are dozens of departments, right, that you can expand into.
And so there's also this huge expansion potential if you're an enterprise-focused organization.

(20:51):
But I think the key thing is, from the onset, having that eye. You know, some people use the cliché: if you're not expanding, you're churning.
But very much embracing that from a functional, systematic standpoint: okay, we just signed this client; at month six, what are the things that would also benefit us?

(21:12):
And making sure that we're getting in there, we're positioning it, we're having those conversations in advance, has led to our team having a very, very strong net revenue retention.

SPEAKER_03 (21:23):
Where do you think that insight came from?
Because I think it's not common for a lot of startups to start thinking about expansion from the get-go.
It's typically something that comes year three, four, five, somewhere around there, where there's a little bit of saturation on the growth front, and it's: okay, now we need to sort of think about it. Because I think expansion, as you mentioned, is also a good counter to churn, on the other side

(21:46):
of it, for net revenue retention and for the health of the business as well.
So where do you think it originally came from? Why so early?

SPEAKER_02 (21:54):
Yeah, I think where it comes from is, and this is a transition for me too, because I come from the SMB world. I've spent the last 10 years working for SMB companies, and now I'm in the enterprise.
So the first thing that's stark when you walk into the enterprise is you can have a multimillion-dollar business that has 10 customers, right?
Because there are fewer customers, in general, that creates some stability, but it also creates a huge amount of

(22:16):
vulnerability, right?
If you have 10 customers and you lose two customers, it's huge.
And so because of that, I think off the bat, when I came and joined Do What Works, I realized the critical importance of keeping that core base, right?
Like, if we add one new customer every single month with six-figure-plus contracts, that's great.

(22:37):
But we also want to make sure that we're continuing to expand, and also retaining that existing base.
So I think part of it is a byproduct of this enterprise focus.
And I think if you are a company that is focused on larger brands, that's something that makes a ton of sense to embrace from the onset.
It's also, you know, again, this whole thing, I hate

(22:58):
to use the word flywheel because it's also so cliché, but when you're in the enterprise world, it's also a huge word-of-mouth game.
And when you're trying to get in with the largest companies in the world, having the ability to call up and say, like: hey, NFL (or insert large organization), can you talk to the MLB about us getting in on this?

(23:20):
That is hugely impactful.
And so I think it's about having this thing where you're rolling out the red carpet, and you're not only trying to sell services, but really going above and beyond, doing everything you can.
We had one customer who got hit really hard with the tariff issues.
And instead of doing the normal pause we've all heard of, like, hey, we're going to do this normal pause, we did kind of an active pause, where we're still

(23:41):
providing a bunch of services.
We're still saying: we're committed to you, we want this relationship to be a long-term thing, so we understand that you might have to take a quarter off. We're still going to provide you support. We're still going to provide you these services and these different things.
And guess what? That customer renewed two quarters later. After we did that, they renewed for another big contract once things had settled down.

(24:02):
So it just shows the value of investing in the customer, investing in the long-term relationship.
And that same customer, by the way, also referred us another really good client.
So it's creating the word-of-mouth engine, it's protecting that revenue. It's all interconnected.

SPEAKER_03 (24:16):
And then how is marketing supporting this expansion revenue? So obviously you have the researchers identifying opportunities. And you mentioned in the beginning as well that when a company signs up, you're also then thinking about month six: what's that going to look like, what are the additional use cases or ways they can use the product? Is there any way marketing is supporting this function, or is marketing really just focused purely on driving top of funnel?

SPEAKER_02 (24:40):
No, I think it's all interconnected.
So on the top of funnel, we very much have a pipeline.
Like, we run a newsletter.
I produce a lot of social; my social content on LinkedIn does 10 million-plus views per year.
Our newsletter is doing deep-dive analyses on certain targeted companies and industries.
But when I say it's interconnected: we look at and understand, what do our customers care about?

(25:03):
What do they value?
What are the new areas of exploration?
So for example, if a lot of our customers are really interested in AI positioning (you know, not super surprising, it's a big question), then on the marketing front we're maybe going to start producing a lot more of that collateral, especially if that's a land vector where someone's like: I'm going to be redoing my pricing, and I'm going to be using AI credits now for this new agentic model.

(25:26):
Okay, let's start producing a bunch of content on the marketing side.
We know that other customers are going to be interested in that as well, but it also serves our own internal team, to showcase that we have robust information and analysis.
And I think what I try to add on the marketing front: our research team, they're very much trained researchers.

(25:47):
So they tend to be very, very careful about making any extrapolations.
They'll just say: this is what the data says.
I, on the other hand, am a marketer, and I make extrapolations all the time.
I'm like: okay, we see this correlation, two CTAs seems to be consistently winning. Why?
And I'll think about that.
And I'll say: well, the reason, I think, is people come with different levels of intent.

(26:08):
If you only have the ability to do a demo, you're losing all these other people who maybe don't have that level of intent. I will give my input and analysis.
And so on the marketing front, I put out a lot of the why I think these things are true, based on that data. And I think that's also really appreciated by both new prospects and our customers, right? Because they can look at the data and they can obviously

(26:28):
make extrapolations themselves, but sometimes that layer of analysis is something that can add a lot of value on both ends.
So we definitely try to work sales, marketing, and customer success all in lockstep.

SPEAKER_03 (26:39):
Very nice, yeah.
Because it seems, with that sort of focus for the company, that there are a lot of ways marketing can add value to it.
And it's interesting to hear, because as I mentioned, there are not many companies that focus this early on expansion revenue.
Is there a customer success team over and above the researchers supporting these clients, or would you call the researchers, in a way, CS

(27:01):
reps to some degree as well?

SPEAKER_02 (27:04):
Yeah, they're kind of interspersed, right?
There are different delineations based on the exact titles and stuff, but in general, we have a research team.
The research team is providing the novel analysis, they're providing the services, they're analyzing all these tests and coming up with outcomes, and they're responding to specific problems.
So we do a lot of

(27:24):
what's called rapid analysis, which basically means someone comes in and says, we saw this news article, or we saw this thing, we want to run this fast change, and we need info in the next three days that tells us whether this is a smart move.
Especially the larger companies — they tend to move quickly.
And so our team then rallies and puts that together.
But it's the same folks that are then going through

(27:44):
the analysis with them and going through the post-test review.
There's a separate research team to that, which is doing the annotation of the actual test layer itself — the people who are tagging and noting the tests, declaring which variant won and why.
That's one layer.
And then there's the customer research team, I guess

(28:07):
you could call it, that's dealing with all of that kind of

SPEAKER_03 (28:09):
interaction.
Yeah, it's interesting that you have humans labeling all this data as well, and slowly, over time, the model that you end up producing will become smarter and smarter at detecting what's going to work and what's not.

SPEAKER_02 (28:22):
Yeah, and there's a huge amount — I don't want to underplay it, the technology itself is doing a lot of the identification and recognition — but it's critical for us to make sure there's a human checkpoint.
These decisions are highly impactful, and we want to make sure that the

(28:42):
human side of it is not lost.

SPEAKER_03 (28:44):
Yeah, for sure.
And I think especially when it comes to the experimentation side of things, there's a lot of subjectivity in it, and nuances that need to be addressed — and at this stage, people are probably still going to be better at that.
I have an interesting question for you to think about.
I'm sure you've been asked this as well, maybe a few times recently, but since you have access to all of this data,

(29:07):
let's imagine tomorrow you need to work on a totally new homepage — it could be, let's say, for DoWhatWorks or for any other startup.
What are the key ingredients you're absolutely including?
And what does the order of these ingredients look like?

SPEAKER_02 (29:22):
Yeah, for sure.
One thing I'll say, too: our company grew entirely from word of mouth.
When I walked in, our website had been done four and a half years ago and hadn't been changed.
We are very close — I'm on about month six on the job — we're very close to a full relaunch.
Our website is ugly.
It does not follow almost any of these best practices.
And that's because, as an enterprise-focused organization, we

(29:44):
just haven't had a marketer, haven't had a focus on that.
So yeah, that's one thing I'll let people know — come back in a month or two, or maybe when this airs, and we can judge; it'll be a lot nicer.
That being said, I think you want to start by thinking about your key pages — your pricing page, your homepage, or anywhere you're driving substantial traffic — and typically you want to start with

(30:06):
things that are revenue-tied in terms of focus areas.
So think about the image you're using at the top.
We know that in B2B SaaS, product images tend to perform well.
There's a reason people use those over stylized images, over cartoons, over videos, over a lot of other things: because people have tested it, and product images seem to perform
well.
We've talked about the CTA buttons; those seem to perform

(30:29):
well.
And there's a lot of nuance — things like, for example, reassurance text: when you have a trial, saying "no credit card required" is reassurance text, and that seems to perform well when people test having it versus removing it.
So I think you focus on that core language.
On the homepage and on the pricing page, so much of the best practices

(30:49):
are all about simplicity.
So we've seen a ton of testing on people putting anything above the plan tiers.
And the answer is: don't do it, except for maybe a header and a subhead — get straight into your plan tiers.
And then inside your plan tiers, we see simplicity work.
Having more than seven line items tends to perform worse.
Having jargon or things that are not easily digestible tends to perform worse.

(31:11):
Having, underneath the plan tiers, some sort of simple identity helps.
So if it says "for solopreneurs," as an example, that's something that can allow people to self-select into that plan.
Anything that gives people that shorthand to self-select seems to be valuable, right?
And so when it comes to those pricing pages, there's a huge

(31:31):
amount of bet scores — hundreds of bet scores we have — but many things tie back to simplicity, and process-level clarity, I think, is a very good thing for folks to think about.
And then as you're going through the page, there are so many different things, like grids versus carousels; there's a huge amount of that.
But again, if we're to go really simple, focus on your headlines,

(31:52):
focus on your headlines and your subheads, and make sure those are really clear.
What most companies do, which is a mistake, is make it all very buzzwordy, right?
It's all focused on claiming a specific benefit: you're going to increase revenue, you're going to increase whatever.
Really what you want to do is change it to capability-focused language.
So try to move away from benefit-focused language toward capability-

(32:14):
focused language.
Benefit language is like: you're going to increase revenue because we have all of these currency conversions or whatever.
But the capability there, very specifically, is: we can auto-translate a hundred different currencies on your page.
That is a specific capability being provided, and that is what the customer actually wants to see.

(32:36):
Benefit language is very much glossed over, and that's a huge mismatch right now on websites.
So if you're going through with that lens and asking, okay, based on listening to Casey, what are the quick things for me to do?
Go to those headlines, go to those subheaders, and make sure they're not too long — they shouldn't be more than 200 characters on the subhead, stuff like that.

(32:57):
But make sure they're simple, and make sure they're focused on capabilities.
Make sure that you have CTAs at each spot on the page.
This might seem weird to some folks, but we've looked at a lot of test data: you want to continually have CTA buttons, because you don't know which block might activate people.
So make sure those are placed throughout, and you're giving people that opportunity to engage.

(33:18):
And then — I know I mentioned this, but I'm just going to be a broken record and say it again, because it's probably the most important thing: expectation to reality.
What does someone think is going to happen when they click on a button?
Make sure every single button on your website has clear process-level clarity, and you will see click-through rates go up.
You will see less bouncing.
You'll see more quality folks coming through the door.

(33:40):
It is one of the single most impactful and important things you can do.
And, to be candid, it's also one of the most straightforward.
Redesigning your brand colors and your scheme and the transitions — those are complex.
What's not complex is going through and making sure every CTA that you have is very, very clear, right?
So it's low-hanging fruit for folks.

SPEAKER_03 (34:00):
Yeah, I think so much time in new web design gets focused on the actual UI and design side of things, as opposed to the UX and the copy.
One of the best lessons I learned was actually at Hotjar, from one of our content team at the time.
She said: you've been doing websites wrong all this time — you start with copy, not with design.
And in

(34:23):
the early days, you want to move fast, so you start doing wireframes, you start putting text in, and then you end up fitting your text into a design, as opposed to really focusing on having good, clear copy.
And as you said, with CTAs it's really critical that they're crystal clear.
You started out just focusing on the hero section and then the pricing page, and I wonder — as I think you alluded to

(34:44):
in the beginning — I noticed at Hotjar that something like 85 to 90% of people never really scrolled past the hero on websites, something ridiculous like that.
And so you really just have one message you can give to people at that point.
So yeah, it's definitely a very interesting

(35:05):
place to think about, to obsess over, if you want to put it that way.
So if you're going to run experiments, that's the place to start, and you could probably just focus on that and have huge uplifts without even approaching the rest of the page.

SPEAKER_02 (35:18):
A hundred percent.
And this is a huge thing that's important: where your traffic comes from really matters, especially when it comes to startups, which can be a little more monolithic — meaning, for some organizations, they have a big podcast and that's where all their people come from.
Well, if your people all come from the podcast, chances are they're going to have more context, right?

(35:41):
They've been tuning in, they've been listening, they follow you.
Versus a paid ad that just says, go do whatever.
It's good to understand what that level of understanding is, based on where your traffic comes from.
Because, going back to my one-versus-two-CTA example, it may be the case that if 95% of your

(36:02):
traffic comes from one source that's super high intent, maybe you should just have one CTA.
Maybe you should just say "book a demo," because people already know — they come, and they know.
Versus if you have very disparate sources, or if you're a person like me who drives a lot of traffic from social media — social media can be kind of a cold avenue, because someone sees, oh, Casey shared this test

(36:23):
from Clay that was really interesting.
They come, and they still don't even really know exactly what I do.
They just know it's an interesting test from Clay.
In those cases, it's probably very valuable to be able to say, hey, join our newsletter — hey, do this other, lower-lift thing.
They don't want to hop on a demo yet.
They're not ready.
So that's one thing I would think about for your organization: where does your traffic come from, and make sure your CTAs are paired to that.

SPEAKER_03 (36:44):
Yeah, absolutely.
I think understanding the traffic source is really, really important.
And as you say, if you know you have an audience that's really high intent, maybe two CTAs over one is not the best choice.
But again, I think that's also the beauty of what you do at DoWhatWorks: you're able to filter and see that, and understand what works in different use cases and spaces.

(37:06):
We're running up on time, so I want to make sure I ask you a question.
I've asked you this question before, I'm pretty sure, but I'm going to ask it again now in a different context.
What's one thing that you know today about churn and retention that you didn't know six months ago, before you joined DoWhatWorks?

SPEAKER_02 (37:21):
Yeah, I think the most notable change is probably
the expansion focus that we'vebeen expanding on and talking
about here.
I think that's been the biggestdifference working in an
enterprise organization andunderstanding how that
functionally is so different.
I mean, as a person who'sfollowed churn for a long time,
I could recite to you theaverage percentages like, oh, an

(37:41):
SMB, average churn might be three to four percent, but in enterprise it's, you know — but beyond those statistics, the actual tactical layer of having that very specific expansion focus, I think that is one large difference that I've adapted to since joining this organization.
I'll echo, though, just as a closing note, probably what

(38:02):
I talked about last time, which is the stages of churn, which I still believe is really important: thinking about churn in zero-to-30, zero-to-90, and 90-to-365 type windows, I think, helps you create plans that functionally deal with the actual core reasons why there is churn.
So I also think that is quite important.

SPEAKER_03 (38:24):
Yeah, absolutely.
And maybe the last question for today: you obviously, I'm assuming, get asked a lot of questions when it comes to A/B testing and experimentation.
What's one question that you wish more people would ask, but they don't?

SPEAKER_02 (38:41):
I think that's a good question.
Almost as a starting point — this might seem blasphemous coming from someone in my position — but it's probably: should you be A/B testing in the first place?
Right?
I mean, there are a lot of

SPEAKER_03 (38:53):
variables.

SPEAKER_02 (38:55):
Yeah.
Yeah.
I mean, that's the honestanswer.
That's the candid answer, whichis that as you talked about in
the very beginning, if you're avery small organization, if
you're a startup, if you don'thave statistically significant
traffic, I think all of thoseare very important.
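The "statistically significant traffic" caveat can be made concrete. As a rough sketch — hypothetical numbers, not DoWhatWorks methodology — a standard two-proportion z-test shows why small sites often can't conclude anything from a one-versus-two-CTA test:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates.
    Returns (z statistic, p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)      # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical: 5,000 visitors per variant; one CTA converts
# at 3.0% (150/5000), two CTAs at 3.6% (180/5000).
z, p = two_proportion_z(150, 5000, 180, 5000)
print(round(p, 3))  # ~0.09 -> a 20% relative lift is still not significant at 0.05
```

Even a sizable relative lift fails to clear p < 0.05 at this traffic level, which is the practical reason small teams may be better served by adopting well-tested patterns than by running their own underpowered tests.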
The other thing I want to underscore, too, is that I think there's this big misconception.
When I joined, I still remember I had this conversation with a

(39:18):
friend of mine, a guy I really like, but he was just like, I don't get this company at all.
You just want to copy people.
You're just going to be a company that copies your competition.
What's exciting about copying?
And I want to underscore this: I think there are differences between the things that should be done on websites technically, logistically, versus

(39:39):
creativity.
Creativity, to me, is your brand and your logos and the way you're presenting things and your copy and all those kinds of components.
But there are so many functional aspects that we're talking about.
Should you have one or two CTAs?
How should you talk about this information?
All of these types of things — going back to my own

(39:59):
personal experience, when I launched an e-commerce company, I remember spending a year — spending a year — with only a credit card button, no ability for people to pay in an alternative way.
And then having someone sit down with me and say, hey Casey, your conversion rates are really bad on your checkout page, because you're selling in European markets and these foreign

(40:19):
markets.
And in those markets, it's way better to have the ability to pay via PayPal, pay via Venmo, pay via Apple Pay.
And when I added that one small thing, it was literally a massive, binary, overnight hike in my conversions.
And I was hitting my head against the table.
I've done a post about how I left $200,000 on the table,

(40:42):
calculating my new conversion percentage across my existing traffic.
This is the type of thing we're trying to solve for people.
We're not trying to replace creativity.
If you're selling in Europe and you don't have that pay-with-PayPal button, maybe that's something you should be more aware of.
And, by the way, I don't know specifically how big PayPal buttons are in e-commerce in Europe anymore, but at the

(41:03):
time, that was the type of decision that was super impactful.
And that's really what we're trying to optimize for: those types of best practices.
It's not about replacing people's creativity.
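Casey's "left $200,000 on the table" back-of-the-envelope math can be reproduced: forgone revenue is just the conversion lift times traffic times average order value over the period. The numbers below are hypothetical, not the actual figures from his post:

```python
def forgone_revenue(monthly_visitors, conv_before, conv_after,
                    avg_order_value, months):
    """Revenue missed by running at the lower conversion rate."""
    extra_orders = monthly_visitors * (conv_after - conv_before) * months
    return extra_orders * avg_order_value

# Hypothetical: 20,000 visitors/month; conversion goes from 1.5%
# to 2.2% after adding alternative payment buttons; $120 average
# order value; measured over 12 months.
missed = forgone_revenue(20_000, 0.015, 0.022, 120, 12)
print(f"${missed:,.0f}")  # roughly $200,000 over the year
```

The point of the arithmetic is that even a sub-one-point absolute lift on a checkout page compounds into six figures once it's multiplied across a year of existing traffic.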

SPEAKER_03 (41:13):
Absolutely.
Well, Casey, it's been an absolute pleasure having you on the show today.
Are there any final thoughts you want to leave the listeners with before we wrap up today?
I

SPEAKER_02 (41:24):
think we've honestly covered the core things.
So the last thing I'll say, again tying everything back: go to your website and look at your headlines, look at your buttons, look across the website, and ask, is there a match between expectation and reality?
That one thing you can do will have the biggest impact on your churn as well as your conversions.

SPEAKER_03 (41:43):
Absolutely.
And it definitely is, as we talked about a lot, one of the biggest areas: marketing oversells and makes a promise, the product underdelivers — setting the right expectations up front will save you a lot of pain down the funnel.
So, Casey, as always, an absolute pleasure.
I wish you the best of luck in this new journey, and I'm looking forward to seeing what that new website looks like, so we can

(42:05):
judge you when it comes out.

SPEAKER_02 (42:07):
I love it.
I look forward to it.
Thanks, Andrew.
Thanks.
Cheers.

SPEAKER_03 (42:17):
And that's a wrap for the show today with me, Andrew Michael.
I really hope you enjoyed it and were able to pull out something valuable for your business.
To keep up to date with Churn.FM and be notified about new episodes, blog posts, and more, subscribe to our mailing list by visiting churn.fm.
Also, don't forget to subscribe to our show on iTunes, Google

(42:40):
Play, or wherever you listen to your podcasts.
If you have any feedback, good or bad, I would love to hear from you.
You can provide your blunt, direct feedback by sending it to andrew@churn.fm.
Lastly, but most importantly, if you enjoyed this episode, please share it and leave a review, as it really helps get the word out

(43:00):
and grow the community.
Thanks again for listening.
See you again next week.