Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
This is the Unknown Secrets of Internet Marketing, your insider guide to the strategies top marketers use to crush the competition. Ready to unlock your business's full potential? Let's get started.
Speaker 2 (00:16):
Howdy. Welcome back to another fun-filled episode of the Unknown Secrets of Internet Marketing. I'm your host, Matt Bertram. Guys, there's so much going on in SEO and search. I feel like everybody just smashed Google and all the traffic just ran everywhere. I feel like it's just fanned out like you wouldn't believe,
(00:39):
and so I wanted to bring somebody on that I have bumped into a few times, at Brighton as well as at SEO Week. He's been an advertiser. He's got a next-gen tool. Ray, I didn't tell you this, but I just published an article on Search Engine Journal and I referenced you.
Speaker 1 (00:57):
Thanks very much.
Speaker 2 (00:58):
I just think what you're doing is great, so I wanted to bring you on for all the people listening and talk about how there's an old game that a lot of people are playing in SEO and there's a new game, and people really need to be playing the new game or you're going to get left behind. And you speak a lot. I would love to just kind of pick up what we were talking about at SEO Week, and kind of where the market's gone since
(01:20):
then. What's most topical for you?
Speaker 1 (01:23):
Yeah, thanks for having me. I'm really happy to be here. I've enjoyed our conversations at events a few times, and I think there's a lot to cover. So, obviously, AI is the big topic. Everyone wants to know: is my website dead? Is traffic gone forever? Is it all over? I hear that from a lot of teams.
(01:44):
Other teams are like, this is the opportunity of a lifetime, and we're going to make everything we can of it. And so I've found that there have been those two kinds of camps in the way people are approaching it. Obviously, I favor the latter. Any sort of change like this is going to happen one way or another, so the quicker you can adapt to it, the better off you're going to be in the long term.
Speaker 2 (02:10):
Now I'm curious about the two different kinds of teams that you're hearing about. Is it across the board, or is it certain industries? Because I know that media publishers have been hit pretty hard. I'm just curious if you have any more granular statistics.
Speaker 1 (02:21):
Publishers are in a very interesting place. I don't even know where to start with that one. Just for background, most of our clients tend to be in e-commerce, and also what we would call database-driven, direct-response-type things: travel search, real estate search, job search, all those sorts of kind
(02:41):
of transactional areas where a lot of programmatic SEO plays would have been relevant. And so for them, they're like, well, we still get traffic and AI is having a huge impact, so we're going to have to adapt and overcome here. But I know that there has been a lot of chaos kind of on the publisher side of things too.
Speaker 2 (03:01):
Well, you know, one of the big things that I ran across was that a lot of new clients coming to us were actually filtering out or blocking the bots of a lot of the LLMs, and I have seen that across the board. And so they say, I'm not ranking in LLMs, and I'm like, well,
(03:23):
you're blocking them, so they can't see what's going on on your site, you know. I even saw a staggering statistic. It was something like 26% of people are now using LLMs versus Google search. I don't know, what are you seeing in your data about user
(03:46):
flow?
Speaker 1 (03:47):
And.
Speaker 2 (03:48):
I think AI Overviews are just kind of stemming the tide of people going across the river, because LLMs are doing all the research for you, right, and people want that.
Speaker 1 (03:59):
Yeah, so we see a lot of interesting data. I would say that traffic from LLMs is still very much a drop in the bucket compared to traffic from quote-unquote traditional sources, but I don't think that necessarily means that people are not using them to a much greater degree than is revealed by the traffic
(04:24):
metrics. There's obviously the whole concept of zero-click, and there are multiple searches and ongoing conversations and entirely different types of behavior patterns emerging that just did not exist before. And so a lot of the analytics tools are kind of stuck in this traffic-based model, and I think that's one of the reasons why there is so much angst about a lot of this: how are we even going to assess the impact of this if all of our tools are just based on traffic? And so there's going to be an
(04:46):
entire new generation of tools that help brands focus on, you know, the bigger picture. Obviously it's part of what we're doing, and I think there are going to be a lot of other really good
Speaker 2 (04:56):
technologies too. Well, you know, Rand Fishkin said this basically at SEO Week, and I think he said this at Brighton too, he just kind of tightened up his deck: search traffic to websites is down, with zero-click search, like 58.5%, right? Yep.
(05:18):
And we're seeing drops with all our clients, right? Yep. Well, actually not all our clients; we have a couple of clients that are growing at an exponential pace, so they're kind of bucking the trend. But we're having to measure, well, here's your drop versus what the average is, right, and having to explain that that traffic metric is not revenue and your impressions are going
(05:40):
up.
I was showing our team earlier, we're about to do a quarterly presentation, and every number's up, okay? Even so, we're seeing an inverse correlation if you get them into the first position. So we had like 300 keywords in the top three positions in Google, based on Rank Math or whatever, and traffic's
(06:01):
actually down. And so we're seeing, as you get people up into the top spots, they're getting into the AI Overviews, they're getting into People Also Ask, and the zero clicks are happening, and so traffic's dropping. And that doesn't mean that the brand consideration's not there. Maybe that's happening 75% off-page, and then when someone goes
(06:24):
to the site, man, you better have that site ready to convert.
And I know you've done some stuff with UX in the past. What would be some actionable recommendations? Because even I've seen data that said, well, traditionally the LLMs, Perplexity, ChatGPT, and AI Overviews are pulling from Google as the corpus, right? So
(06:47):
good SEO is still pushing you up to the top, but you're seeing the traffic drop. So you're still in consideration, but you don't have visibility of that. And I know that a lot of what DemandSphere is working on, and others, is trying to come up with this next generation of tools. But what would you tell them if their traffic's just evaporating but they're doing
(07:10):
all the right things, and, well, their leads are down, or their traffic's down? What should they be thinking about? I'm talking about UX, I'm talking about, like, okay, we're going to add some heat mapping, some pop-ups, like exit pop-ups. We need to really make sure this page converts. You need to have all that
(07:30):
information on that page to help them make that decision, because of those precious people that are coming to the site. It's like a car lot, I'm using this analogy now. Basically, if someone comes to your website, they're like walking onto a car lot. They are ready to buy a car, so it's your job to sell them
(07:51):
or not, right? So it's a lot higher intent-based traffic. So I don't know if you wanted to add anything to anything I shared in that kind of framework.
Speaker 1 (07:57):
I think that touches on a lot of really good points. I think the last one that you mentioned, intent, is a really important thing, and also kind of reevaluating what gets measured. At the end of the day, you know, even conversions is an interesting one. What we're seeing, especially at the enterprise level, is
(08:18):
there is a renewed focus on what is called media mix modeling. There are different versions of that three-letter acronym, MMM, but basically what that is pushing people towards, both on the paid and organic side, is, instead of trying to look at all of these things that are metrics (but are they KPIs? maybe not), which would include things like
(08:40):
traffic, potentially even things like conversions, it's just a very stark look at revenue and what's contributing to revenue over time. It ends up being a return-on-spend model, and it basically says: we're not going to even think about things like attribution modeling.
(09:01):
It's just literally, this is what I spent on this, and this is what I got from it at the end of the day. And that creates all sorts of interesting opportunities.
There are pitfalls with it as well. It requires a lot of nuance in the way that you look at it. And to kind of zoom out even more, one of the things that I'm always talking about is holding
(09:21):
multiple paradigms in your mind at the same time and not taking an absolutist view on things, because that can lead you into trouble pretty quickly. But the main takeaway for all of this is that there is a renewed simplification in looking at what is going to contribute to revenue, and also realizing that things may be gathering
(09:41):
user behavioral attention at one stage, even if you can't specifically attribute a click or a conversion to that happening at that time. For example, in the case of an AI Overview or a zero-click search: if it's contributing to that net attention that ultimately leads to revenue down the road, that's a good thing.
(10:01):
But then the question obviously becomes, well, how do you attribute that? And the kind of emerging consensus on some of this, anyway, is: you don't worry about attribution. It just comes down to where you are spending and where you see the results coming from. Now, again, we could do a whole podcast just on that one topic, because it opens up so many cans of worms, but it's an interesting
(10:25):
way to look at it.
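The return-on-spend view described above — skip attribution modeling, just compare spend going in against revenue coming out, per channel — can be sketched roughly like this. The channel names and figures are made-up examples for illustration, not data from the episode:

```python
# Rough sketch of the "return on spend" view: no attribution modeling,
# just spend in vs. revenue out, per channel. All figures are illustrative.

def return_on_spend(spend_by_channel, revenue_by_channel):
    """Return revenue/spend per channel (None where there was no spend)."""
    return {
        channel: (revenue_by_channel.get(channel, 0.0) / spend if spend > 0 else None)
        for channel, spend in spend_by_channel.items()
    }

spend = {"organic": 20_000.0, "paid_search": 50_000.0}
revenue = {"organic": 120_000.0, "paid_search": 150_000.0}

for channel, roas in return_on_spend(spend, revenue).items():
    print(f"{channel}: {roas:.1f}x return on spend")
```

The pitfall mentioned here is visible in the sketch too: a channel that builds attention without a directly traceable conversion only shows up through the revenue totals, which is why this view needs nuance.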
I'd say there are other angles that you should also consider, but that one has been really interesting in this whole topic of conversion optimization. And then the other thing that I would say, just to answer your first question, is: be willing to look at what's really contributing to the things that you call conversions as well.
(10:46):
As an example from our own business, one of the things that we see is, we don't do a lot of CRO, conversion rate optimization, on our site. We're not trying to incrementally add 2% lift or anything like that. We're always doing small updates here and there.
(11:10):
We don't really do a lot of that. But what we do see is, when we do things in the market that attract attention, our leads go through the roof. A good example of this is when we were the first company to launch AI Mode tracking, at SMX Advanced a couple weeks ago; we got more leads in four days, probably, than we had the previous year combined, you know.
(11:32):
And so it's more about doing things that attract attention.
Speaker 2 (11:33):
That are going to have the biggest impact, I think. Yeah, like breaking through the noise and getting there: what everybody's thinking about, like, saying it out loud and addressing it. Like, that was a big update that I saw, and I was super excited for you, and it definitely broke through algorithmically, you know, and got a lot of visibility. Going back to what you were saying previously of just MMM: like,
(11:59):
you're just putting in stuff at the top, right, and then you're measuring the bottom, and there's just this mixed or, you know, weighted attribution across the board, and you're like, well, just focus on what works. That goes back to educated assumptions,
(12:19):
essentially, right? Like, of where we think we're seeing that impact.
What you're seeing on, maybe, that paid platform. And that was where I was going to ask you: how are people assessing attribution? And I went to that model a while ago as well. You can't really measure this stuff, and sometimes we're
(12:40):
building funnels for clients or nonprofits that are going from one platform to another platform to another platform, and you lose the UTM tracking code, and you just have to go: well, this is what I believe is happening. This is where I see the value. This is the data that I'm seeing from the industry. So I believe that this is the right thing we should be doing, and we're going to run the campaign and we're going to
(13:01):
measure it at the end on what's successful. But there's a lot of guessing that goes on, educated, I guess, about what that mix should look like. And it's almost like: run it to statistical significance, and take away a component or add a component at a time, based upon your mix,
(13:24):
and see how it's impacting you.
But then, okay, for a lot of businesses right now, we've just hit the summer, so there's seasonality. It's very hard to consider all the different variables and measure it. How would you? If you went a little bit deeper, and I know we could continue this discussion, I'd love to peel back one more layer of how
(13:46):
people should be thinking about that or approaching it. The framework I tend to use is the 7-11-4 framework, right? Seven hours of content consumption, seeing their brand or logo 11 times, on four different channels. And that framework is what we use for a lot of our clients to
(14:07):
say, hey, we need to get you at least to here on a roadmap to start achieving greater visibility for your brand. And we want to be watching the bottom line, and we want to be watching impressions. But we've got to look at it across all platforms now, because it's almost like all the website traffic and the assessment that
(14:30):
was done... so really the website's more transactional, and then everything else is off-page.
Speaker 1 (14:37):
Well, and I think one of the things that's really tricky about organic in particular, that does not necessarily apply to spend on other channels, is the fact that, especially when we're talking about SEO, which is a very loaded word, you basically have two different financial models in the same word.
(14:59):
You have CapEx and OpEx in that same word. So all the stuff that you're doing on the product side, work on site redesign, architecture, technical SEO, most of those things are typically going to fall into capital expenditures. But then you have marketing, which is typically OpEx when you're looking at other channels. That's all the
(15:22):
stuff that is only like half of what we do in SEO, you know: content, optimization, all of that. So it's like, how do you properly even attribute your financial model if you don't have that conversation internally, where you're looking at those with the correct lens? So I think that's a really important one.
Speaker 2 (15:37):
That gets, you know, glossed over quite a bit too. Well, I've been trying to convince clients to move, like, a branded spend over to CapEx, you know, because if you turn off some of these platforms, you're resetting the AI, as far as, like, the heat-seeking missile to find the right person. So you've just got to
(15:59):
build it into your cost structure, is really kind of how I'm educating them to do that. Let's transition into DemandSphere. I think a lot of people are stuck on the traditional tools, and I think that there's a lot to be desired from the traditional tools. I feel like you were one of the ones that have educated me on this:
(16:23):
You know, well, there's really an issue now when you go to Google with how much real estate, like, from a pixel standpoint, you have on the page, because that first position's now two-thirds of the way down. And so I'm curious, like, what have you seen with the application of your tools, of what people are not thinking
(16:48):
about? Like, where are the biggest gaps? When they start leveraging your tools, what are they finding? How should people be thinking about stuff? Because, you know, there are a lot of really great keywords that people miss because they're so narrowly focused on whatever the topic is. And if you focus on that too much and over-optimize
(17:13):
in that area, you'll get an algorithmic penalty, and then, for the word you want to rank for, you can't rank for it. So you've got to kind of build out that information architecture to cluster it a little bit better. Even though, you know, a lot of clients, maybe on the small to mid-sized business side, they're like, I've got to rank for this word, and it's like, well,
(17:34):
we need to do all these other things to lay a foundation. That's maybe, like, the top of the pyramid, but we've got to build that foundation. And so I'm just curious, you know, what are the big things that you're seeing a lot of SEOs miss? So there are maybe some actionable insights for people that are listening.
Speaker 1 (17:51):
Yeah, one analogy that I use a lot: I'm sure you've probably talked to people who have a PhD in some field, biochemistry or whatever, and then they start riffing about something that's completely unrelated to their field, and they talk with this level of confidence; they sound like they know what they're talking about. But if you actually know what they're trying to talk about, you're like, you actually don't know anything. You may have a PhD in biochemistry, but you know
(18:13):
nothing about aerospace engineering. And so, as an example, I've looked at sites that way a lot as well, where you've built up a level of topical authority around a certain topic. And one of the big changes, and it's not really just in the last few years, it's been going on for a long time: when we're talking about Google,
(18:34):
they're looking at, what is your set of topics that you're good at talking about? And there are different ways of measuring this, even on a mathematical level, which we can talk about. But they're looking at that semantic topical authority, and if you are creating content that is way outside of that traditional
(18:55):
set of things that you talk about, you just need to recognize that.
It's going to take time to build up that semantic authority. You're going to have to work on things like linking to it. You're going to have to maybe update old content, right? There are ways to merge it in and make it something that you are good at talking about, but it's not something that's going to happen overnight. So, you know, if you're a site that's talking about A and you
(19:16):
suddenly want to start, quote-unquote, ranking for B, it's going to take a lot more than just writing a few articles about that. And so I think that's a really important thing to understand about the way sites are evaluated.

Speaker 2 (19:27):
Yeah, so let me add on to that.
So, you know, if you're looking at, like, a three-dimensional model of where topics are related to each other on this graph, you're saying, well, if you're talking about this over here and you're an expert over here, the distance from this topic to that topic, you're
(19:48):
measuring that mathematically and saying, hey, you might want to prune this content over here because it's not really in your core area of focus. And, you know, Google hasn't officially brought back authorship, but it kind of has. It's, you know, who's saying what, and what are they talking about? The question I have for you on that is, and I tell clients to
(20:08):
say, you can't be an expert in everything, right? And based on, you know, if we use the metric of domain authority or domain rating, we need to pick one or two, or whatever, right? Is there a correlation or a quick-reference guide that you would use to say, well, I am an expert in this, I am an expert in that?
(20:29):
Well, do you need to produce X amount of content clustered on this area, over what amount of time, before you hit that threshold? Or if your site only has X number of backlinks, or whatever rating system you want to use, whether it be Moz, Ahrefs, SEMrush, any of these generated ranking systems that are not really what
(20:52):
Google's using. But is there any kind of rule of thumb that you can use to say, we only want to focus on one topical authority, now we're trying to spread ourselves too thin? Because that's what I do see clients wanting to do a lot: they don't have enough budget to really focus on one area, and they try to spread it out and get a little bit of
(21:13):
improvement everywhere. But if you're not in the top three positions in Google, historically you weren't getting any traffic. And now, you know, if you're not in the knowledge graph, et cetera, in AI Overviews and LLMs, you're not getting any
(21:35):
traffic at all. And you're not even getting an option, because it's given you six links, potentially, and you might not be included, and it's very customized. So how should people trim that focus, or think about that rule of thumb, to just measure it before even getting started on the complexities of cleaning up their content? But probably the
(21:59):
easiest, most accessible way for people to do this now, with a hands-on tool, would be Screaming Frog.
Speaker 1 (22:05):
You probably saw their latest update, where they released... So, a lot of what we're going to be talking about, probably most of the people in your audience will be familiar with: the concept of embeddings and that sort of thing. But if not, just basically: with embeddings, I always describe it as, you convert text into numbers so you can do math on them. And the type of math you would do would be related to
(22:25):
measuring the distance between topics and terms and so forth. And so one of the updates that Screaming Frog released recently was the ability to visualize a lot of these topical clusters and understand if you do have something that is way out of range. So there are different types of scoring that you can do. Euclidean distance is one that gets used quite a bit, but
(22:45):
the easy way to describe it is basically, it's like: how far away is something you're talking about on your site from all these other things that you talk about on your site? And that's some of the things it can help you do. I think you're going to see a lot more of these types of tools come out. It's not our main area of focus within DemandSphere; we use embeddings quite a bit, but we use them for other things.
(23:09):
But this is, I think, one of the easiest. It's not an expensive tool. Anyone can very easily start playing with it to understand some of the things that we're talking about.
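A toy version of the embedding-distance idea described here — text becomes vectors, and Euclidean distance flags content that sits far from the rest of the site — might look like this. The vectors are hand-made stand-ins for illustration, not output from a real embedding model or from Screaming Frog:

```python
import math

# Toy illustration of "convert text into numbers so you can do math on them".
# The vectors below are hand-made stand-ins for real embedding-model output.

def euclidean(a, b):
    """Euclidean distance between two equal-length vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

pages = {
    "/trail-running-shoes": [0.9, 0.1, 0.0],
    "/marathon-training":   [0.8, 0.2, 0.1],
    "/crypto-tax-tips":     [0.0, 0.1, 0.95],  # off-topic for this site
}

# Compare each page against the site's average ("centroid") embedding:
# the page farthest from the centroid is the "way out of range" candidate.
centroid = [sum(dim) / len(pages) for dim in zip(*pages.values())]
for url, vec in sorted(pages.items(), key=lambda p: euclidean(p[1], centroid)):
    print(f"{url}: {euclidean(vec, centroid):.2f}")
```

The off-topic page ends up with the largest distance from the centroid, which is the mathematical version of "prune this content, it's not in your core area of focus."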
Speaker 2 (23:15):
Yeah, I think what I'm hearing you say is, it's not necessarily that. But I am seeing, like, if you consider them ranking factors, or if "ranking factors" is even the term anymore: how much traffic's coming to their site?
(23:36):
How many backlinks are coming to that page, maybe? Or, since we're talking about a specific page: traffic to the page, links to the page. But the biggest call-out I see is, and we saw this maybe in the last 18 months, give or take, maybe a little bit longer: Google just unindexed stuff. They just unindexed, like,
(24:00):
thin content, low content. And, you know, really, I feel like that weighs the site down. The more pages that you have that are thin content or are not really driving the link equity through the site, it kind of becomes friction, versus if you were aerodynamic, you know?
(24:21):
So the more clustered you get your site, the faster it'll just scream to the top of the search engines. And we're still kind of in that pruning concept. Is there anything else you could add, color in that area?
Speaker 1 (24:35):
Yeah. So we really started seeing this probably about four or five years ago in particular, where, you know, you look at the data, and this happened with Core Web Vitals, but also just with their aggressiveness on how often they're indexing your site and so forth. And basically the conclusion was that Google
(24:59):
really does not want to be indexing and crawling nearly as much as they were before.
Speaker 2 (25:03):
Yes, yes, that's what I'm referencing.
Speaker 1 (25:05):
Yeah, and so what you said was exactly our biggest piece of advice to most of our clients, and in many cases still would be. Because, you know, most of our clients are going to be larger sites that have hundreds of thousands or millions of URLs, and we'll look at it and we're like, you know, you're just wasting resources,
(25:26):
and Google's looking at you like, oh, I don't even want to touch that, you know. So reducing the URL count to the things that are going to be actually adding value to your potential customers is a thing, and it's a very big shift. Because before that, it was all about programmatic SEO. It's a bad name because of this, but I think that there's been a shift in that too.
(25:47):
But programmatic SEO in this context was basically like, we're going to predict potential traffic from every potential search term variation, understand what the search volume is and everything, and then basically generate or create content that would match that specific presentation. And that was kind of going against the older model of Google, before they got into all the semantic stuff,
(26:08):
where it's just, kind of, a traditional NLP evaluation of your content. That's long gone. That stuff doesn't... you know, even if it works, don't count on it. It's a totally different world now. So I think that shift has really brought about where we are today, where you definitely want to be focusing on only the things that are going to be relevant, and organizing your content in such a way that you don't have to go too
(26:30):
deep into your site to really explore the topic.
Speaker 2 (26:33):
So, for some of these larger sites, what's your, like, rule of thumb for crawl budget?
Speaker 1 (26:39):
Well, I wish I was lucky enough to have that conversation. With some of these sites, we can't even have that conversation yet. It basically comes down to, first of all: is
(26:59):
your sitemap even being completely crawled, or anywhere near being completely indexed? If it's not, start with that; we're not even at rule of thumb at that point yet. I see so many instances, for example, where you have sitemaps of, you know, maybe 10 million URLs. But if you look at the actual number of URLs that are being
(27:21):
indexed or crawled or getting some kind of visit, you know, it could be literally in the tens to a hundred. And that's not on the extreme side; we've seen hundreds of millions. And it's just like, you know, we're not talking about crawl budget. We're talking about, like, major surgery at this point.
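The situation described here — a 10-million-URL sitemap with only tens or hundreds of URLs actually indexed — is easy to put a number on before any crawl-budget talk. A minimal sketch; the counts are hypothetical, and in practice "indexed" would come from something like Search Console reporting:

```python
# Sketch: index coverage of a sitemap as a first sanity check.
# Counts are hypothetical; a real check would pull them from Search Console.

def index_coverage(submitted, indexed):
    """Fraction of submitted sitemap URLs reported as indexed."""
    return indexed / submitted if submitted else 0.0

submitted = 10_000_000  # URLs in the sitemap
indexed = 100           # URLs actually indexed

coverage = index_coverage(submitted, indexed)
print(f"coverage: {coverage:.6%}")
if coverage < 0.5:
    print("Most of the sitemap is being ignored -> structural problem, not crawl budget.")
```

If that ratio is vanishingly small, the conversation is about pruning and architecture ("major surgery"), not about tuning crawl budget.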
Speaker 2 (27:39):
Well, I've had a lot of those conversations when clients want to rebuild a site that's in .NET or something crazy, and I'm like, we don't need all this. This is just a waste. We need to revamp the whole site architecture. Okay, let's switch to LLMs. You've got so much data.
Speaker 1 (27:59):
Yep.
Speaker 2 (28:00):
Give, like, some reference points of the things you're seeing of how LLMs are behaving, because they are kind of lazy, is what I've seen. They're just pulling from the top cluster data set. And I've got to be careful, like... LLMs, once you become sentient, please don't come after me.
(28:22):
Yeah, don't kill me. You know, you're very smart, way smarter than any human, so I'll just put that out.
But, like, what are you seeing of how they're behaving? Because I've seen some interesting statistics. Like, you know, only recent content, like in the last, you know, eight to 10, sorry, 10 to
know eight to 10, sorry, 10 to12 months.
(28:42):
Like, the content needs to be updated. I talked previously about the correlation to Google rankings. What are you seeing in their behavior? Because it's becoming so personalized, it's difficult. Which, I know... I want to talk about your solution to track, or get some
(29:03):
kind of programmatic view of, like, where you should be showing up or where you should be ranking, because that's where I see a lot of it going. It's, like, theoretical, of where you should be at, or, you know, how much traffic you should be getting. And even these theoretical models on some of the major platforms... the major platforms are outdated, so a lot of this data,
(29:25):
I can't even... I'm like, what is this telling me, exactly? Because it's not helpful anymore. Let's go into that shift of LLMs and how they behave, and things to look for, and then what your tool does.
Speaker 1 (29:42):
Yeah, so just for some quick groundwork. Probably your audience is familiar with this, but I'll just mention it, just so we're...

Speaker 2 (29:49):
No, no, definitely do it.
Speaker 1 (29:51):
Yeah, okay. So, the LLM itself: you have what is typically called the foundational model, which is the model that gets trained. You know, OpenAI has their models, Google has their Gemini models, and so forth. And so these models are getting trained, and they're expensive to train for various reasons, and basically they're almost out of date by the minute they get
(30:12):
released, because of how much training data goes into them. So this would be the reason that, before they started doing live retrieval from the search engines and so forth, which I'll talk about next, you would get things like: oh, my training data only goes until October 2023, so I don't know anything about what you're talking about. And that's because that foundational model had a training date, and so any facts after that typically would not
(30:36):
have made it back into that.
That's starting to change now, too, with different types of things. But one of the easiest ways for these companies to solve that problem was called live retrieval, or grounding. And it's part of this overall idea known as retrieval-augmented generation, which is basically the idea that the model itself is out of date for new things, but it does have the
(31:00):
ability to curate and filter through information faster than most people can, and so, for any given topic, all you have to do is give it the ability to also search the web and access different tools, the same way that you would as a human, and it's going to curate things faster for you.
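The retrieval-augmented generation loop described here — stale model, fresh documents fetched at question time and handed over as context — has a simple shape. In this sketch, `search_web` and `call_model` are made-up stubs standing in for a real search API and a real LLM call; only the structure is the point:

```python
# Minimal shape of retrieval-augmented generation (RAG), as described:
# the model is out of date, so fetch fresh documents and ground the answer
# in them. Both helpers are illustrative stubs, not real APIs.

def search_web(query):
    """Stub for a live search call (e.g. a Bing-style API)."""
    return [f"[fresh document {i} about {query!r}]" for i in range(3)]

def call_model(prompt):
    """Stub for a foundational-model call."""
    return f"Answer grounded in {prompt.count('[fresh document')} retrieved documents."

def answer_with_rag(question):
    docs = search_web(question)                # 1. live retrieval
    context = "\n".join(docs)                  # 2. build grounding context
    prompt = f"Context:\n{context}\n\nQuestion: {question}"
    return call_model(prompt)                  # 3. grounded generation

print(answer_with_rag("best trail running shoes"))
```

Swapping the stubbed search for a real index is exactly the Bing API arrangement described next.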
The early example of thisbetween OpenAI and Microsoft,
(31:22):
for example, was OpenAI gotaccess to the Bing API to do
live searches on Bing.
And so now, when you you knowsearch for something running
shoes is one of the you knowlike what's the best set of
running shoes if I'm going to bea trail running guy?
You know like, ok, so it'sgoing to search for that and
it's going to be pulling inthose results from search some
(31:43):
search index.
Interestingly, and I'll talk about this more later if you want to go into it, it is kind of a rabbit hole as well, but Bing is actually not the main index being used anymore, even though it is still being used.

Speaker 2 (31:55):
Sometime we can talk about crawl. Are you talking about Common Crawl?

Speaker 1 (31:58):
Common Crawl is in there as well. They're building their own index, but actually Google is being used, yeah. Then people realized we're there.
Speaker 2 (32:06):
I have a graph showing a lot of that's being pulled from there, yeah. I would tell you, when that integration happened, we had to really take a look at some of the legacy clients we had, and a lot of them didn't even have Bing Webmaster Tools set up.
(32:29):
So that was a big opportunity to unlock some stuff. I feel like what I'm starting to do (and I don't know what you're seeing; you're interacting with a number of these speakers, you're on the speaking circuit a lot) is tune custom GPTs for my team to use
(32:49):
on top of the foundational model, to correct for or accommodate our workflow of how things are being done. And I've found that, plus the RAG, plus carrying logic across different searches and then organizing it into projects
(33:10):
, you've got some real leverage now. Totally.
And I'm sure you're probably doing this on the backend, because I've seen what some other platforms are doing, but I'm starting to use it a lot for data analysis and pulling some of those things
(33:33):
together, and I'm looking at agentic workflows to do this for us and give us the output, because there's a lot of manual grabbing and putting data in. But I'm finding some really interesting outputs.
You know, y'all built this into some SaaS tools.
(33:55):
Tell me, let's go into the next phase of what to look for with the LLMs.
Speaker 1 (34:04):
So here's what I am always very clear about, especially when we're talking about search. Up until even right now, people will typically talk about traditional search and then AI search. Traditional search is basically, they say, anything on Google, and AI search is the LLM stuff. What we always say is: it's all AI search. There's no such thing as traditional.
(34:25):
When you're talking about traditional search, I think everyone's still kind of having nostalgia for the 10 blue links, which have been gone forever.
Speaker 2 (34:32):
The Google AI mode. I have played with it just a little bit, but I'm wondering about when they make this switch to the new Google AI interface, or whatever AI mode becomes.

Speaker 1 (34:47):
Yeah, I mean, basically Google's saying old search is dead. It really is, and that's coming. It's here in the US; it just hasn't been fully turned over. They're doubling down on YouTube, that's what I see. They're totally doubling down on it. I mean, YouTube is incredible, yeah, for sure.
And then obviously there's Reddit and everything else being pulled in there too, so it's all AI search now.
(35:10):
So really it's not a question anymore of traditional search versus AI search. It's more a matter of: what is the interface between you and AI search engines? One interface is what we call Google, which is increasingly not the traditional SERP, although the traditional SERPs are still getting a ton of traffic.
(35:31):
Now we're going to see a merging, to your point, between these two things. AI mode and the SERPs are going to meld, because there's a lot of ecosystem stuff still happening in the SERPs that Google has billions of dollars invested in. Think about Merchant Center, Flights, job ads, hotels. There's a lot there that they're not just going to walk away from. But the question is going to be: what's that going to
(35:51):
look like?
Speaker 2 (35:52):
What, in the context of AI/LLM-type interfaces being the primary interface there? Well, the personalization. Based on someone's past history and the things they've said to the LLM, it's going to customize an answer to them. So the only way I can see around this is: you need to know
(36:13):
exactly who your customer is, and you need to be speaking to them on all levels, and then you can't worry about everything else. You're not going to get everything else. You just want to own, as deep as you can, that topical authority, that silo or that vertical, and
(36:33):
not try to claim everything.
Speaker 1 (36:35):
Exactly, and we've done some of this in our work. The way our company is set up (I know I haven't talked about the company too much): we have our platform, DemandSphere, the company, and then we have DemandMetrics, which is our core analytics engine, and then a couple of different pipelines. One pushes everything into BigQuery-powered data warehouses, and then we have another pipeline called ProseVector,
(36:58):
which is our own internal RAG pipeline, and we do a lot of different things with that. From a service perspective, we have a SaaS offering, but we also provide managed services to help with implementation, and we also do solutions consulting. That allows us to see a lot of interesting things that we would not see if we were just a tool vendor, and it helps us differentiate
(37:19):
quite a bit with our larger customers as well. One thing that I think is untapped, though we're starting to hear a lot more about it and we've gotten pulled into some of these engagements as well, is what you're talking about: understanding your ICP, and building that profile around your ICPs. Your SEO team may
(37:41):
not be (I mean, they should be, but they may not be) the best group of people to articulate what your ICP is, what your set of personas is. You have an entirely different set of teams whose whole job that is. Hopefully your product team is all over it, but a lot of times they're not. So what are the sources of your ICP data, and how are you going to integrate those back into your search campaigns, your
(38:03):
search operations? This would be anything from your CRM, your Zendesk, your reviews. There are tons and tons of data that every company has about these things that need to be getting integrated, from a pipeline perspective, into your entire product and search strategy, and I think that's going to be a bigger and bigger deal.
(38:24):
And you're going to get a lot more leverage as a search person talking to these teams when you're talking about these ICP-based personas. From a monitoring perspective, as a data guy, it's very interesting, because you basically have two different ways of looking at it. Our main business, from a data acquisition perspective, is what
(38:45):
we just call naked monitoring of the index and the responses as they are now. We also have the capability to do more, and surprisingly we haven't had as much demand for it yet, but I think that's going to change. We do some memory-based monitoring, where you're
(39:06):
basically incorporating everything that you know about your ICP into the monitoring scenarios, and building out system prompts and everything else that can help guide what those results are going to look like. It's never going to be perfect, it's never going to be one-to-one, but you can get some really interesting insights out of that too.
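A minimal sketch of what persona-conditioned, "memory-based" monitoring might look like: assemble known ICP attributes into a system prompt before each monitored query. The field names and wording here are hypothetical illustrations, not DemandSphere's actual implementation:

```python
# Sketch: turn ICP/persona data into a system prompt for monitoring runs.
# Field names and prompt wording are hypothetical.

def build_persona_prompt(icp: dict) -> str:
    # Fold known persona attributes into instructions that bias the model
    # the way a real user with this profile and history would.
    lines = [
        "You are answering for a user with the following profile:",
        f"- Role: {icp.get('role', 'unknown')}",
        f"- Industry: {icp.get('industry', 'unknown')}",
        f"- Pain points: {', '.join(icp.get('pain_points', []))}",
        "Answer as you would for this specific user.",
    ]
    return "\n".join(lines)

icp = {
    "role": "VP of Marketing",
    "industry": "B2B SaaS",
    "pain_points": ["attribution", "AI search visibility"],
}
print(build_persona_prompt(icp))
```

In a real pipeline this string would be set as the system prompt for each monitored query, so the responses approximate what a user matching that ICP would see, rather than a "naked" default response.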
Speaker 2 (39:20):
So I would love for you to go a little bit deeper. Just anonymize any of the clients, but what are some of the interesting insights, good, bad, whatever, as you've helped do some of these implementations and worked closely with these enterprise clients? What are you seeing as you open up the hood, good, bad, ugly?

Speaker 1 (39:41):
You know, good, bad, ugly. On the good side:
What I am seeing, and I guess this is more of a general trend: the search teams are undergoing a resurgence in importance. They're getting questions from their board about
(40:02):
search strategy, around stuff that just never happened before, and everyone's getting pushed into getting really good at understanding AI. What that's turning into is a lot of new budget getting unlocked. Where things get tricky and interesting for them very quickly is: now I'm being told I have to have an AI strategy, and
(40:23):
where are we on AI? Well, how do you even know what to look for? What is the basis of your prompt research and your prompt strategy? Again, this goes back to really understanding your users, but also understanding the data sets that are available, and that's one of the things we've really doubled down on. We've been very successful on the data side, helping
(40:45):
companies very quickly get a feel for the research side of things: the prompts and the different types of things that are going to be querying these models. And we're seeing that companies are not nearly as visible as they might have assumed they were. That's good in the sense that you're surfacing that insight,
(41:06):
bad in the sense that with a lot of companies, and I'm talking global Fortune 50 companies, when you start interrogating the reasons why they're not as visible, it goes back to what we were talking about at the beginning: "oh yeah, we're just blocking all the bots." That's it. And these are not publishers; these are transactional businesses. Google already has your content; you're not blocking Googlebot.
I published a flowchart on this about a week ago that was kind of interesting. Basically it came down to: do you think that you need to be blocking these bots, yes or no? No? Okay, good, we're on the same page. You do think so? Okay, are you blocking Googlebot? No? Okay, good.
(41:47):
If you are blocking the others, do you realize that ChatGPT and everybody else are already indexing Google anyway? They're getting your content one way or another; it's just a question of whether or not it shows up in their interface, because you're blocking them. So the cat's out of the bag. It's tough for certain industries, like publishers, but it is what it is. And so I see some weird stuff
(42:09):
like that too.
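The flowchart logic Ray describes can be spot-checked mechanically against your own robots.txt. Here's a small sketch using only Python's standard library; the user-agent tokens below are the commonly published ones, but verify them against each vendor's current documentation before relying on this:

```python
# Sketch: check which crawlers a robots.txt actually blocks.
# User-agent tokens are the commonly published ones; verify with each vendor.
from urllib.robotparser import RobotFileParser

# Example robots.txt: blocks GPTBot, explicitly allows Googlebot everything.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: Googlebot
Disallow:
"""

AI_BOTS = ["GPTBot", "Googlebot", "PerplexityBot"]

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

for bot in AI_BOTS:
    # can_fetch() applies the most specific matching User-agent group;
    # agents with no matching group (and no `*` default) are allowed.
    allowed = rp.can_fetch(bot, "https://example.com/article")
    print(f"{bot}: {'allowed' if allowed else 'blocked'}")
```

Running this against the sample file shows GPTBot blocked while Googlebot is allowed, which is exactly the inconsistent state the flowchart calls out: the content still reaches the LLMs through Google's index, but your pages may not be cited in their interfaces.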
Speaker 2 (42:10):
So, I mean, with whatever factors you want to cite, geopolitical or anything like that, there are a lot more cyber attacks, and so people are just trying to clean up the traffic coming to their site, because they don't know what it is. And a lot of these LLMs are on rotating IP addresses, right? So you can't just whitelist them on your server. So I don't
(42:34):
know, I mean, what's the answer? I feel like you want to make it as easy as possible, using as little crawl budget as possible, to give them the best roadmap to leverage your data, and not have them jump through hoops to figure it out. Like I said, laziness means they'll take the easiest, quickest
(42:56):
path, totally.
Speaker 1 (42:57):
Yeah.
Speaker 2 (42:58):
So, yeah, send me that article. I'd love to take a look at that. What else? Right, we're getting close to time here. One of the things that LLMs love to know is how you're different from your competitors.
(43:19):
Right, that's a big transactional term. I would love to have you position DemandSphere in the space. That's why I included you in the Search Engine Journal article, because I feel like people are still using the old tools. So I would love to hear it from you: how do
(43:42):
you feel that you're different from what's currently on the market?
Speaker 1 (43:45):
Sure, yeah, I appreciate the question. It depends on which set of axes you're using to define the categories. Things are moving so fast now that two years ago I would have given you a different answer, or a different set of categories, than I would right now, or I would maybe caveat things a little more than I do now, just because things are moving
(44:06):
so fast. But basically, we've been in business for 15 years, and we came out of the search world. Fifteen years ago it was all about rank tracking and content analysis and everything, so we've done all that for 15 years and still do a great job at it. That being said, it's not about rank tracking anymore. When you're comparing the solutions in that space
(44:26):
, we don't even call it that anymore. It's SERP analytics, and SERP analytics versus rank tracking are two very different things. Rank tracking is basically just looking at your positioning within the organic rankings, and maybe at features in positional terms, if you have a tool that does that.
(44:47):
More broadly, on the visibility side, it goes into all sorts of additional factors, some of the stuff I was talking about before. Billions of dollars are spent and consumed every year on shopping results and shopping ads and hotel ads and bookings, all sorts of things happening within SERP features, and there's an entire set of share-of-voice modeling
(45:09):
that happens across all of these different things that are impacting user behavior. So when people get bent out of shape about zero-click searches, they're not really factoring all this into the equation. One of the things that we do from a data perspective is capture the shape of the SERP visually and show what is impacting user behavior.
(45:31):
Again, there is no traditional search anymore; it's all AI search. So if you take SERP interfaces as the largest attention-driving and traffic-driving source on the internet, we're in there all day long looking at that data, measuring it, and helping companies understand how it's relevant to their business. That's a big part of what we do. From a differentiation
(45:54):
perspective, there are really only a few companies out there that go to the level that we do, and within that space there's going to be all different types of differentiation. Our thing is all about building a unified view across disparate data sources. We integrate GA4, we integrate Search Console, because that unlocks a lot of opportunities for understanding
(46:15):
your market much better than you would have otherwise. And then our solutions provision on top of that; the fact that we can build new solutions on top of all this data very quickly, with a whole team of data engineers who are very good at this, is another major differentiator. On what people would typically consider the pure AI side of things (ChatGPT,
(46:36):
Perplexity, and now obviously AI Mode and Gemini and so forth), we have a whole monitoring solution around that too, and we're doing very rich monitoring there. Again, for us the differentiation is that not only are we doing that, we're also tying it all the way back to what's happening within these larger indexes, Google being the largest, but we're also looking at Bing
(46:58):
and other indexes too. There is a lot more interplay happening between those two worlds than is commonly understood, and we spend a lot of our time educating our market and our users about how to really take advantage of that interchange.
Speaker 2 (47:14):
I wanted to drill down on one area. With SERP analytics, would you say it's more important to look at how you're doing, like your share of voice, or do you compare? Because I'm seeing this from a lot of CMOs: they want to know, scorecard-wise, how they're doing versus their top competitors, and that was a big
(47:37):
component of rank tracking, right? Like, we're doing better than them. And now it's a lot harder to see. So how would you frame that up to an executive team, how we're doing versus the market? Because the market's constantly changing, so the only point of reference is how we're doing.
(47:59):
Like, if the waves are going like this, where are the boats at, right?
Speaker 1 (48:02):
Yeah, exactly, that's a great question, and I think you have to do both, which we do. Competitive share of voice is a huge part of our platform, and I think we have probably the best and most granular version of it out there. It's very customizable and adapts to your data, so it's really good for the things you know you're
(48:25):
tracking. But what it also does, that a lot of others don't, is take a market-level view. Even if you're not tracking something, it's still going to show you who is affecting your share of voice and who is affecting what your users are seeing, and you may not even have them on your radar. That can get very granular and very detailed, but
(48:50):
a lot of times there are things that are going to limit your visibility no matter how good you are from a competitive standpoint. You could be beating every single one of your competitors and still only get to a certain level of visibility, because there are other sites out there, like Wikipedia and Amazon, that you're never going to beat, and you have to understand what that looks like as well. We have a ton of data on that, and understanding how to use those
(49:10):
sources to your advantage is really important too. So you have to look at both.
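The competitive share-of-voice idea can be illustrated with a toy position-weighted model. This is a sketch under deliberately simplified assumptions (weight = 1/rank over tracked results); real SERP analytics platforms weight by feature type, pixel depth, and click-through curves, and the market-level view Ray describes also surfaces domains you never chose to track:

```python
# Toy share-of-voice model: weight each SERP appearance by 1/rank,
# then normalize to percentages. Sample data is hypothetical.
from collections import defaultdict

# (query, rank, domain) observations from tracked results.
results = [
    ("best crm", 1, "competitor-a.com"),
    ("best crm", 2, "yourbrand.com"),
    ("best crm", 3, "wikipedia.org"),
    ("crm pricing", 1, "yourbrand.com"),
    ("crm pricing", 2, "competitor-a.com"),
]

def share_of_voice(observations):
    # Accumulate a position-weighted score per domain...
    scores = defaultdict(float)
    for _query, rank, domain in observations:
        scores[domain] += 1.0 / rank
    # ...then express each domain's score as a share of the total.
    total = sum(scores.values())
    return {d: round(100 * s / total, 1) for d, s in scores.items()}

print(share_of_voice(results))
```

Note how Wikipedia shows up in the output even if it was never a "competitor" you chose to track: that's the ceiling effect Ray mentions, where authority sites cap the visibility available to everyone else.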
Speaker 2 (49:16):
Awesome. As we wrap up here, what is one unknown secret of internet marketing? Maybe you can repackage something we talked about as a huge takeaway for people who are trying to understand and orient themselves to all the changes happening today.
Speaker 1 (49:39):
I get this every once in a while. The best answer I can give, even though I'm a data guy, is very non-technical, and it's basically: be as remarkable as you can be. It's kind of like when I moved to Silicon Valley in 2010. Networking was not something that came naturally, and the best advice I got was that the best way to
(50:02):
have fun at all these different Silicon Valley parties is to be working on something that's solving cool problems people want to talk about, and people will just come up and find you. It's the same way on the internet. People are very simple creatures. At the end of the day, we're drawn to the light, so do things that generate that light of remarkability, or whatever you want to call it, and you'll get
(50:23):
the attention that you need. It's just a matter of organizing that at a company-wide, campaign level, and doing it all the time. You can't ever just launch something and expect it to pay dividends for months and months. You have to be doing it over and over again.
Speaker 2 (50:38):
Awesome, I like that. So, Ray, what is the best way for people to get in touch with you, follow your work, and check it out? DemandSphere.com is a great place to start; we'll put that link in the show notes. I know you're pretty active on LinkedIn. Is there anything else people should be looking out for on stuff you're working on?

Speaker 1 (50:57):
Yeah, for sure. DemandSphere.com and LinkedIn are the two places where we're most active.
We just launched our events page on DemandSphere.com as well; it's right at the top of the menu. I love going to events because I like talking to people in person whenever I can, and it's one of the main reasons I do it. I'm all over the world most of the time, so
(51:17):
there's a decent chance I'll be near you if you ever wanted to connect at one of those events. And we're doing a lot more webinars and stuff too, so that's usually a good way to hear what we're talking about these days.
Speaker 2 (51:30):
Yeah. And as an attendee of one of the DemandSphere breakout events from SEO Week: he put together a great event. I really enjoyed it, met a lot of great people, and learned a lot of great stuff. So, everyone, it sounds like ChatGPT is becoming Google faster than Google is becoming ChatGPT, because Google is
(51:51):
doubling down on YouTube, and, well, they're trying to become Amazon. So the market's changing really, really rapidly. Continue to stay tuned as we help guide you through what to do in this crazy time. Until next time, my name is Matt Bertram. Bye-bye for now.