Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Joe Panepinto (00:00):
When it comes to
data, my personal perspective is
(00:02):
more isn't always better. You do have these people, and I'll go
back to, you mentioned that I teach, what I go back to is: I
can't grade the paper that you never wrote, right? You
know, always waiting for the exact right data to tell us
exactly what to do gets in the way of making the decisions we
need to make. Prose and Comms.
Laura Smith (00:34):
Welcome to Prose + Comms. I'm Laura Smith.
Brian Rowley (00:39):
And I'm Brian
Rowley.
Laura Smith (00:40):
Today's topic is one that I think is near and dear to all of our hearts: data. As marketers, data can guide us, but it doesn't always tell the full story. Sometimes we need to know when to trust the numbers and when to follow our gut. And as we know with AI, we're only getting more and more data every day. So understanding how to interpret it and act on it is becoming just
(01:03):
as important as understanding the insights themselves.
And today, we're gonna get right into this conversation.
Brian Rowley (01:09):
Yeah. We've got a
really interesting guest today.
So he's a seasoned strategy leader with some really deep expertise in experiential marketing, helping brands craft experiences that resonate and drive impact. In addition to that, he's also an agency veteran known for blending creative strategy, AI insights, and hands-on learning to solve complex marketing challenges. And on top of all
(01:32):
that, in his free time, he's a long-time professor bringing a thoughtful analytical perspective to the intersection of data, instinct, and strategy.
So please welcome Joe Panepinto. Joe, welcome to the show.
Laura Smith (01:42):
Welcome, Joe.
Joe Panepinto (01:43):
Thanks, Brian.
Hey, Laura. How are you?
Laura Smith (01:45):
Great. We're
excited for today's discussion.
Are you?
Joe Panepinto (01:47):
Oh, I can't wait.
I can't wait.
Brian Rowley (01:49):
This is going to
be a good one. We're going to jump right into it, because I think one of the things we should start with is, when it comes to shaping your overall marketing strategy, Joe, where do you find data most useful? And I guess the other side of that is, where do we see the shortfalls?
Joe Panepinto (02:08):
Sure. I mean, it's a great question. And as you mentioned, Laura, at the top, there's more and more data. I mean, there's no shortage of data. You have data on everything, and you can track at a much more granular level than you used to.
I mean, it's very useful throughout. And I think very
(02:28):
often people will not use data analysis throughout a campaign or an experience. They'll do it at the beginning, or they might do it at the end. But data is really helpful throughout. So it's helpful to mine insights.
So what do we know about a particular audience? Is there anything that can tell us about what some of their preferences
(02:49):
might be, even some of their attitudinal stuff, information about their psychographics and their demographics, just so that we can begin to formulate, whether it's an experience or a campaign, something that's going to resonate with them and engage them. Then while we're activating, whether it's during an event or during a campaign, checking in on the
(03:12):
data in an appropriate way that gives you the opportunity to shift strategy, shift tactics, is important. And then at the end, trying to understand what we were trying to achieve from a business perspective: is there any data that's going to give us an indication that we've achieved the things we set
(03:32):
out to? So I think using data throughout the process is really important.
I think where it falls short is, as we were talking about the other day, data doesn't speak for itself. So an overreliance on data, and assuming that data tells a story, I think is a
(03:53):
mistake I see with young strategists all the time. There are indicators, and the data will give us some indication of what's going on. But the interpretation of what's actually going on, especially with experiences, really needs to be filtered through your experience and your understanding.
Laura Smith (04:14):
Well, that's a good segue, Joe, because you obviously focus on events and experiential marketing. So talk to us a bit about how you measure success, from action to outcome, when there isn't that closed loop, right? Because there's potentially a physical experience without a digital aspect to it. So how do you look at those experiences and justify and/or interpret whether they're
(04:38):
successful or not?
Joe Panepinto (04:38):
Sure. I mean, there's a bunch of ways to address that question. Let's talk about the ways that we collect data at an experience, the potential ways we could collect data, because that's actually a fairly simple way to start out. And like any good strategist, I have three S's. Things have to start with the same letter.
Laura Smith (04:59):
Always in threes.
Joe Panepinto (05:00):
Of course. And so you can use sensors, right? So sensor data will give us a sense of how people are flowing, how people are engaging, where they're engaging, how they're moving through an experience. It'll give us dwell time, all kinds of different KPIs that'll give us some kind of indication of how many people we're reaching,
(05:21):
where they're going, and what they're doing. So the first source of data is really sensors, to start to understand what people are actually doing in the experience itself.
We do surveys too, right? We do pre and post surveys, we do pre, post, and during surveys. We can do all kinds of things, asking people, you know, what they're thinking,
(05:46):
what they're doing, what their next action is, what their interpretation is of the kind of knowledge we might be transferring to them. And that gives us a different level of data and a different sense of the data, because it's something that's interpreted by the participants. So we can collect sensor data for what people are doing and where they're
(06:06):
going, surveys for how they're feeling and what they're intending to do. And then we can look at social, and this is one of the areas that I think is undervalued in the value proposition for experiences, because experiences are content
(06:27):
generators. And the opportunity to use an experience, whether it's a fan fest, a fan fest is a great example, as a platform for creating content, and then what impact did that content have?
So really that gives you a pretty rounded view if you think
(06:48):
about what people are doing in the actual experience, how they're feeling about it and what they're intending to do, and then what kinds of actions they've taken. That covers the spectrum you want to collect data across to give you an indication of whether you're doing what you need to do.
Laura Smith (07:07):
Does that resonate with clients or customers? Do they get it? Because it makes sense when you tell them. There's data there. Obviously, you're gathering research, all of that, so piecing that together and interpreting it will tell a story. But I could see some customers or clients in your world saying, maybe, like, that's not enough. You
(07:29):
know? Are we making money off of this experience? Does that hold you up?
Joe Panepinto (07:36):
It's a great question, and it's one that's asked all the time. And you do see both ends of the spectrum, as I think you all know. You have people that don't want us to measure because they're not being measured internally, and they feel comfortable that if their boss feels like it's successful,
(07:56):
it's successful. That's not a very strategic way to go about it, and probably not a very long-lived way to go about it.
But the desire to go from any kind of marketing intervention or any kind of marketing tactic to sale is the holy grail. It's
(08:20):
what everybody wants. They want to understand what contribution this particular interaction with the client had on their decision to make a purchase. Well, as you mentioned at the top, we don't often have a closed loop where we have identities of
(08:40):
people that we can then track through all the different touch points. Digitally, we get a more closed-loop system, right, from prompt all the way through to purchase.
But experientially, it drops off, very typically it will drop off. So we have to rely on the fact that we are
(09:05):
part of the marketing mix, an important part of the marketing mix. And where we can run experiments, natural experiments, I would say, where one group is exposed to an experience and another is not, we can compare and see, in this closed-loop system, whether they are purchasing more often or purchasing more. That's kind of the closest that
(09:31):
we can get. In general, though, for experiences, it's really about brand lift, it's about engagement, it's about a couple of single questions. Is this a brand for me? An NPS question, so a Net Promoter Score question.
Would you recommend this? Those are all correlated very highly with purchase behavior, so we can use those as proxies. But
(09:55):
going from an experience to sale is very difficult. Clients are asking for it all the time, and we're getting closer and closer as we try, and as the technology gets better, but that is a challenge, and it's currently not solved.
Brian Rowley (10:12):
And we'll talk about this a little bit further later on in this conversation, because I think AI also has a role there, right, to help us get to that point. But before we go there, because you've touched on a couple of different things, we said that, you know, sometimes people don't want to be measured because they're not expecting to be measured. But I think the other side to that too is
(10:34):
sometimes people fall into that analysis-paralysis mode where they have so much data that they don't actually know what to do with it, or they're expecting the magic number, right, to guide a decision. So I'm not so sure that it's that they don't wanna be measured as much as, okay,
(10:55):
now that I have this... I mean, we all fall victim to this. I can't tell you how much money we've spent over the years on roles of people who are collecting data.
My first question is always, okay, now that we have that, what do we do with it?
Joe Panepinto (11:07):
Yeah, it's a
great question.
Brian Rowley (11:09):
How is that changing the influence? I mean, any recommendations there? What is too much? When do you have enough? Because you mentioned frequent touch points, but when is there enough for you to start? Or is it really that you start to use your gut at that point?
Joe Panepinto (11:25):
I do think you start to use your gut. I do think you have to use your gut throughout. Right? You have to look at the numbers, and you have to interpret them. So what do they suggest?
We could talk about some of the AI stuff later, because once that becomes automated, there's some benefits. And then there's also some lack of transparency that could become difficult,
(11:48):
right? Because every system has assumptions built in. I think when it comes to data, my personal perspective is more isn't always better. You do have these people, and I'll go back to, you mentioned that I teach, what I go back to is: I can't grade the paper that you never wrote, right?
(12:08):
Always waiting for the exact right data to tell us exactly what to do gets in the way of making the decisions we need to make in order to continue to push forward in whatever marketing objective we have. So I think
(12:28):
recognizing and thinking about data on a couple of levels is important, right? There's indicators, and those indicators are going to give you some sense, which needs to be interpreted, of what's going on. And those indicators might be positive or negative, and you look at those and then you make some decisions based on them. And then there's the final outcomes.
(12:50):
And those two things are very different, right? So the final outcome is: what did someone actually do? And one point I like to make about surveys, which can be a little pedantic, but I can be a little pedantic, is just that when you're asking people what they did, you're still asking them, right? There's still a possibility that they're not telling you the truth. So we actually have a fairly limited amount of actual
(13:14):
outcome data. Yet what we know in general, at least on the experiential side, is that when people interact with a brand in an experience, in a live face-to-face experience, their rating of "this is a brand for me" goes up
(13:34):
significantly.
That brand lift lasts longer than exposure to an ad, and their inclination and purchase intent go up. And that all makes sense, because what an experience does is capture
(13:56):
attention for a sustained amount of time, and so many other media don't necessarily do that. But the world of marketing and media always talks in CPMs, right? Like, what is our cost for an impression?
And an impression at an event versus an impression on a scroll
(14:20):
is very different, right?
Brian Rowley (14:22):
Well, you also have the humanistic element, right? The experiential. And I think, you know, Laura and I have had this conversation on prior podcasts, just the power of the people, and to what extent do you use people within the organization to help tell the brand story? Because people interact
(14:44):
with people. That's where the value sits.
And, you know, if you look at it, when we talk about it, the hardware and whatever we're producing on a screen complements that. But that's not necessarily the hero. The hero is our people at the end of the day, because people gravitate towards people, and that's what helps brands, in my opinion.
Joe Panepinto (15:05):
Well, you know, I agree. I mean, the way we look at it is, when you come together in an event or an experience, we typically look at it as a cultural moment, right? Whether you're techies, whether you're Trekkies, or whatever different culture you belong to, those experiences
(15:26):
are an important moment in time for that entire sub-community or culture. And so they are more authentic and have more impact because it's concentrated, right? It's concentrated in your greatest fans.
And I'm not just talking about fans of sports, but fans of products, fans of different companies, right? And so when
(15:48):
you come together, that impact is much deeper because the attention is more important, given the variety of inputs that we have. So when you think about scrolling through a screen, anything that comes across a screen I would consider as one medium. As one medium, it doesn't matter. We break up social, it's an ad.
(16:09):
No one knows the difference between a social post and an ad, an influencer post and a paid or organic one. They are interacting through a screen. In person, you interact in person, and there is some value to that that lasts a longer time. And so trying to capture what that is, that's where we try to use multiple
(16:33):
methods, so some sensors, some surveys, some social, to try to capture that unique value. I'll give you an example.
So very recently, we ran a proof of concept. We were thinking, okay, one of the undervalued pieces of the value proposition
(16:55):
for events is the content that they create. So is content that's driven by an event, at an event, more engaging and more positive than general brand content? So we grabbed a dozen activations, B2B, B2C, from last year. We looked at, you know,
(17:17):
several million posts, not one at a time.
I didn't do it all myself. That would be impressive. But we looked at them, and what we noticed was that consistently, across more than a dozen activations, content that was driven by an event, or that was
(17:39):
created at an event, about that event, for any particular culture, was more engaging and it was more positive. So, you know, starting to think a little creatively about when you get people together, what value does that create for your brand, that's
(18:00):
the first step.
And then the next step is, okay, well, how can we measure that? What's gonna give us some indication of whether that's actually true?
Laura Smith (18:08):
Can we stick to the humanistic side of things for a minute? Because I do feel like that human role is important. So when we think about interpreting data, what role does empathy play in gathering and interpreting data? Because I feel like you have to. Tell me more. Well, I feel like you have to
(18:29):
understand, like, you know, because it's not just about the data.
It's, like, understanding the circumstances in which you're gathering the data, what that data could mean, how you're interpreting it. So, like, you have to think about the end customer, something about humans and how we engage with people. But, like, does empathy play a role in that?
Joe Panepinto (18:45):
I love that question. Empathy, very broadly. So probably awareness of everything. I might blow up the definition of empathy, right? So what can impact the data when we are measuring experiences, I can speak for experiences, there are several factors that you may not
(19:11):
consider that could impact the data.
For example, what was going on that day in the news? What was the weather like? What else was going on that day? What was the traffic like, for example? Right?
So there are all kinds of things that you need to bring some interpretive
(19:40):
skills to, and that's really important. So looking at the data, that's where it goes all the way back to the kind of throwaway line of, data just doesn't speak for itself. Data never speaks for itself. Data is an indicator of something. And then you have to do some interpretation to
(20:00):
understand.
And gut check, to your point, like, how many times have you looked at a report and you've said, that doesn't feel right, let me ask some more questions, and then found the reason the data wasn't correct? And that's what
(20:20):
I started to allude to: when this becomes a black box and you just get a report out, success, not success, you know, do more of this, do less of that. I think that's very, I'm not going to say dangerous for a brand, because danger is really alarmist.
Laura Smith (20:40):
It's siloed. It's
like siloed thinking.
Joe Panepinto (20:42):
Right. But then if you rely too much on systems to make those interpretations entirely and there's no human in the loop, and when we talk about AI, we talk about the importance of the human in the loop, when there's no human in the loop, there's also no differentiation. Right?
So, okay, I subscribe to this service that does all the data
(21:04):
interpretation for me and just tells me to do more of this, do less of that, and I don't understand why. That's going to flatten the differences between what marketing teams bring to the table, and I do think that marketing teams make a difference.
Brian Rowley (21:24):
And, you know, we talk all the time about balance. And I think between the humanistic side and the automation side, there's a balance there. Because I will say, you know, in the three S's that you mentioned, when you look at surveys, how many of us have been surveyed so many times that we're just like, not another one, I'm just not doing
(21:45):
it. And I think that ties back to Laura's comment somewhat with empathy. I think we have to realize what we are asking, and how this not only benefits the company, but what impact it has on the person you're asking to participate. Because that also very much skews your results. And I think
(22:07):
that's one area, and we can transition into the conversation about AI, but I think that is one area where AI will definitely help us. Because, you know, just as an example for us in the digital signage space, our players have a neural processing unit built into them that allows you to process
(22:30):
a lot of information. And so we are looking at things like gaze detection and object detection and all of those things, so that we don't have to ask people, we're just doing that. Which, of course, borders on the line of, all right, now you're being creepy. You're like, yeah, we're empathetic, but yet we're being creepy. So, I mean, I'm
(22:51):
just curious, Joe, from your perspective, do you think AI is actually going to impact the future of this whole strategy and how that all plays together?
Joe Panepinto (23:02):
Yeah, I mean, it's such a great question. I mean, AI is being implemented in multiple ways for multiple things, right? And it's being embedded in all kinds of existing things. So to describe AI in general, I'll probably make errors about it.
(23:25):
But when you think about it the way I think about it, what does AI excel at? Well, pattern recognition and data analysis. Now, the big models are doing better and better at pattern recognition by interpreting data, but they're
(23:47):
not perfect. This is one of those cases where, you know, if you're wrong two out of 10 times, it really matters, right? Like, I'll often use models to interpret Excel sheet data.
(24:07):
I get a ton of data, and there's Excel GPT, and then there's versions of the same in Perplexity, and there's gems in Gemini that are customized for data analysis. But I'll run it through all four models that I use on a daily basis. I'll take it and I'll see, just to see if there are any differences, because
(24:29):
I'll tell you what, sometimes there are, right? And then I have to say, that number doesn't look right. But then, Brian, it's funny, because then I go to, okay, well, how is the next generation of people going to actually be able to interpret it and look at it and say, that doesn't look right, versus, oh, that's what the machine told me?
(24:50):
And I'm not denigrating the next generation by saying, oh, that's what the machine told me. But the more you automate things, the more difficult it is to interpret and understand what's going on and what the inputs are. And that's one of the downsides, I think, of automating so much. Stepping
(25:12):
aside, let's not even talk about AI in terms of intelligence, but just automation: the more I automate something, the less I understand how it actually works and what the ingredients are. And very often, understanding the ingredients that drive an outcome, from the data perspective, can go all the way back to designing something
(25:34):
well, right?
Like understanding which parts and pieces of an experience drive an outcome. So I think it's a mixed bag. I love the idea that we're collecting data that we didn't have before. There was one, I'm sure you saw it, there was one activation.
(25:54):
It was several years ago, where the more people looked at a billboard, the more it uncovered information about a social ill that was going on behind the curtain. I think it was domestic abuse. And the more eyeballs that were on it, the more you saw it, the more it exposed the issue. And it really
(26:17):
made a point, right? So it was integrated into the actual experience itself.
You know, when we talk about sentiment recognition and facial recognition, I often think about, you know, people have faces that, you know... Can I curse? I'm not gonna curse.
Laura Smith (26:37):
You can curse.
Brian Rowley (26:38):
You can. Right?
Joe Panepinto (26:39):
It's like, what do you do with resting bitch face? I mean, what do you do when people are like, okay, well, jeez, they're mad? Right? Like, that's
Laura Smith (26:47):
who they are.
Brian Rowley (26:48):
Just do it.
Joe Panepinto (26:48):
Right? It's like, okay, well, I'm going to mischaracterize that. But it's like social analysis. Right?
Social analysis is not, I mean, it's not perfect. There are tools that interpret sentiment. There are tools that, you know, interpret stuff. They get better and better.
But there's still a need to interpret what you get back by
(27:08):
triangulating a whole bunch of inputs. And you probably got a sense, as I said, like, you know, who sends a spreadsheet to four different models? Why don't you just send it to one? I mean, I'm that guy where I'm like, okay, I want as many perspectives as possible, then I can begin to, you know, sort through it.
Laura Smith (27:27):
You like to nerd it
out.
Joe Panepinto (27:28):
A little bit. Yeah. Well, a lot.
Laura Smith (27:30):
I can see that.
Joe Panepinto (27:31):
I told my wife, I was like, this is like a video game. She said, no, it's not.
Laura Smith (27:36):
Real life data.
Joe Panepinto (27:38):
Yeah, it's working.
Brian Rowley (27:39):
But a lot of
people, I think, look at the AI
conversation as an eitheror. Idon't think that's the
discussion. I think thediscussion is you still need
that human interaction to beable to say to your point, like
I understand the industry wellenough that I can look at this
(28:00):
data and I can call out orquestion parts of it because I
know it enough. But we can'targue the fact that it does
provide us access to a ton ofinformation a lot faster than we
would have ever been able to doit. So that businesses can make
the pivots that they need tomake a lot faster versus waiting
for a campaign to run itselfout, gather all the data, sit
(28:23):
down, do a post mortem of thecampaign.
I mean, we can do this on thefly right now. And I think
that's the part that's reallyvaluable, but you still need the
people who can interpret thatand understand, okay, I know
this is what I should beexperiencing. This is way off.
So I'm not sure I'm gonna trustthat. But I mean, the access to
the data I think is amazing.
(28:44):
But I don't think it's thateither or conversation. I do
think it's a combination still.
Joe Panepinto (28:48):
I think
Brian Rowley (28:49):
anybody who approaches AI as, it's going to be the replacement, to your earlier point, I think that's a strategy that's gonna fall short.
Joe Panepinto (28:57):
I agree. I mean, I think it's interesting. There's a couple of points I heard in what you said that I think are worth emphasizing. I mean, the first is just using AI as a tool, a partner, a teammate. You could read about that all over the place, and those people and teams that do that are more successful. But also recognizing, you know, I say it all the time:
(29:21):
it used to take a lot of time to do a SWOT analysis. Right? Mhmm. Not a lot, but some. Right?
To do it well, it took some time. Now it takes nothing. It takes five minutes, it takes ten minutes, it takes whatever.
But now you get, this is what you're supposed to do. Now, how
(29:43):
do you align a whole bunch of people to actually, first of all, believe that, and then to actually do it? Those skills are extremely valuable. And it's not either-or. I can get a SWOT, but then I have to convince people that that SWOT is accurate.
Why is that SWOT accurate? Well, I've got to go back to some of the original sources. Or it indicates that we should be
(30:05):
doing this. Okay, well, what you'll hear in the promises of AI is, well, then it'll automate the workflow. Okay.
But at some point, it's a human that's interacting in the system and has to understand, and then get other humans to do things. And you know what? If history is any indicator, I mean, we're
(30:30):
messy. People are messy. And they take, yes, some data, but data alone doesn't convince people to do things. Lots of people see the data, say, I don't think that's right, and go do something else. So those human skills are just incredibly important.
Laura Smith (30:50):
And I think that goes back to a point you talked about earlier too, which is experience, right? Experience over time, doing something, understanding it. And with the newer generations, obviously, data will be very much at the forefront of what they're doing, you know, with AI and understanding that. But I do think we need to make sure that they understand that it takes time and experience and
(31:11):
exposure to certain things to have that ability to interpret and do exactly what you're doing, Joe, because it just can't be system-based. There's a reason why we have our jobs, and a reason why experience over twenty-plus years matters. And sometimes, you have discussions with people, and I've said it before, you could disagree on something.
(31:32):
And even if data's at the basis of what the disagreement or the decision-making is about, sometimes I'm like, I know this because I've done this for a really long time. So trust that I'm gonna take that data and filter it through lots of other pieces of information or experiences and have a different opinion and outcome and recommendation. And that may
(31:53):
be looked at as, oh, because I don't have the experience, I don't know. No.
I just think you have to trust that, over time and over exposure to certain strategies or tactics, it matters, and it can influence and work in combination with the data.
Brian Rowley (32:06):
That's that instinct. And that's what it boils down to. Based on what I know, based on the time I've been doing this, this doesn't look right. And I think it is perfectly okay to step into this world that we're all stepping into and be able to say that. But also to step into the world understanding, hey, this is a
(32:27):
really powerful tool that can get me from point A to point B a lot faster, so that I can then make my decisions based on that information.
Joe Panepinto (32:36):
Well, I mean, I love what you were saying, Laura. Now, to talk about nerding out, right? I'm old enough to remember the internet, you know, when everything rolled out, the Internet, and it shifted things. When I think about what these general-purpose
(32:56):
technologies are gonna do, right, I give a little bit of a hierarchy. They're gonna impact tasks, then they're gonna impact roles, then they're going to impact jobs, then they're going to impact departments, then they're going to impact divisions, then whole organizations, whole industries, right?
It goes that way. So when you're talking about the value of
(33:21):
experience, the value of experience also shifts, right? So I'll give you an example that I see all the time. Going back to, say, the SWOT analysis, it's very easy to generate one and spit one out. And I've gotten them, I get them from my students all the time, right?
Then I say, why? Why is that? And what does that mean? And
(33:49):
it's challenging, right? Because you haven't done the work in the same way.
Now, I'm not saying that to say, oh, they haven't done the work, so they have to go back. It's like, okay, well, we have cars now. And it's like, well, you're never going to know how to shoe a horse, so you have to go back and figure out how to shoe a horse. Right? We're not going to do that.
However, now we have to think kind of hard about how we get
(34:12):
to those next skills, and what are those next skills, and what
is the value of your experience? If the value of your experience
isn't putting together the SWOT analysis or putting together the
plan or the comms plan, what is it? Is it helping to convince
people that this is right? Is it asking why, for alternatives? Is
(34:34):
it, you know what I mean, understanding what next steps
might be?
And so the nature of skills is shifting because the tools are
shifting. Just like if you think about AI as a tool and you think
about physical tools: if you build things, a power saw
changes things dramatically, right? What you can accomplish,
(34:54):
what you can do. At first, you just do what you used to do,
faster. But as it becomes more commonplace, we don't know what
the jobs are gonna be.
We don't know what the most popular jobs are going to be five
years from now. We actually don't know what the largest
companies are going to be five or ten years from now,
because they don't even exist.
(35:17):
And that's really hard. Humans are really bad. We are so bad at
predicting the future.
We're awful. We're terrible at predicting the future. Tony
Robbins, there are some of those people. Maybe they're better,
but they're not. It's like, we're horrible at predicting the
future.
We can't envision jobs that aren't here. That's why those
visionaries, people that start businesses and do things. Imagine
(35:39):
starting a business today and what it would be like with the
tools that we have. Any business, any
industry, imagine what you're starting with. It's
just like I said with the power tools. Imagine starting a house
building business with a whole bunch of hand tools versus a
(36:02):
whole bunch of automated tools.
Laura Smith (36:04):
Right. There's,
like, that
meme with that picture of Jeff Bezos in his first Amazon
office, which is like
Joe Panepinto (36:11):
Oh god.
Laura Smith (36:11):
It's like an old
computer and, like, papers on
the wall, that's all there is. You know? And now think about,
like, none of those papers or
those big machines exist. It's like, yeah, it's crazy.
You're starting from a very different point to where we are now
if you're starting the business. Yeah.
Joe Panepinto (36:26):
And you know
what's interesting? I'm sorry,
just to build on that. What I also talk about quite a
bit is, like, you would have thought, right, selling books
online, Barnes and Noble, they're going to dominate. Nope.
It was someone that started from scratch. So like Lemonade, for
(36:50):
example, Lemonade Insurance, right? Lemonade is
all about starting up, you know, right from the bottom up with AI
tools as an assumption. And that's what we're going to see as a
new generation of businesses. We're not seeing them now, but
we will.
And I don't know what it's going to look like, but, you know, the
skills,
(37:11):
the jobs, the tasks, the roles, they're all gonna be different.
Brian Rowley (37:15):
I think a lot of
it will go back to some very
basics though in that process. And that kind of goes back to
the experience side of things, because there are certain things
that I think every company needs. Because today, to be
honest with you, starting a business with the amount
of information that is at your fingertips the day you open your
doors, or whatever the case may be, is so much more robust than
(37:38):
anything that I think any of us have seen during our careers.
But there are some very key basic principles,
right? Mission, vision, all those things, right?
Around setting up and establishing, because I think
you can get very lost in
that. And I think it's easy to
be in that analysis paralysis mode if you don't really
understand who you are. We say it all the time: as a brand,
(38:01):
I think the biggest thing is for a brand to understand who they
are and not try to be everything to everybody, because that
never works. It's not a strategy. So I think those are some
interesting things.
Laura Smith (38:10):
Well, this has been
fun. I feel like we've done data,
we've nerded out, we've done AI. We're talking about all
this stuff. This has been super fun, but we're gonna segue to
our hot seat segment. Welcome, Joe. This is a surprise.
(38:35):
Joe does not know that he was entering the hot seat
today.
Brian Rowley (38:38):
And we have a
jingle.
Laura Smith (38:40):
I love it. I love
it. Thanks, Joey.
Alright. So for today's hot seat, we're gonna throw out some
experiential and strategy trends, and we want you to tell
us if each one is overhyped or worth the hype.
Quick: you could say one sentence, one word, whatever
you wanna. We have, like, six to throw out here. So
I'll start with metaverse marketing.
(39:03):
Overhyped. Overhyped. Like, dead. Dead. Do we even have the
metaverse anymore? Is this still a thing?
Joe Panepinto (39:08):
I don't
Laura Smith (39:08):
even know.
Joe Panepinto (39:08):
Is this the
information superhighway? Is
that, like, right?
Brian Rowley (39:12):
I think that's
actually been the consensus over
our conversations with people about the metaverse. How about
predictive AI dashboards?
Joe Panepinto (39:20):
What is it?
There's overhyped and...
Laura Smith (39:22):
Worth the hype.
Worth it.
Joe Panepinto (39:24):
It's worth the
hype. Okay.
Laura Smith (39:26):
AR-enabled retail
experiences.
So you probably have experience
Brian Rowley (39:34):
with
Joe Panepinto (39:34):
this. I mean, it's
worth the hype in the right
circumstances.
Brian Rowley (39:41):
Is it the AR side
to it? Or is it like if that was
immersive instead of AR?
Joe Panepinto (39:47):
Yeah, a little
bit. And it depends on how it's
implemented. And it also depends on the shoppers. I spent a
few years in retail consulting and training, right? And
there are a lot of aspects to the analog shopping
(40:08):
experience that people really value and love.
And so I think as an option, as an alternative, and as an add
on, yeah, sure. Will it replace it? No. I mean,
still more than 80% of retail shopping is done physically.
So, yeah. I'm certainly not a headset
(40:33):
person. I'm not a big headset person.
Brian Rowley (40:35):
Yeah. No, I'm not
Joe Panepinto (40:36):
a headset.
Laura Smith (40:36):
Is the metaverse dead
with AR? Because I don't feel
like I even see it or hear about it.
Brian Rowley (40:43):
Yeah. Persona over
optimization.
Joe Panepinto (40:49):
Persona
overoptimization? Personas, I
think, are overhyped. I mean, optimization and interpreting
data. I have a good example. I was working with a client, and
they said they had personas, and they had, like, 30 of them.
What am I going to do with 30 personas? Right? Like, a persona
is supposed to summarize stuff, make it easier, because I'm
(41:09):
trying to reach an audience. Now, if I'm working in a medium
that is hyper-targeted, that helps. But I don't work in a
hyper-targeted area of marketing.
So for us, it's overhyped.
Laura Smith (41:23):
Immersive pop up
experiences.
Joe Panepinto (41:26):
Worth the hype.
It's worth the hype. I mean, for
certain cultures, people wanna experience stuff, and
while they're physically at a place, they'll nerd out. So you
could take someone and put them on a flying carpet. You could
take someone and, you know, when you
(41:50):
think about B2B, when you talk about digital twins and
the development of digital twins, you know, they love
to be able to see it.
And I think they're worth the hype.
Brian Rowley (42:02):
Real-time A/B
testing everything.
Joe Panepinto (42:08):
If my system does
it, yeah. But if you're doing it
the traditional way, I do think, getting to your point earlier,
Brian, you get to the point where it's not analysis paralysis,
it's analysis overload, right? You're spending so much time
analyzing stuff, you're not really spending any time
(42:29):
interpreting it, and I don't think that's worthwhile. I do
think you have to act, and do stuff, and bring your
gut and your instincts to it, to just continue moving forward
and measuring and going. With A/B testing everything, the
challenge and
(42:50):
the problem I always see with it is there are methodological
challenges that are always a part of A/B testing, right?
So you're not really getting the interpretation that people
have. I mean, the closer you can get to their actual consumption
of
(43:12):
your media, the better. But, you know, going back
to what we were talking about: A/B testing an ad, which
is better, right? That's easier,
because then I could throw them out in the same
media. But everything else, the world today, the conversations
(43:33):
today are different from the ones tomorrow, and are different
from the ones yesterday. And those things matter, right?
So think about the message of an ad.
The message of an ad today, you know, is gonna be interpreted
differently based on the big news stories of that
particular day, for example. So I wouldn't put all my eggs in
that basket.
Laura Smith (43:52):
Okay. Last one,
Joe. Gamified marketing
campaigns.
Joe Panepinto (43:59):
Yeah, I'd say
overhyped. You know, we as
marketers think people want to be engaged all the time, and
they don't, necessarily. They want information sometimes. I
mean, we think, oh, maybe they'll play a rebus or
(44:22):
they'll, you know, fill in the blank. It's like, no, they want
information most of the time, or a lot of the
time.
So it's about understanding when to bring gamification in, when
the mindset of your consumer is "I want to play a game" versus
"I want to get information." And I see too often, people
introduce games
(44:42):
into a consumer journey that actually is an information
gathering journey or a purchasing journey, and it gets in
the way. That doesn't mean they're not fun and
interesting. Again, it goes back
to when your audience is in the mindset where a game would be
appropriate or fun.
Brian Rowley (45:01):
Understanding your
audience. Again, some of the
basics that we need to make sure we keep in check. They're all
great. I mean, at least from what I think, you know, there are
some really cool tactics from a marketing perspective that as
marketers we can pull, but it's understanding where your
audience is in their journey and putting those tactics in place
in the right places and not the wrong places. So, to your point,
(45:24):
you don't slow down.
Laura Smith (45:26):
All right. Well,
thank you, Joe. This has been so
much fun. I feel like this has been a long one because we just
really peeled back some layers on some good
conversations. So we really appreciate you joining us, and
we'll talk soon.
Joe Panepinto (45:39):
I appreciate
you bringing up the conversation
and having it. I think it's really valuable.
Laura Smith (45:45):
Awesome. So Brian,
what'd you think?
Brian Rowley (45:48):
You know, I think
it's such an interesting
conversation, because I think we always get caught in those
conversations around how much data is too much data, because
we're all trying to solve for the answers to ROI on
investments that we're making, and what are we using to justify
that, and how are we doing it, and where's the spend going, and
all that stuff. But I do think there were some really
interesting
(46:10):
topics that we discussed here. I mean, to your
point, we covered a lot.
Laura Smith (46:15):
Covered a lot. And
I'm a data nerd, and I do love
data, because I feel like you rely on data, but I also value
Joe's perspective on needing to have that human
interpretation, that experience, trusting your gut, because I
think that does matter. And I don't know if everyone believes
that, because sometimes I think people just only want to look at
the data.
Brian Rowley (46:34):
I think it's the
world we're in. And I think, as,
you know, leaders in businesses, it's our
responsibility to step back for a second and, as much as the
data is being injected in full force, make sure
the humanistic side is equally
as relevant in the conversations that you're having when you're
(46:54):
actually reviewing that data.
Laura Smith (46:56):
And it's our job as
leaders to educate up, down, and
around that that is the important part of the process.
So it's not just the numbers.
Brian Rowley (47:03):
100%.
Laura Smith (47:04):
Thanks, everyone, for
listening. And most importantly,
if you liked what you heard today, be sure to follow us. If
you want to hear more from Joe Panepinto, you can find him on
LinkedIn.