Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:05):
Welcome to the Analytics Power Hour: analytics topics covered conversationally
and sometimes with explicit language. Hi everybody, welcome. It's the Analytics
Power Hour and this is episode 261. A long time ago,
it was January 3rd, 2015, the first episode of the Analytics Power Hour
(00:31):
aired. And we didn't think much about the future back then.
I think we were just trying to share ideas and keep a conversation
going with people that we liked from different conferences we'd attended.
And we were trying out a new format
and wow, 10 years can go by really fast. And
(00:53):
over the years, we've had some amazing opportunities to take the Analytics
Power Hour to places we never would have imagined. 261 episodes,
multiple live events across multiple continents, and a chance to interact
with so many amazing analytics people who made the choice to listen and
(01:13):
interact with us along this journey. And to everyone who's been a part,
both as a co host, as a guest, and as a listener,
I mean, we can only just say thank you. Thanks for an amazing
10 years. It's been a good run and we're done. That's it.
And this is the final episode. No, it's not.
But it is the year-in-review episode. All right. So one of my
(01:38):
favorite episodes every year. And in our profession, a lot of times in
analytics, we get more attention for what's wrong with everything. In fact,
we analytics people, we tend to be good complainers about all the different
things that are broken and don't work right. People don't listen.
And maybe this time around, we're just gonna do something where we focus
(02:01):
on the positive, Tim. Do your best. It's just an hour.
In this episode, we're just gonna talk about some of the best things
we saw this past year. So let's jump into it. Let me introduce my
co host, Tim Wilson. Welcome. Two continents. Is the third one we do in
(02:22):
Antarctica in 2025? I sure hope so. Live with the penguins.
And we can do that in Australia, apparently now. I mean I was right. Speaking
of that, Moe Kiss. Welcome. Hi, how you doing? I feel like today's gonna
be a real role reversal 'cause I'm feeling particularly pessimistic. And
(02:42):
I might have to steal that from Tim today. All right, well you. Freaky Friday.
That's right. And, of course, Josh Crowhurst. Welcome. Hey, good to be here.
Awesome. Thanks for being here. Val Kroll. Welcome. Hello. Holding things
(03:04):
down from Chicago. Fourth Floor Productions. That's right. Fourth Floor Productions.
And Julie Hoyer. Welcome. Hey there. Awesome. And I'm Michael Helbling.
All right. So it's worth mentioning as we get started. Like,
each of us does kind of very different things in our day to
day. And so what I'm excited about as we talk through some of
(03:24):
these topics, I actually think there's a lot of different ways we look
at the world and a lot of diversity in what we handle.
And so I think actually what we're gonna talk about, it kind of makes
me really excited. Because when we go through this stuff, we'll all have
different perspectives and different experiences. And sometimes maybe even
differing opinions. But that usually never happens, right? If you've listened
(03:46):
to this show for 10 years. For 260 episodes, we've always been on
the same page. That's right. In lockstep on every idea. All right. Let's
dive into it. So we're gonna talk first about the best thing we
saw this year in data collection and pipelines in our work.
So I don't know who wants to kick us off, but I'll throw
(04:07):
it out there. It's sort of like one of the things we do
in analytics. We collect a lot of data. What was something awesome you
saw in that area this year? Okay. Well, full disclosure. I feel like
this year has been the year that I have really
started to work more closely with Snowflake. And full disclosure, that's
who Canva uses as one of our partners. And
(04:30):
I've just really been blown away by some of the technology they've been
building. The Snowflake feature store is particularly cool because I feel
like lots of companies try to build, like, a feature store,
or a CDP, or whatever you wanna call it,
in house, especially if you have an in house data warehouse.
And Snowflake have made it, like, they've really simplified the ability
(04:51):
to build a feature store that then particularly can be used for ML
models. So that's been really cool. And Snowpark is the other thing that's
been pretty awesome to see 'cause it lets our analytics engineering team
program in multiple languages. I know my team in particular have managed
to build a bunch of pipelines this year using Snowpark that have saved,
(05:14):
like, days of full time, like, FTE every week. And so
for me, it's really been kind of leveraging what other companies are doing
well and then kind of using that to drive our innovation.
But, like I said, I am very much on the Snowflake bandwagon in
2024. And they did not pay for that endorsement.
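For listeners who haven't touched Snowpark, here is a rough sketch of the kind of Python pipeline step being described. The connection details, tables, and columns are all hypothetical placeholders, not Canva's actual pipelines; it only illustrates the idea of writing a Snowflake transformation in Python instead of hand-maintained SQL jobs.

```python
# Rough sketch of a Snowpark Python pipeline step. All names and credentials
# below are hypothetical placeholders, purely to illustrate the approach.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

connection_parameters = {
    "account": "my_account",      # placeholder; normally pulled from config/secrets
    "user": "my_user",
    "password": "my_password",
    "warehouse": "ANALYTICS_WH",
    "database": "ANALYTICS",
    "schema": "PUBLIC",
}
session = Session.builder.configs(connection_parameters).create()

# Aggregate raw event rows into a daily per-user table that downstream
# feature-engineering or ML jobs could read.
daily_events = (
    session.table("RAW.EVENTS")                     # hypothetical source table
    .filter(col("EVENT_DATE") >= "2024-01-01")
    .group_by("USER_ID", "EVENT_DATE")
    .agg(sum_(col("EVENT_COUNT")).alias("TOTAL_EVENTS"))
)

# Materialize the aggregate for downstream use
daily_events.write.mode("overwrite").save_as_table("ANALYTICS.DAILY_USER_EVENTS")
```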
(05:38):
They did not. Well, speaking of partnership, Moe, I'm glad you brought that
up 'cause one of the greatest things I've seen
this year in, like, data collection in general, is actually we've been working,
like, cross groups, skill sets, capabilities, we call them practices, on
some clients. And it's been really great to see we've
(06:02):
started to partner better around implementations in general. We actually
have had a couple clients where we've had implementation work and are helping
run their paid media. And in the past, that hasn't always happened as
often when we're, like, in an engagement with a client. And so we
were able to kind of take a step back and talk about,
like, how do we take all of this into context when designing our
(06:23):
implementations and better serve the client and kind of the big picture?
We talk about that a lot, obviously, here on the podcast of,
like, how do you take that bigger context into consideration? And so I
was really happy that we were able to start to actually put that
within our process for some re implementations that we're currently doing
for clients. And it was interesting, too, because
(06:45):
we had to really break down the understanding for the client and even
internally, like the best practices they do in paid media compared to the
best practices we do in certain reporting tools like Adobe, for example.
And so making that a single connection point was actually something we spent
a lot of time on. And I think it was really valuable for
(07:07):
like, us and then obviously the client and the output they're getting.
It's interesting. In my world, a lot of my clients this year were struggling
with various parts of consent management, and that's been around for a while.
But there were some changes to some things this year that a lot
of companies had to grapple with. And it was cool 'cause one of
(07:31):
the folks here at Stacked Analytics, Charlie Tice, who's pretty amazing,
so not something I did in data collection, but I got to be
part of a team that did some amazing things. And
just watching him break that down into very simple terms for our clients
and create elegant solutions that actually solve the problems that we're
(07:51):
facing to get that right. Because it's kind of really imperative to get
that stuff right now, like you can't really fake it. And it's crazy
to me how bad, as an industry, we still are
at that aspect of data collection. So that was my highlight for the
year is just sort of how we were able to track down and
really get some good solutions in place for companies this year.
(08:14):
And mostly Charlie did it all, but I was very thankful to be
part of it. It's time to step away from the show for a
quick word about Piwik Pro. Tim, tell us about it. Well,
Piwik Pro has really exploded in popularity and keeps adding new functionality.
They sure have. They've got an easy to use interface, a full set
(08:35):
of features with capabilities like custom reports, enhanced e commerce tracking,
and a customer data platform. We love running Piwik Pro's free plan on
the podcast website. But they also have a paid plan that adds scale
and some additional features. Yeah, head over to piwik.pro and check them
out for yourself. You can get started with their free plan.
(08:56):
That's piwik.pro. And now let's get back to the show. My contrarian take...
Well, first, I will say I've not worked with it, so I've just
seen kind of the videos around it. But on the data collection front,
the Nightjar from Moonbird.ai, a very much Adobe Analytics focused thing.
Just what I liked about it is they seem very, very focused and
(09:19):
not like this is gonna solve everything. They're like, we have identified
a problem that we have lived and felt the pain with.
But I haven't actually used it. I've only sort of... I know the
people who are doing it and have liked what they've been saying it
is doing. But for me, there seems to... Picking up more signals around
(09:39):
doing less data collection has kind of been what I've gotten most kind
of excited about this year. And I probably pointed to Matt Gershoff kind
of laying out the just enough, just in time
mindset as opposed to just in case. And you kind of stack.
I mean, Annie Duke just last month came out with something where she
was kind of theorizing that we overcollect data
(10:03):
partly so that we can talk about all the data that we looked
at if something doesn't turn out as we'd like it to. So,
I've been... And even Jen Koontz has kind of been also talking about
that. A lot of it's been under the starting point of privacy and
how we need to be very disciplined about what data we're collecting on
(10:24):
users for privacy reasons kind of writ large. But
I've spent a good chunk of this year
thinking around I think a lot of organizations. They have enough data.
It's clean enough. And maybe they shouldn't be
worrying about doing more or better or more sophisticated. So,
(10:45):
that may be... That's tying up what I've seen with kind of the
focus of like my personal journey this year. But
I really liked Gershoff's framing of just in case is what we default
to but getting just enough, just in time. I love
that framing as a way to go about thinking about data collection.
(11:08):
Me too. Like that actually is like... I started to do Chef Kiss and
say it instead of just having the emoji. But, I mean,
it's a podcast. The emoji wouldn't work. I guess the only caveat or
the catch is we can see that in the data,
I guess, community in 2024. But I don't feel like our stakeholders are
(11:29):
there yet. And I think that's the really difficult thing is like we
might say to them, just enough, just in time. Did I get it
right? Yeah, just enough, just in time. Yeah. But
I still feel like there's a tension, right, where the business is like
track everything. And you're like, no, that's really not a great idea anymore.
So, maybe that's where we get to for 2025. I mean, I think that's a
(11:54):
great point. I think that it's an easy thing to default to. I mean, I
go back probably 15 years ago to being told,
hey, this tracking is gonna be really easy. We just want you to
track everything. Like it's easy to articulate. So, there's definitely a
business partner management aspect to it because it's... They feel like
(12:14):
obviously we need this. Like in the absence... Like clearly we need the
data. And I think AI has driven a lot more around
the, oh, you've got to have more data, the more the better, to feed
into the monster machine. And it's got to be really, really clean.
Benn Stancil had a post last month about kind of calling that into
(12:37):
question, which I thought was really good as well. So,
I think there is a need to bring along
our business partners with, wait a minute, what are you trying to do?
Let's not be prescriptive that you need this data.
There's no value in the data. There's only value in what you're doing
with it. So, let's talk about what you're gonna do. But that's a...
(13:00):
I agree. I think it's got to... Maybe it's got to start with
the data community to have that mindset to not just kind of take
when somebody says we need this data to just say, well,
not having it or having it. Clearly, yeah, we don't have it.
We can default to "we need it" instead of defaulting to
"maybe we don't need it," and we should question it without
(13:21):
burning our relationships. And even if you were to look
retrospectively... And, Julie, I've heard you talk about this several times,
and I love the way that you put it. So, if I bastardize it, please correct
me. But the number of times when, like, the just in case actually
saves you is actually pretty slim. Like, trying to torture data for the
purpose it wasn't intended to be collected for gives you that,
(13:44):
like, where's the wind blowing direction kind of take on things,
but it's actually not always customized to answer the specific question
at hand. And so, how much are you really missing out on,
right? Like, I've heard you... I don't know, Julie, if you have
thoughts. This is definitely coming from your brain, but I totally agree
with you. Yeah, no, I would say most of the time people say,
(14:06):
oh, well, we're collecting data in the vein of this question.
We should be good. And then by the time we really get into
it, it's like, oh, actually, sorry, you're missing, like, a key setup,
configuration, the way the data is actually coming in or is implemented.
Like, it's not gonna do what you think it's gonna do.
So, I run into that way more than, oh, cool, you already have the
(14:28):
data. It's perfectly set to answer this question. That, like, never happens.
Like, I've learned my lesson. Serendipity. Well, and of course, with any
topic, we're gonna find some things to be
concerned about. But let's move on to our next category. So,
this year, what was the best thing we saw in the area of
experimentation and optimization? Like, what things did we see in that category
(14:53):
this year that we liked? Okay. So, I'll go first in this one.
I'm actually excited to talk about this. So, just before
I left Further earlier this year, I was working with the optimization team
on a cool project for consent banner optimization.
So, as we know, Consent Banners are a little bit Wild,
Wild West. As you mentioned, Michael, some changes
(15:16):
happening this year. And so, I say Wild, Wild West 'cause we don't
really know kind of, like, what works in terms of right practices or
how do we create the right transparency and allow people to really hold
the remote on their preferences. So, the team started with a lot of
research on, like, what capabilities are out there. OneTrust offers some
A B split testing. So, we could do a little research on what
(15:39):
levers there were to pull. We did a lot of research to develop
a POV on what Further wanted to recommend to clients to be testing
and what did we wanna ethically support to make sure that we weren't injecting
any dark patterns and that we were really focused on education
and not obfuscation of information to really increase brand trust. And then
(16:01):
we put some of that together to run some tests on their own
for their website. And the first one, which was really just kind of
playing around with some of those basic, I'm embarrassed to say it, almost
button color principles of contrast just to really make sure people didn't
think it was a modal, that it wasn't gonna stop their behavior,
but that it allowed them to interact and set their preferences,
(16:22):
we saw a huge increase. And so, we were able to kind of bundle
that together to put together a nice solution to be able to share
with clients and having that case study already in hand after
60 days of research and work 'cause it was definitely a labor of
love to pull that together quickly. But it was really kind of fun
'cause it kind of felt like something I hadn't seen before.
(16:43):
I was able to engage with lots of different clients across a lot
of different industries, but there's definitely thematic, some of the similar
things that come up time and time again. So, it was fun to
kind of engage with this totally new and kind of like total clean
blank slate. That's cool. So, I missed it maybe, but what are you
optimizing for in the cookie banner? Yeah, like what's the actual metric
(17:06):
that you're... So, some of the times when you're testing, it's just about
interaction with it altogether so that as maybe new regulation comes in,
that affects it, that they know how to get people to interact with
the banner itself to either set your preferences or accept all cookies.
But there was some other tests that we hadn't seen the results of by
the time I left. So, definitely reach out to Lucy over at Further
(17:29):
if you want more information. But we were trying to figure out how
could we assess its impact on brand trust. That was one that we
were really interested in because with some companies, with some sites,
the cookies are the cookies, you have to be able to navigate.
But because the banner has to be present, if there's really no other
lever than the limited levers that you have to pull, I should say,
(17:53):
that brand trust is the one that feels like the most meaningful way
to invest in that. So, that was one that was kind of in
the works. Yeah, very cool. Well, maybe I'll chime in with one here.
It's a little bit out of left field, but I've been becoming really
interested in experimentation in the realm of sports science this year.
(18:15):
So, yeah, I've gone down a lot of rabbit holes on studies that
have been done on how athletes should train to do better in endurance
events. So, like marathons, ultra marathons, cycling events. And there's
been a lot of RCTs done, a lot of
research and experiments into it. And one area that I'm looking at is
(18:37):
like the research into just training most of the time in a low
intensity, really light. Like if you're training for a marathon, just doing
like 80% plus, long, slow, light running and not just feeling like you
need to work as hard as you possibly can like every training session.
And like this is personally relevant for me because I do
(19:01):
this long distance paddling endurance sport, right? So, I'm kind of the
nerd in the team, like going out there and being like,
okay, what does the science say? And that's totally not the culture at
all. Like, it's really like people are very resistant to... Trust the data.
We should back off, just, you know, easy. Yeah, yeah. This guy. Yeah.
(19:24):
So I'm just the one nerd in the corner doing that.
And it's definitely not the culture to train in that way.
But the thing is, our rival team this year
kind of did that. Like, they put that
into practice and just did all this really light training. And even within
their team, I know there was a lot of skepticism and doubt and
people were questioning it. And then we had our races and like they
(19:46):
smoked us, they completely smoked us. Whereas last year they didn't.
So I know that's like n equals one. But
I still thought it was pretty interesting to see,
applying some of these learnings from other sports and the exercise science
that's being done and getting a good outcome.
(20:08):
So anyways, maybe I could push for next year
to listen to the nerds. Well, I was glad to hear this wasn't
associated with like a gambling thing or something, so that's good.
Yeah. Josh, I find this really interesting. When you discovered this, were you really
surprised? I feel like this is counterintuitive to what I thought the research
(20:30):
said about like, doing short bursts in intervals and like
it seems really... Is it just because it's endurance sport that it's different?
Yeah, so it's... You sort of need to have both. And I'm definitely
looking mainly in an endurance context, but I think this also applies to
like shorter events. But basically the idea is that you do the long
(20:51):
slow as a way to build up your aerobic base, which is
gonna help you basically absorb oxygen more efficiently into your muscles
and clear the byproducts more efficiently. And then you still need to do
that high intensity as a component of your training. Sorry, I'm getting
(21:13):
into the weeds a little bit in this. No, I'm here for it.
But you only need to do a little bit of it because
your system reacts really quickly to high intensity training. So you get
those adaptations quickly, but it reacts really slowly to the long endurance
training you need to get that base, but it just takes way longer
to do it. So you can get away with just maybe leading up
(21:35):
to the races, you incorporate more of that high intensity.
So that's sort of my understanding of it. I can say I've
actually been seeing a lot of articles about that.
My challenge is that like, I'm like, the range from my max effort
to trying to go slow is really, really small. So if I'm running,
(21:56):
I'm like, wait a minute, oh, I'm supposed to slow down.
I'm like, I'm already going pretty slow. So
what do you do? You're already optimized, Tim, is what you're saying.
Like, can't go much slower. Like, you can't really call it
a run if you're walking. My 80% looks like walking. Oh, that's a
(22:19):
good one. Yeah. I love that. One that I wanted to bring up
is actually against the popular opinion as well.
I had gone to one of the latest TLC
Friday sessions and it was a Georgi Georgiev event, and he was talking about
the difference between observed power and observed MDE compared to like
(22:44):
what you set when you're designing your test and how a lot of
people will conclude because the test ends and then they check their power
and they say, oh, it was underpowered, we, like, can't use it or
its conclusions. And he was pretty much saying, that's not true and you
shouldn't actually look at your observed power because it's different than
(23:04):
the power that you use to design your test. And I
don't fully understand it enough to give you a nice synopsis here,
but he has blog posts out and some LinkedIn posts about it.
But it was really interesting and he just said how
it really changed his understanding and way he actually uses power and even
like MDE and that a lot of people in practice
(23:28):
are still relying on observed power to make conclusions and that that is
actually not a best practice. But I know that's not a popular thing
yet and I think a lot of people,
it's gonna take a while to see that change in the actual practitioners.
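For anyone who wants the intuition behind that point, here is a small illustrative sketch (my own illustration, not from Georgiev's talk): for a simple two-sided z-test, the "observed power" you compute by plugging the observed effect back in is just a transformation of the p-value you already have, which is why it can't add information once the test has run.

```python
# Illustrative only: post-hoc "observed power" for a two-sided z-test is a
# direct function of the observed p-value, so it adds nothing new.
from scipy import stats

def observed_power(z_obs, alpha=0.05):
    """Power if the true effect equaled the observed effect (the post-hoc calculation)."""
    z_crit = stats.norm.ppf(1 - alpha / 2)
    return stats.norm.sf(z_crit - z_obs) + stats.norm.cdf(-z_crit - z_obs)

for p in (0.01, 0.05, 0.10, 0.30):
    z_obs = stats.norm.isf(p / 2)   # |z| implied by a two-sided p-value
    print(f"p = {p:.2f} -> 'observed power' = {observed_power(z_obs):.2f}")

# A test that lands exactly at p = 0.05 always shows roughly 50% observed power,
# regardless of how it was designed, which is why calling a finished test
# "underpowered" from this number alone is circular.
```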
And I was pretty shocked and it was a really great session.
So highly recommend and I'll definitely be rewatching it to try to wrap
(23:49):
my head around it. Can I share that you and I were slacking
each other during that session? I just pulled that back up to look
at it. Where I was like, you were like, yeah, this is hard. And
I'm like, yeah, if you're not hanging on, I'm really not hanging on.
That's okay. I'll ask. I have no shame. I'll ask a question.
And I asked the question and the response was like,
yeah, I'm gonna have to take that offline. Which I interpreted as being
(24:10):
like, that was the dumbest question. Anybody? No, he said it was a
really good question. He just needed time to, like, he wanted to think
about his answer. Yeah. I think that was kind of the,
oh, that's a really good question. You're adorable. Idiot. I'm not gonna,
yeah, I lost you clearly at slide two. Did you guys see those
recently sort of tangentially related, but did you guys see those recent,
(24:33):
I guess memes about like statistical significance and in some of the
really terrible analysis out there, they're sort of classifying like p equals
0.10 as like borderline semi significant or like extremely statistically
(24:53):
significant for like a low p value. Like some of these like really awful
descriptions and kind of just the arbitrary nature of that. Oh my God. P
equals 0.05 just in general made me laugh. Oh, that's funny. I feel
like I could... I'm not gonna name them, but there are people who've
already gotten triggered by that. Yeah. There are some very divergent schools
(25:17):
of thought as to how useful it is to go down the path
of parsing those versus not. Can of worms. Yeah. Deep breaths Tim. Save
us Michael. Alright. Let's move on to another category. One thing
most all analytics people are involved with is getting data out into the
(25:39):
hands of people who need to use it, whether it be marketers or
operators or things like that. So what did we see this year in reporting?
Like what were the best things we saw there? It's
not always the most glamorous part of the job, but it's necessary and
crucial. Ooh, Michael, I'll start because you're gonna be really excited
(26:00):
about this one. Okay. So if you all remember the great episode we
had with Cedric Chin and it was about XmR charts. Well,
I have seen them in the wild for clients
and they went over really, really well. Lucy, we already name dropped her
earlier, she is just a rock star. She did it for one of
our large clients that heavily relies on us for reporting. And it did
(26:22):
a great job of showing variation like we talked about and showing the
client that like, you're always gonna have variation. Your KPI will move
up and down, but like, let's put a little thought around when we
should really pay attention to it. So I know that they did a
lot of like education and change management with the client. They worked
really hard on the different visualizations to kind of show those bands
and how they were setting them and when their KPIs they cared about
(26:45):
actually needed to be paid attention to. So I think they said it
went really well with the client and it's been a great tool for
them this year. Awesome. And yes, I am a huge fan of XmR charts now. Thank you.
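For anyone hearing about XmR charts for the first time, here is a minimal sketch of how the individuals and moving range limits are typically computed. The KPI values are made up for illustration; only the structure and the standard 2.66 constant come from the usual XmR method.

```python
# Minimal XmR (individuals / moving range) chart limits for a weekly KPI.
# The data here is invented for illustration; 2.66 is the standard XmR constant.
import numpy as np

kpi = np.array([100, 102, 99, 101, 103, 98, 100, 125, 101, 99], dtype=float)

moving_range = np.abs(np.diff(kpi))     # |x_t - x_(t-1)|
mr_bar = moving_range.mean()            # average moving range
center = kpi.mean()

upper_limit = center + 2.66 * mr_bar    # natural process limits
lower_limit = center - 2.66 * mr_bar

# Points outside the limits are the "pay attention to this" signals;
# everything inside is routine variation.
signals = np.where((kpi > upper_limit) | (kpi < lower_limit))[0]
print(f"center={center:.1f}, limits=({lower_limit:.1f}, {upper_limit:.1f}), signal weeks={list(signals)}")
```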
(27:06):
I have another one that's a shout out to another former guest from this year. So there was an article by Eric Sandosham recently,
one of his weekly articles he's putting out and he did one called
The Joy of Business Reporting. So yeah, counteracting that narrative of
reporting is just something you have to slog through and get through.
It's a necessary evil, not the most exciting part of the job,
but actually he made a few points about how it's a great way
(27:29):
to gain deep subject matter expertise in how your business works and gives
you an opportunity to link up with like people across the company that
you otherwise wouldn't have a chance to talk to, build your network.
And I just thought it was a nice perspective to see,
what are actually the benefits that it'll bring to you as an analyst
(27:51):
to do some of this instead of just sort of seeing it as
okay, monthly reporting, again, I gotta go chase the stakeholders, the numbers
are late, that whole thing, like you can kind of look at it
in a different way and say, yeah, actually this is kind of a
good opportunity. Especially if you're relatively early in your career.
It's a great way to build some of those relationships and understanding.
(28:13):
So I felt that was a nice article.
I like that 'cause I mean, it's like the double edged sword.
You can go work with your business partner and have them just...
I was having coffee with somebody a little while back and she was
like, oh my God, like the head of marketing is just like,
where's my dashboard? Where's my dashboard? And she's like, no matter what
(28:34):
the context, whether it's a... She just perceives
that everything that comes out of analytics and data science winds up on
a dashboard. And this was driving this fairly senior
person who's managing the analytics and data science teams a little nuts.
Because she was having to sort of push back. So the flip
(28:55):
side is saying, oh, if the reporting exercise is an opportunity for me
to ask you questions about the business and what you really
need, then that is an opportunity being mindful of don't turn into an
order taker where they say, and I wanna look at this and I
wanna look at this and I wanna look at this and I need to slice
it this way and that way. Like that's the challenge. But I do
(29:18):
like that framing. I mean anything out of
Eric. I was gonna say, I don't know if we're gonna do a
category for best of matchmaking in 2024, but I will definitely take credit
for the love affair that is Eric and Tim.
But I also like, I found him first and now I'm like,
(29:40):
yeah. Obsessed. As obsessed as we all are. Interestingly enough in this
area for me this year, I think this is where I used AI
the most: in reporting and developing reports.
I feel like this... Really? Yeah, like lots of other places I use
AI but like in terms of like practical application to my work,
(30:04):
it had the most impact in terms of
a lot of times. I've been around a long time, so I've forgotten
a ton of stuff that I used to know really well.
And so going back into the weeds and like creating things with data,
I'm a little rusty sometimes. And so AI was amazing to be like,
okay, I know what I'm trying to do.
(30:24):
And it would just be like, oh, you just do this,
this, and this. And I'm like, oh, that just saved me like two
hours of Googling for an answer and I'm off to the races.
And it helped me do things that I'd never done before that actually
were really helpful because sometimes when you're trying to think through
a way to display data in a certain way, like
(30:44):
you're limited in different ways by the tools you've got and
what you're trying to show is not well supported by both the underlying
data as well as the platform you're on. And so
getting to a couple of really cool solutions and then
being able to show that to clients and then being like,
wow, how'd you do that? I be like,
(31:05):
that's why you pay me no big deal. Even though it's like
I was stayed awake half the night being like, I wonder if that's
possible. And then AI could be like, this is what I wanna do.
And the AI is like, oh, I think if you did this,
this and this and boom, you're often going. So that was like the
cool thing for me this year. It was sort of like leveraging AI
(31:26):
to compress some of the work and help me to remember stuff I
used to know how to do better. So using it in the context
of kind of the execution, the development of the report. Yeah. Building
stuff. Yeah. Interesting. I funnily enough had the same thing, but I actually
put it under analysis. So one example, I
(31:48):
had someone in my team the other day who I'd asked them to
run a query. It was a bunch of data sources I'm not super
familiar with. 'Cause also I haven't written a line of code in probably
two years and I was overseas so I didn't wanna like wake this
person up 'cause it was Australia time. And I basically was like,
I just wanna make sure that I'm like... It's the logic that I
(32:10):
intended when I asked him to pull this data. And so I was
like, Hey, ChatGPT, here's the query. Can you tell me what this query's
doing? Is this the calculation it's doing? And I was like,
I'd never thought about using it that way, but fuck it was a
beautiful thing. Like it basically came back and was like, this is how
it's calculating this field and this is what it's doing. And I was
like, great. I didn't have to bug one of my data scientists to
(32:33):
be like, am I definitely interpreting this correctly? Because it was really
complicated. And see, I would put that under analysis though Tim. Do you
think that's under reporting? No, I mean those are both aspects of kind
of supporting the development tasks. So to me that feels like something
(32:56):
that cuts across like legit cuts across both of them.
Yeah, I suppose 'cause you could do the same thing with your queries
for reporting purposes. Well, since we're talking about analysis,
what are some of the best things we saw in analysis this year?
(33:16):
I'll start with like, there's a whole movement and maybe this cuts across
reporting and analysis as well, that was like some of the worst thing
I saw, which is all of the hype around, you point AI at your
data, throw your data at AI and insights will emerge. Which I mean, being
the inbox manager for our inbound pitches, there are certainly plenty of
(33:39):
people who would love to fill your ears dear listener about how their
AI solution is gonna generate insights. So that was very, very off putting
'cause it feels so wrong. But the flip side, and I'll credit seeing Jim
Sterne at MeasureCamp Austin, seeing Jim again at Marketing Analytics Summit,
(34:01):
John Lovett at Adobe Summit. I saw him then again at MeasureCamp Chicago
and both of them really pushing for using generative AI as kind of an
ideation companion to think about hypotheses to like be
a smart companion to, it's kind of like
(34:23):
rubber ducking on like super, super steroids to say I can have an
exchange where I am forced to be in conversation and I am
encouraged, like using prompt engineering as a way to get to
ideas for analysis, to get to hypotheses, to think through, to kind of think
(34:44):
a little bit more broadly. That to me seems really, really useful.
And even with like John sort of building GPTs specifically around training
them how to be good ideation companions was pretty exciting. Well,
I think fundamentally, look, I was saying to someone the other day,
(35:06):
I feel like ChatGPT is changing my job and it's making me
better at my job in some places and worse at my job in
some places. In some cases, like yes, it is definitely the
ability to come up with a list of hypotheses or what often happens
with me is like, I will do all the thinking, get 90% of
the way there. It's the last 10% that I really struggle to do.
(35:29):
And so I've been leveraging that to do the last 10%
very, very effectively. But yeah, I don't know. I feel like I use
it across the full stack. So like companion,
bounce ideas off, ask stupid questions, that's probably my favorite one.
I'll be like, explain this thing to me. And previously like I probably
(35:50):
would've struggled to find a simple way to do that. And you keep
saying ChatGPT, is that your one go to or do you use... Yes. Well,
I use that because we have enterprise and so we can put
confidential information in there, so it makes a really big difference when
you can upload a data set and be like,
(36:10):
hey, the other one that I loved so much, we had this Slack
thread the other day and we were like doing an investigation into a
metric that had declined. And everyone's like, what's going on here?
And of course we have all the like best brains in the business
and you have all these senior data scientists like updating the channel
of what they've looked at because it's cross company, there's all these
senior leaders in it. And I caught up on the thread like, I
(36:35):
don't know, seven hours later and I was like,
what the fuck is going on here? Like, I can't tell what we've looked
into, what we haven't looked into, what's next, like, where are we at?
And I basically just copy pasted the entire chat
into ChatGPT and was like, give me a summarized version of what we've
looked at, what we've ruled out and what's still outstanding and who should
follow it up. And it was beautiful. I mean so succinct.
(37:00):
That was so good. And then I pasted it back in the channel
and was like, Hey guys, here's where we're at. And everyone was like,
again, Chef Kiss, as this way to just like keep everyone on track
almost without anyone having to digest all that information and do the summary.
But I think the point that I made that I
take for granted is like having an instance where you can put
(37:23):
confidential company information is a real game changer, right?
Absolutely. Yeah. Yeah. I'm just not gonna fucking talk anymore 'cause every
time I do it's like stone cold silence because I'm making really points
and I'm being super annoying. You are not. No, not at all.
Not at all. No, mine's just totally unrelated, so I couldn't think of
a segue, but totally agree with what you said. I'm dying to have
(37:48):
our own instance of ChatGPT, like my company still blocks access to all
Gen AI applications despite having a Gen AI team and COE in the
company. So that's where we're at. Do you know what my founder did
the other day? My founder the other day was like, I think everyone
just needs to like brainstorm this. And people were like, writing on post
(38:09):
its with pens. She's like, no, no, no. Someone grabs some like permanent
markers. Write in big letters. And 'cause we had like 30 people in
this brainstorm, everyone threw it up on a board and she just took
photos on her phone, uploaded it straight away and it summarized the brainstorm.
And I was like, boom. It's pretty nice. Yeah. Some of those like
virtual whiteboard tools, like the Murals and the Miros, have some of that
built in where you can, like... Like Canva? And like Canva. Didn't realize
built into where you can like. Like Canva? And like Canva. Didn't realize
that one. That's the one we use. Yeah. But you can, like the
summarize is super helpful, but there's other ones where you're asking it,
like you can draw a lasso around certain ones and say like, what's
the theme of these? Or like, you can have it do assistance. That's awesome.
(38:55):
For a couple things that are pretty manual, that definitely is a time saver.
Especially if you're in a facilitation mode or there for a day long
workshop with clients kind of thing. Moe you gave me some ideas of
how to use it without having to put like client data in there.
'Cause we obviously we can't do that for privacy stuff, so, but yeah,
Moe you have given me good use cases of how to use it
(39:15):
for the ideation and the creative side, but a little more applicable to
the day to day. 'Cause I'm like, I don't get to be that
creative in my job, so thank you. I'll keep throwing them your way 'cause I
just keep finding more and more ways and I honestly was like,
I think I've mentioned this example before, but like we had another like
metric deep dive where something had gone down and one of my stakeholders
(39:38):
had put like, what are the possible hypotheses for these in ChatGPT? And
what came back was very good. And the thing that I really liked
about it, it actually reminded me so much of analysis of competing hypotheses
where it kind of like came up with 10 hypotheses. It was really
structured. And then we actually... I was like, let's lean into this.
(39:58):
And we used it for our analysis and we're like, okay,
we've ruled this one out, this one we've ruled. Like we didn't take
the same process exactly as analysis of competing hypotheses, but it definitely
had that mentality of like, okay, here are all the causes.
Like let's disprove them. And the thing that I keep coming back to
is... But why did you not tell it to take an analysis of competing hypotheses, an
(40:19):
ACH approach, in your... Ooh, I don't... That is interesting and I
will try it. The only caveat is I don't know if the conclusions
it would draw about the different pieces of evidence would be sufficient.
Plus you would probably need so many varied pieces of evidence
(40:42):
in there. But I'm gonna try it. I'll come back to you.
Okay. Well, it kind of goes back to that would be a little
more like asking it to do some fact finding for you,
which then you have to check. Like we always say like check it, gut check
it. Which I think would be hard to your point, like asking it
to do that. And then you'd have to go through the exercise to
check it kind of, compared to what you're doing. You're doing more of
(41:02):
like the brainstorming, creative summarization get you moving in the right
direction stuff. Nice. Alright. Last category. How about we talk about
what we saw this year that we love that was in analytic strategy
or strategy related? I have to start out with something that Valerie actually
(41:24):
sent me at a very timely time, when I needed to read it.
'Cause you always have existential moments where you're like, what am I
doing? Should I be more hands on? Should I stay more of a generalist? You know,
we all have those. And then it's Julie, get back on the podcast.
We're trying to have a monthly call here.
(41:44):
Yeah. Exactly. But Valerie sent me a quote by Adam Grant,
and I feel like it perfectly summarizes what I hope
we see more of in 2025. Like, people embrace this. The quote is,
the hallmark of expertise is no longer how much you know,
it's how well you synthesize. Information scarcity rewarded knowledge acquisition;
(42:05):
information abundance requires pattern recognition. It's not enough to collect
facts. The future belongs to those who connect dots. And I was like,
oh my heart just like settled my self talk where I was kind
of spiraling. So I really loved that. So like an analytics translator.
(42:26):
Absolutely. Yeah. I mean I'm... I'm sorry. Like I wasn't already,
like that was like it, the switch was 80% triggered by this.
Anyway, yeah. You just... Really, you're triggered by this?
By Adam Grant. That's all you had to say. No, I mean it's stating that like
that is not new like that is not new. Like there is like kind of. That is...
Okay. I was gonna say that Tim, I was gonna say his perspective
(42:50):
is not new, but I think he sometimes has a way of packaging
an idea that makes people, I don't know if I wanna say believe
in it, but makes people wanna follow it forward like in a really
concise way. So like that's the value add. Okay. Anyway. I love that.
(43:10):
I'm glad it was helpful 'cause... I liked it. A lot of people
like I don't think that what you were grappling with Julie is uncommon
at all as people progress. They're trying to figure this stuff out and
with everything going on in our industry with what we're just talking about
with Generative AI, like there's all these things that people are trying
to say like, what is the shape of my career?
(43:33):
And so it really is helpful to get guidance or guide posts from
people that like that. But you know what's gonna happen is a bunch
of people are gonna say, sweet, so I can just take all this
stuff and throw it at Generative AI and tell it to connect the
dots for me. And I found the shortcut. I mean. Okay.
Well then you can swim on past him. So, Tim, none of us
(43:55):
here are gonna do that. We are all gonna come to you.
Okay. Yeah. So don't worry about it. But I think it's nice to
show that like also there's been a lot of talk in the,
not a lot of talk in the industry, but there's definitely been a
change, right? Of like, especially in being like a consultant,
being able to deliver a point solution for a very specific pain point
(44:18):
is valuable. But there has definitely been a bigger demand, from clients
of like, now that's like table stakes as I know Tim, you're gonna
say it should have been all along, but like there is a bigger
push of them wanting bigger pictures painted for them, more guidance,
help them connect the dots. Like they are now feeling the pain of
like, I have so much stuff going on, I don't know how to
(44:40):
make it work together. They still wanna grab onto, we'll switch the tools
to what I'm comfortable with, but then they do that and they're not
getting the outcome they were expecting or the benefit from swapping
the tool that they thought. And so like this quote to me was
also hopefully to Moe's point, like if more people are seeing this and
there's a critical mass of people understanding that, like it's making everything
(45:02):
you have work together and seeing the bigger picture. Like one,
the industry's feeling the pain, I think they're pushing towards wanting
more strategy. And two, I thought this did succinctly say like that I
could pass around to my colleagues at work and say like,
Hey, us thinking this way. Like yes, it's painful and it can be
hard sometimes, but like we're on a great learning curve and it is
like way more valuable than classically like the great work we've done.
(45:26):
Like yes, we're still gonna do that and need that expertise,
but like we do have to upskill in this area. And so I
think that's why it was so encouraging. The thing though that I would
add, which I was gonna say earlier is that, like Tim said,
I don't feel like that's new. I feel
that kind of like working to connect the dots is kind of a
core data scientist. Like that's what makes a good data practitioner.
(45:50):
And I feel like it's still a work in progress and what can
sometimes be a tension particularly is that like I feel the industry right
now, I don't know about anyone else, but I feel like our foot
is down and we are moving so super fast that it almost feels
like a bit of attention has been taken away from that core
skill. And we need to like refocus. I feel like I still work
(46:10):
on this with my team all the time. Like it's not...
That's a good point. Yeah. I don't feel that we're there.
I mean 'cause my concern is that this goes to,
say this goes to our business partners and now they've just been given
another cudgel to say, well this is great. You gave me what I asked for,
but I need you to really like connect the dots and I really
(46:31):
need you to do the pattern recognition. Or they throw it.
I mean I'm not opposed to the idea, but like the data practitioner
who is constantly coming up short from the expectations of their business
partners, like the critique. I mean it is a partnership,
(46:57):
to me there's more about communication and partnership and deeply empathizing
with and understanding what our business partners really need
is way more important than where analysts a lot of times
wanna scurry off and say, let me just keep digging into the data
and let me just keep finding something. So I think
(47:18):
the danger is how somebody passes it. Because if they read too much
into synthesizing the information, they're like, well, I just gotta get
more and more data and I've gotta synthesize it and then I'm gonna
come back with, look, I found this relationship between these two things
that are completely divorced from the actual business context. So
(47:40):
I'm not just trying to shit all over it, I promise.
No, no, but it's just interesting the way you're reading it is definitely
in a different light than I was. And I think some of it
is like the space I was in, I was primed to read it
and take it completely differently than you are taking it. So it's not
to say like, yours is wrong and I disagree. Like I agree with
what you say. It's just interesting. I didn't have that
(48:03):
point of view when I first read it. I guess.
I like it. Okay. Controversial question. How important is data strategy
right now? Like, I made that comment about things moving as fast.
Like I feel like it's moving as fast as possible.
I feel like there are a lot of documents sitting on shelves somewhere.
(48:26):
I don't know someone said something to me in 2024 and I had
this like. You mean this year? Yes. This year, but I am already
in 2025. My mind is in 2025 and has been for the last
three months. Okay. But someone said something to me which was,
(48:46):
oh well the data strategy should just connect up to the overall company
goals. And I was like, well fuck, that's obvious. I think that's harder
to do in reality sometimes. But I guess I'm just having this like existential
crisis of like what is the purpose and it like, and I'm thinking
when I say data strategy, they're like 12 to 15 page document
(49:10):
that probably does sit on a shelf. Like is it as useful as
it used to be with the pace that we're at? Like I don't
know. Like is it more about the behaviors and the ways of working
that are important or the concepts and less about the how?
At times like these Moe, I like to think about the Canadian band,
The Arrogant Worms and the song they sang called Star Trekkin' Across the
(49:34):
Universe, always going forward. 'Cause we cannot go in reverse.
And I think when we're going so fast,
in my opinion, that is when data strategy is actually even more crucial
because we're going so quickly and we have to respond so quickly.
(49:54):
We do have to have a good strategy for where we're trying to
go or else we'll get pulled 45 different directions and end up nowhere.
Counterpoint. Okay. To put me in Moe's camp. Yes. I think, Moe, if
you say data... and I struggle with, I mean, I've asked, what is
a data strategy? 'Cause I don't even know. And I've seen it defined different
(50:20):
ways. One of the more recent ones in discussion with a company,
the data strategy is like, what data are we gonna have?
How's it gonna be hooked together? What are we gonna gather?
What are we gonna collect? How are we gonna manage it?
And nine out of 10 of those come down to,
okay, here's our strategy. We've gotta spend the next 12 months getting
(50:43):
all the data hooked into these systems, and we just need the whole year to
kind of execute it. Which I think that's where they default to.
And they'll wave the flag of AI and all the data has to
be super clean. And so there is a tendency,
I don't know Moe, if I'm articulating the same thing you're saying,
there's a tendency to over index, towards collecting, getting all the stuff
(51:07):
and getting all the process in a good place with this
belief that you've gotta build this strong foundation and that means the
next year then we'll be off to the races and the Generative AI
will be so much more powerful, and that's why it winds up that way, as
opposed to being more nimble. What's interesting, Tim, is what you just
(51:27):
commented on is actually the execution on the strategy being too slow.
Not the strategy itself. I think both are true. Both are true. Yeah.
But I think that's the issue you have there is sort of like,
well how do we execute on the strategy effectively?
And it does your strategy. I think Moe to your point,
(51:48):
does have to fly back to what is the business trying to do.
One of the things we did this year with one of our clients
was... As you're going along helping companies do stuff, you get to an
inflection point and you start, you get a chance to actually say like,
Hey, does our business better serve by like altering our strategy?
(52:09):
And then what are the steps we can take now with what we've
got? And then how do we need to adjust out into the future
to build that strategy forward? And like that's work we are doing.
But I tend to take personally a much more iterative approach,
which is sort of like, okay, if this is where we see the
puck going, don't be like, I need 12 months to get everything ready
to be perfect. The boil-the-ocean work is never right.
(52:32):
It's always sort of like, here's where we're at and here's the five
steps we can take in the next 90 days that are gonna push
us a little closer. And I think... But sorry, but when you run a
strategy, you don't run a 90 day strategy, it's typically one to five
years minimum. I didn't say 90 day strategy. I said here's where we're
at today and to get to our strategy, here's what we're gonna do
now and then across the next so that we're not taking forever to
(52:56):
get there. 'Cause you've gotta be incremental about it or iterative.
I just don't think it works to try to do the big bang
every time. Sometimes you have to do it that way, but I don't
think it's effective every time. I don't think we do the big bang. I
think the problem is exactly what Tim said where we said we're gonna
spend the next 12 months and we're gonna get everything up to scratch
(53:18):
and everything's gonna be perfect. And then 12 months later, a bunch of
shit happens in the business. You've got a bunch of new tools. Sure. A
bunch of other shitty data and you're like, we're gonna spend the next
12 months making everything perfect because if it's perfect we can help
the company achieve their user and revenue goal. And you're like,
no one gives a shit. Yeah, but that's not strategy, that's execution.
I was gonna say I definitely, am saying strategy.
(53:39):
I just have to give you the context. I definitely was not meaning like data
collection strategy. I was definitely talking concepts in the other things
you mentioned Moe. Just to put that out there. I can't go on
the record people thinking I was talking about the collection side. 'Cause
I wasn't. I definitely wasn't talking about collection. But I think that's
where we landed. Oh, yeah. But I think I've learned this,
I mean I'm finally starting to understand that the generally accepted thing
(54:03):
is like data versus analytics. The data tends to be the collection,
the piping, the governance, the management. But at the same time,
I worked with somebody for years who, when he would do a
data strategy, he meant more Julie what you meant. So there is...
Yeah, it's exhausting. And this starts to feel brutal.
(54:23):
It sounds like a podcast episode. Alright. But I think this is...
Sorry just to round it out though, this is why I think it
comes back to: maybe the direction I need to go in is
less about the what are we trying to achieve and more about, like, what are
the behaviors and ways of working we want to use to get there
and to support the business. And I don't know if those are the
(54:45):
same things or different things, but that's kind of what's been rolling
around in my head is like what do we wanna stand for as
a team? And anyway, yeah we do need a whole episode.
Totally. Agreed. So I'm gonna take us on a little bit of a
left turn for my... Good. Observation for 2024, hopefully a little less
(55:05):
contentious positive. We shall see. At this point. Don't count on it.
That's right. I feel like the jumping on happens later and later in
the episode. So I am a little nervous. No,
I had the opportunity to attend... Oh, okay. Wow, Tim. Well, you were
(55:28):
there so you know that I was there,
and there were a lot of great presentations. I really was impressed overall,
but there was one in particular by Noam Lovinsky who was the CPO
of Grammarly and his presentation was a little provocative. It was, have
LLMs killed Grammarly? And so it was just super
(55:48):
interesting. And I have the recording and the slides that we'll definitely
link in the show notes if you're interested. But one thing that really
has stuck with me ever since he presented it, is thinking about the
question, has the problem that you're working on truly been solved for your
customers and your users, so the example that he gave to kind of like illustrate
this point was back in the day, I guess in the 70s and
(56:11):
80s, there were these things called Thomas Guides, which were essentially
like these paper like little manuals that had like maps and places you
could go so that if you were out of town, it was one
of the best ways to like navigate the city.
And Thomas Guides were kind of outrun by MapQuest in the 90s because
now you could just type in where you wanted to go and print
out your directions and that was considered the solve. But he thinks that
(56:35):
where we are today with Gen AI is the MapQuest stage because we
all know what comes after MapQuest, which is the smartphone and the constant
access to Google Maps. And that's interesting to think about Gen AI just
being at MapQuest. But the part that really sticks with me is he
was saying, actually, having Google Maps on your smartphone or in your
car, like always accessible still isn't the solve, that if you're really
(56:58):
obsessed and focused on if you're actually solving the problems that would
be self driving cars because that's what gets rid of the need for
navigation altogether. And so like that's actually where you get there.
So he was kind of giving some examples in like the product sense,
but thinking about how to be really deeply connected with the problems that
(57:19):
you're solving. I've just found lots of opportunities to share that story
and applications of the way of analyzing just really deeply understanding
the problem that your users and customers are facing. So that was definitely
a takeaway. So in a data or analytic strategy context, it's really getting
to that perfect dashboard, right? That's really where. The ones. There's
the takeaways. Awesome. Alright, well this has been
(57:45):
sort of the intent, but we did pretty good and it's also nice
to see that after 10 years we're 100%
on the same page and aligned on everything analytics related.
So, I guess we can't quit doing the podcast yet. We still got
work to do. So, 2025 is looking to be a good year I
(58:07):
think. Thank all of you, Moe and Julie and Josh and Valerie and
Tim, thank you for being on the podcast with me and doing this show together.
It's always not only enlightening but fun. And thank you too Michael. Thank
you too Michael. Oh, that's what I was looking for. Thank you. I'm gonna
(58:32):
leave that hook out there for a while to see if anybody was
gonna bite. Anybody? No. But I think as you're listening,
maybe you've got things you're thinking about in 2024 learnings that you
want to take into 2025 or things in Year 11 of the podcast.
(58:53):
You're like, you haven't talked about this enough. It sounds like we learned
today we're gonna talk about data strategy just a little more,
but there's probably lots of other things and we would like to hear from you.
To be fair, it's been on the list for a while.
It has. I have had that on the list for like three years.
Has it been noted? Yep, you have. Yep. Yep. But it's not important
Moe 'cause we're too fast paced so. And I maintain the list that
(59:15):
backs up your contention. So anyways, but we would love to hear from
you. There's a bunch of great ways to do that. The Measure Slack
Chat group is one of them. Obviously you can email us at contact@analyticshour.io
and so please feel free to reach out. We actually really enjoy hearing
from listeners and as we hear from you, we do incorporate
(59:38):
what we hear into our show topics and to our guests and things
like that. So, we do appreciate it. No show would be complete without
a huge thank you to Josh Crowhurst. I know Josh, you're here.
So. This is awkward. He's got his work cut out for him on
this one too. Josh is gonna say, you're welcome. You are welcome.
Don't mean to make it awkward, but we really do appreciate you.
(59:58):
Thank you very much. And I think, this has been... 2024 has been
a very interesting year. A lot of learning, a lot of growth,
a lot of change. And I think that's always the case in our
industry. And I think I speak for all of my co hosts when
I say no matter what 2025 brings in the next 10 years of
this podcast. Remember, keep analyzing. Thanks for listening. Let's keep
(01:00:25):
the conversation going with your comments, suggestions and questions on
Twitter at, @analyticshour, on the web, @analyticshour.io, our LinkedIn
group and the Measure Chat Slack group. Music for the podcast by Josh
Crowhurst. So smart guys wanted to fit in. So they made up a
term called analytics. Analytics don't work. Do the analytics. Say go for
(01:00:50):
it, no matter who's going for it. So if you and I were on the field, the
analytics say go for it. It's the stupidest, laziest, lamest thing I've
ever heard for reasoning in competition. You should try to troll them on
Blue Sky, Tim. Well, they're not on Blue Sky. Oh, I went to look at where
they were. What the fuck is Blue Sky? I'm not on... It's the new social
(01:01:11):
network for liberals who don't like Twitter anymore. I don't want another
network. I don't want any networks. I know. It is like Twitter was
in like 2010. It's delightful. I love it. Like we wanna kind of wing that
or like, do we want to just sort of like
(01:01:33):
pick what order we want to go in ahead of time?
Sometimes people... Michael, I know which way we're gonna go. We're gonna
wing it. That wasn't really a question. I feel like we've been doing this
for 10 years. One's always been winging it. One's always over planning.
It is being won already. We are wanging it. And that's Numberwang. Okay.
Suffice it to say it will be wang. It will be wang. Well, let's
(01:01:58):
put it this way, if we go in order and we don't get
to all of them, I might not have anything to contribute except for. Are
you serious? Wow. When other people say theirs. So. Well, I'm hoping Valerie,
you can get a few. Do you mean to tell me,
Tim, you spent this long and then. People were still confused. That's right.
(01:02:25):
Well. Fuck guys. I tried. Was it really bad? No. No.
It was fine. I thought... That was great. Tim was gonna jump in
so I was being polite and waiting. Do you want to go, Tim? No. Like literally
I will always die. Like I always have something to say,
but I can't be like, I'm not gonna do it this time.
(01:02:47):
All right, I'll do it. I'll do it. I'll do it.
I'm up. Rough start. Poor Josh. He's gonna be like, this is the
worst year edit of my life. That's fine.
Every show that I'm on is the worst show to edit. 'Cause I
have to listen to myself. That's an outtake right there. That should be
an outtake. If it's not in there. We'll know who. Yeah.
(01:03:11):
All right. Rock flag and it's the Data Strategy Power Hour rebrand. Woof.
Good one.