
October 15, 2024 • 87 mins

For the first time since they've been a party of five, all of the Analytics Power Hour co-hosts assembled in the same location. That location? The Windy City. The occasion? Chicago's first ever MeasureCamp! The crew was busy throughout the day inviting attendees to "hop on the mic" with them to answer various questions. We covered everything from favorite interview questions to tips and tricks, with some #hottake questions thrown in for fun. During the happy hour at the end of the day, we also recorded a brief live show, which highlighted some of the hosts' favorite moments from the day. Listen carefully and you'll catch an audio cameo from Tim's wife, Julie! And keep an eye on the MeasureCamp website to find the coolest way to spend a nerdy Saturday near you (Bratislava, Sydney, Dubai, Stockholm, Brussels, and Istanbul are all coming up before the end of the year!). For complete show notes, including links to items mentioned in this episode and a transcript of the show, visit the show page.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:05):
Welcome to the Analytics Power Hour. Analytics topics covered conversationally
and sometimes with explicit language. Hi Val. Hey Michael.
You know, this episode was something different and a little special.
For episode 256, we went to Chicago to participate in MeasureCamp. Val,

(00:26):
Chicago is your hometown, and you were very involved in the planning and
execution of the event. Besides great pizza, what were we all doing in
Chicago? I was so excited for all of us co hosts to come
together for the first time in my hometown where we did enjoy some
great Chicago pizza. And we were able to participate in the Day of

(00:50):
Chicago MeasureCamp, our inaugural Chicago MeasureCamp, and we were so excited.
We had over 200 people come together. It really felt like a homecoming/a
great meeting of a lot of analytics professionals in Chicago that I had
never gotten to meet before. But the Analytics Power Hour had a special
role in the day. So for each of the sessions at MeasureCamp and

(01:12):
on the cards on the board, we had a question that we posed to our
listeners, and we asked them to come to our recording room with us
to hop on the mic with The Analytics Power Hour to share their responses
to various questions about our industry and the way we work and think
about our role. And we got to chat a little bit, which was super

(01:33):
fun 'cause we don't always get a chance to chat with our listeners,
especially on the mic. Yeah. And some of the answers we got back
were so great. And then at the end of the day,
we got to do a quick get together with everyone to recap the
day. And so without further ado, here is Episode 256. All right. Thanks

(01:53):
for hopping on the mic. So what was one of your most embarrassing
hashtag analytics fails in your career, and what did you learn from it?
So maybe introduce yourself and then answer the question. Absolutely. Thanks
so much for having me. So just as an introduction, my name is
Pauline Gaynesbloom, I'm a Senior Manager of Analytics at Publicis Sapient,

(02:14):
and I would say my most embarrassing fail was very early on in
my career where I got pulled in to help with some BD work,
and everyone else was out on vacation, literally my manager,
all the senior analysts were out and they were like, who is left
in the office to help us with this data question, and what they
needed was a forecast. So I had to figure out how to forecast

(02:38):
something. I didn't know how, so I Googled it and
came up with what I could, and at the end of the
pitch or whatever it was, they kind of asked me to retrace my
steps and explain my process. Oh no. Which is where I found I
had hard coded some numbers, no idea why I hard coded them,

(02:58):
but they were supposed to be dynamic, and
I realized that the forecast was mainly junk, but they still used it
and I don't think they won the pitch, but it was a good
learning experience for me to be able to explain
my process... There you go. And to double check for those little things

(03:19):
like having a space in the wrong place or having hard coded numbers
where they're supposed to be dynamic. Well. Hopefully, that was a learning
for the sales team too, to make sure that there was someone available
to do that before going on a vacation. And not the junior analyst on the
team. Yeah. It does feel like there's a tendency for BD to sometimes
be like, oh, anything, sure, you can drop in to do anything.

(03:40):
Yeah, we just need someone that looks ready. Just to fill the gap, yeah.
I was gonna ask whether they won the pitch. I was gonna ask
if you remember who the pitch was to, 'cause I'm now cycling
through a few where I got dropped in way over my head thinking
yep, did not win. Did not win. Did not win that. Yeah, did not win that.
Wow. Okay, my name is Merritt Aho. I am a digital analyst at

(04:00):
Breeze Airways. And I have almost too many to name, but we'll start
with the one that happened at MeasureCamp. So this was the first MeasureCamp
in Austin, Texas. And shortly before it, I had started getting into simulation
work with data, which is, by the way, a really good way
for anyone to learn how test statistics work: by simulating actual

(04:22):
data. Anyways, I was doing some specific research on false positives and
I was uncovering some really interesting stuff to me, and so I decided
to do a MeasureCamp session, this is my first MeasureCamp too.
And I presented in front of a good audience and no one questioned
anything, Matt Gershoff was even in there. No one said anything. I'm sure

(04:44):
he was stewing, but long story short, it was wrong.
What I presented was, like, damn wrong. So you're here in
Chicago to basically, this is you making amends, this is your correction,
like your correction. Yeah, so I presented the wrong data, no one said,
Matt didn't even say it. Matt was probably being polite, he was probably

(05:04):
restraining himself, but I didn't realize it until I was going to publish
a blog post that was based on the stuff that I presented and
someone else corrected my work or pointed out a flaw, said that their
simulations were different, and I went back and checked the math and yep.
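For listeners who want to try the simulation approach Merritt recommends, here is a minimal sketch of our own (not his original code, and all parameters are illustrative): it shows how repeatedly "peeking" at an A/A test, where no true difference exists, inflates the false positive rate well above the nominal 5%.

```python
# A/A "peeking" simulation: no real difference exists between the two arms,
# yet checking the p-value after every batch of data inflates the false
# positive rate well beyond the nominal 5%.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def peeking_false_positive_rate(n_sims=1000, n_per_arm=1000, n_peeks=10, alpha=0.05):
    """Fraction of A/A simulations where at least one interim peek is 'significant'."""
    checkpoints = np.linspace(n_per_arm // n_peeks, n_per_arm, n_peeks, dtype=int)
    false_positives = 0
    for _ in range(n_sims):
        a = rng.normal(0, 1, n_per_arm)  # control arm
        b = rng.normal(0, 1, n_per_arm)  # "treatment" arm, identical distribution
        for n in checkpoints:
            _, p = stats.ttest_ind(a[:n], b[:n])
            if p < alpha:
                false_positives += 1
                break
    return false_positives / n_sims

print(f"Observed false positive rate with 10 peeks: {peeking_false_positive_rate():.2f}")
print("Compare with the nominal alpha of 0.05 when you only test once at the end.")
```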
But can I ask you about, if I recall 'cause

(05:25):
you're kind of the person who... Not just to me, but to many
people kind of... 'Cause you use simulations when you're trying to figure
out understanding sequential testing, 'cause that now I see it coming up
more and more often. Is that stuff still posted
in various places somewhere? Probably. Yeah. I would challenge, yeah. I
don't know where, that one might... Yeah. I don't know what...

(05:46):
Still somewhere, maybe. Okay. I can't remember exactly where you presented
that, what they've been on... But I corrected it before I went to publish.
Okay, that's good. And the lesson learned there is, have other people review
your work. Very useful, even though sometimes it's hard to expose yourself
like that, especially there are some very loud critics out there,

(06:06):
but it's pretty... That's a good one. That's interesting. We've had another
question that was a little bit of a... Another response that was kind
of a "get a second set of eyes." Get a second set of
eyes on it. That's good, I like it. Yeah. You're looking for the
blessing. Like, that's great, and more often than not they're gonna be like, well,

(06:27):
that might have a problem. Small or large. So. What is it?
Twyman's law, where if it's interesting, it's probably wrong?
I don't know. I mean, I felt that. Twyman, Twyman? Okay.
Awesome. All right. Well, hello. I'm Josh Silverbauer. I'm the head of analytics
at From the Future and just like musical analytics songwriter, dude, too.

(06:50):
Yeah, like I'm thinking about the long history of failure in my career,
and the one that just like I want to talk about today or
came in and popped into my head is I was working on this
very large, like one of the biggest museums, I'm not going to say
the name of the museum, but one of the biggest museums was like one
of my first clients in the United States. And I was in charge

(07:14):
of basically updating their universal analytics. And I was relatively new
to... Like I had learnt a lot about Tag Manager, but this was
really my first client as my own agency.
And I happenstanced into it through connections there. And I like built

(07:35):
a whole strategy, a whole plan, built everything myself in Tag Manager.
And then for two weeks, and I didn't realize this, I said everything
was good. It was good to go. I showed them
all the tags that I had built, etcetera. For about two weeks,
I had forgotten to publish in Tag Manager. Oh, no. And so basically,

(07:59):
I had... And I forget exactly how this occurred, but
they had data, and then all of a sudden, it was a new
Tag Manager that they put on there, right? Because I was like,
yeah, to make sure that there's consistency, we put the new Tag Manager
on there, and it'll have all the tags in there. Yeah, sure, Josh, we trust

(08:21):
you. All you have to do is... We trust you. Exactly,
they're like, oh, yeah yeah, of course. We don't know anything about analytics,
so just do whatever you tell us. And so I said, switch out
the Tag Managers, and everything will just start humming along, right?
And I just didn't publish the tags, right? And
so for two weeks, their entire data dropped off,
and they just had no data during that period of time.

(08:44):
And then, but the funny thing about it was,
I found this out. They didn't find this out. So...
That's a little troubling. That looks great, Josh. Yeah I know, right? It's
like, they're like, everything's great now, I guess.
But yeah, it's just like, for me, it was a lesson in

(09:05):
just QA-ing, because this is my brain in general, is like,
I'm so excited about the project, and I wanna do the project,
and everything about the project is fun and exciting up until QA,
and then I'm like, oh it's done, we're good, it's just that. All done. Yeah.
And now I am definitely the first thing that I

(09:27):
do at the end of a project, is just go into real time
and just make sure it's done. So when people are making fun of
real time, and so many people are like, real time, la, la, la, it's
like... As a QA. It's good for QA, as a QA,
it's very important. So that was definitely a humbling lesson in don't get

(09:47):
too cocky about the project, make sure that
you're really paying attention to the outcome and the end, but yeah.
That's a good one. That's a good one. I am gonna throw,
we got just a little bit left, so I would love for you
to do a quick little background and plug for Universal Sunset.
Yeah, okay, yeah, absolutely. Just for funsies. So

(10:10):
those of you who know me, you know that I
have created, probably the industry's only rock opera,
I would say. You're coming on the industry's only explicit analytics podcast
to talk about it. It's all about defining the size of the pond. Yeah. Johnny
Power Awards. Yeah. It's called User Journey Volume 1, and actually User

(10:31):
Journey Volume 2 is 90% done. And we have, basically,
just real quick while I have the time, we have
a bunch of different veterans in the analytics industry who join us on
this adventure about an alien named Cookie, who
has to go find a new universe after his universe was Sunset.

(10:52):
So the first album revolves around that. The second album revolves around
him exploring new universes. Amazing. Which would have been awesome if you
said that it was actually done, and then we actually found out in
real time that you'd actually forgotten to publish. Oh, yeah, I forgot to
press the... Yeah, there's like nobody's, I'm getting no feedback. It's
like, well, you listened to it then. I checked real time to know if it's
happening. Awesome. Well, thanks. Thanks for joining us, Josh. The twofer,

(11:14):
yeah. Yeah. Good one. So Val, what's one of your biggest analytics fails?
It's time to step away from the show for a quick word about
Piwik PRO. Tim, tell us about it. Well, Piwik PRO has really exploded
in popularity and keeps adding new functionality. They sure have. They've
got an easy to use interface, a full set of features with capabilities

(11:38):
like custom reports, enhanced e-commerce tracking, and a customer data platform.
We love running Piwik PRO's free plan on the podcast website,
but they also have a paid plan that adds scale and some additional
features. Yeah, head over to piwik.pro and check them out for yourself.
You can get started with their free plan. That's piwik.pro. And now let's

(12:00):
get back to the show. Alright. Well, this pain is still real, even
though it happened like 12 years ago. But when I was first getting
into digital analytics, I wasn't as business focused or business centric
with what I was doing because I was so wrapped up in,
like, are the tags perfect? Are we collecting all the data?
Like, is the e-commerce store collecting all the events? Did we push the

(12:22):
tags to production? Did we push the tags to production? Exactly.
So I didn't have the appropriate focus on what mattered most,
but I would like go into my little cave and produce analyses of
things that I found to be interesting. And there was one occasion where
there was a lot of competition over who was going to get the
homepage hero real estate or the first position for the menu on our corporate

(12:45):
website. And one of the things that I noticed is the homepage wasn't
even in the top 10 pages for entries to the website.
And so I was like, you know what I'll do?
I'll call it a front door analysis. And these are the top 10
pages and here's what people do when they arrive. And I thought this
was like the hottest shit to hit, I don't even know,
the street. And so I thought I was going to like produce this

(13:05):
and like shop it around to each of the content owners so that
they could think about like, oh, imagine your experience is the entry point.
What would you do differently about the way that this is constructed or
thinking about the next experience? This is wild. This sounds so good so
far so... I'm waiting for the turn. So I poured an exorbitant amount
of time into this analysis and my team was really excited about it

(13:26):
too. And so we go into these meetings thinking that people are going to
be like high fiving us, doing like cartwheels down the hallway,
like, oh, thank you so much. What did we do before you shared
this analysis? No one cared. No one cared.
There was one meeting where I knew my stakeholder really liked printed materials.
This was an association and it was about 12 years ago.

(13:48):
So I printed the deck for everyone in the room and there was
someone who actually said, when this meeting is over, can we just destroy
all these and pretend we didn't have this conversation 'cause I don't plan
on updating this content for the next five years. And I was like
crushed, crushed, absolutely crushed. I was like, why isn't everyone just
like banging down our door for like, 'cause again, the beginning stages

(14:08):
of our program, we were still trying to ramp up interest in how
we use data to make better decisions. But I wasn't being
a really good partner in that scenario. And so
thankfully I started going to some conferences and realized, oh, maybe if
I recenter my priorities around what these people care about versus what
I find interesting in my little cave when I come into work on Saturday
morning when there's no air conditioning, maybe then people will care.

(14:31):
And so I started to make that pivot and just like never looked
back because making the priorities of the business, my data work is what
really opened the door to really enjoying my role in analytics. It does
seem like, 'cause the idea doesn't seem bad. So it seems like there's
a... Would have been, if you'd actually asked to gauge interest

(14:51):
to see if somebody perked up and they might have if they weren't
already looking at the results so... The thing is that only one person
really cared about the competition. Like everyone just felt like, oh, mine's
the most important. So like, I don't really care anyways. Like I don't...
It's no different to me if someone like thinks their content is more
important because of course it's mine. And so no one, I was trying

(15:14):
to like use it as a way to like diffuse the number of
meetings I was dragged into about who was going to get the homepage hero
from Tuesday to Thursday 'cause we had to like rotate them. This is
ridiculous. But yeah, so everyone just felt like, oh, I'll just make sure
that my boss has the squeakiest wheel and then I don't have to
worry about it. That's awesome. But anyways, that was a big pivotal moment
for me and quite a huge fail. But it's a learning you've taken

(15:36):
with you ever since. I sure have. Alright. Good one. Well,
a lot of those are some good stories and reminders to make sure
you are QA-ing your work and getting a second set of eyes,
whether it's tagging or even a model that you're building.
Here's our next question. What's a commonly held belief within analytics
that you passionately disagree with? My name is Jessie and I work for

(15:59):
Wilson, as their analytics manager on the e-com team. So I would have
to say it's persona development based on demographics. Oh, I love it.
Okay. Yes. Yes. Yeah. I feel like now, like whether you're doing analytics
or you're a marketer, like it's like, okay, we need to develop this
persona. Who are they? Female, male, age group we've got to know them

(16:24):
so we know how to communicate with them. I feel that's what everybody's
trying to do. But for me, I feel like it's good to know
the demographic you can refine your message, whatever. But then sometimes
just based on my experience, there's not a lot of like actionable things
you can do. So for me, I think in addition to

(16:44):
demographic for persona, well, "development", it's more important to look
at kind of their purchasing behavior. So for, let's say like, so my
company, we're trying to figure out who are the people buying pickleball,
right? Because everybody's playing pickleball right now. And then, so we're
trying to see like, okay, do they purchase like tennis before and then buy

(17:06):
pickleball or, 'cause we're trying to figure out if there's any opportunity
to cross sell them. 'Cause in my company,
we have like different line, like sportswear, tennis, and then we often
see like crossover between those two, but not so much like other
sports. Like for some people who buy baseball glove, just buy baseball glove.
There's not like really like transcending the sportswear. But then what

(17:28):
we do from analyzing those like purchasing behavior, people who bought pickleball,
we see that it really transcends like different sports, like people buying
basketball actually also buy pickleball and stuff. So that's kind of like,
tell us like, if we want to talk to them, maybe,
we can like cross promote like sportswear and then kind of see like,
okay, do they buy sportswear? What's the next thing they bought? Or like

(17:52):
within pickleball, like category, what's the first thing that drives like
new purchases? Is it like paddle or footwear? It's actually footwear. Like
people buy footwear first. So I think it's good to know the demographic.
Like, of course, it's geared toward younger and also like older,
and then like a little bit more like female then male.

(18:13):
But then in addition to that, you also have to layer in a
lot of like the purchasing behavior, especially like what they purchased
together with like, what's the next best thing to
talk to them and then we can kind of like orchestrate a journey
and so on and so forth. So I would have to say like, the
thing that I disagree the most is to develop persona solely based on

(18:35):
demographic. I don't know, it's a really fancy way people want to do
personas, but you know, there's so many things to like consider.
Yeah. Absolutely. That's a great one. I love that. Really? Thank you. Oh
my god. I'm like so nervous 'cause I was like I love the
podcast. I've never been in a podcast before. No, for years,
Especially for just that example you provided. I was like, behavioral personas

(18:57):
are so much more powerful than demographic personas. And so it's sort of
like, oh, don't go looking at zip codes. Go look at other behaviors
that are like this one. Like, so for example, like people who bought
pickleball and basketball. Well. Who are these people? Right. You wouldn't
pick that up from whether they are soccer moms or drove this minivan
or whatever the demographic information you have. Exactly. Yeah. No, for

(19:20):
sure. Yeah. So it's like interesting to me. Yeah. Especially looking at
those like purchasing behavior. That's awesome. Any thoughts on that, Jessie?
Have you gotten a lot of pushback on that opinion at work?
Yeah, for sure. Especially like, talking to the, like our partner,
like different business unit, 'cause they really love demographics data.

(19:42):
So I think it's good to kind of like show them the data
and you know, kind of, okay, this is what we see from the
data, rather than us just saying it just because we think so.
And so I think definitely walking them through the data, helping them understand,
it's definitely helped to have a smoother conversation and stuff. And we're

(20:04):
just, trying to evangelize more, like using the data with different
BUs, for example, this is pickleball and then we also do some analysis
on for some baseball gloves, right? Like what is the person's behavior?
Do they buy the same position of baseball glove? And then do they
like buy more custom glove later. Like just trying to understand kind of
the first, second, third purchase. And then we can like, again,

(20:27):
like orchestrate the journey. But yeah, it's never easy. Like, 'cause you
know, like we talk about data, but then
people like to believe what they think. So. Yeah. They have those assumptions
and they expect people to behave a certain way or fit in a
certain demographic and then you're kind of like, let's just see what they
do. Which is awesome. That is awesome. Well, Jessie, thank you so much.

(20:49):
Yeah, thank you. For being on the show. It was so great to talk with you.
It was awesome to have you. Thank you so much. Alright.
I am Jenn Kunz. I am a principal analytics architect at 33 Sticks. And
I had written down a few because there are quite a few.
Let's get into it. So I am presenting later today partially because I
just have some passionate feelings about the way the industry's talking

(21:12):
about cookies and consent and server side tag management. The language that
we're using and the expectations that we are setting are off.
So I'll save that for my presentation later. Tim reminded me that for
a while I was the... It doesn't matter which tool set you have,
that if you have the right people and the right processes,
you're going to get value out of it. I'll admit, I've walked back

(21:33):
a little bit on that over the years as GA4 has gotten bigger
and bigger that maybe the tool does matter. Shots fired. The
tool definitely contributes on some level. Not as much as most tool vendors
would necessarily want you to believe it does. Yeah. Yeah. But I think
these days, one of my biggest things is just we need to stop

(21:54):
buying so many tools and focus on the ones that we already have
and getting them working properly and getting value out of them,
'cause if you're three analysts who are stretched between five tools that
aren't getting enough value, buying the sixth tool that you think will be
valuable is not going to make the difference. It's just gonna stretch them
thinner. So unless you have the resources for it, I think most people

(22:18):
still need to work on building their strong foundation before they go onto
the bigger fancier stuff. Yeah. It always shocks me how
a lot of businesses don't put the effort into mastering the tool sets
they've actually invested in. And that serves both of us as consultants
pretty well, in fact. Sure. Oh yeah. But realistically, when we walk into

(22:39):
those organizations, what we want for them is the success of using this
data effectively. And you just get presented with these challenges all the
time where it's like, well, yeah, I can show you how to do
this, this, and this, but you've gotta wanna
do this as well. Yeah. Yeah. Absolutely. And I think that
it's easier to throw money at a new tool than it is to

(23:02):
solve the processes and problems and technical debts and all of the things
with your existing tools. It's easier to start with a blank slate when
you have the naive vision of what you're going to achieve with the
tool before reality sets in. But yeah, I think we all need
to be much more honest with ourselves. Yeah. It's always more fun to
be part of a rollout than a maintenance. Absolutely. Yeah.

(23:25):
Yeah. Things haven't been bogged down and ruined yet, so. Well, that's a
good one. I like that. Yeah, that's a really good one.
Vendors love me. Yeah, that's right. Yeah. Hey, no vendors allowed, we talk
about the analytics, not the tools. That's right. That's right. That's right.
Oh, that's awesome. Hi, I am Matt Policastro, a colleague to several people

(23:45):
who are on the podcast, previous guest to the podcast as well,
but, just general analytics and data science person doing experimentation.
Welcome back. Oh, I'm so excited. I heard you guys were asking this. Data
does not speak for itself. So good. Yeah. Our job as data practitioners,
analysts, data scientists, whatever, is that we are fundamentally storytellers.
We have to be able to communicate what things actually mean to people

(24:07):
and build narratives around those to get buy in. You cannot just show
someone the dashboard and expect them to be able to grok whatever you
think that they should be getting from this. It is not self apparent.
You need to sell your work. Wow. And you are a data scientist,
correct? Yeah. A little, I have worn many hats, but yeah, I've been in
a data science role and kind of. So

(24:28):
that's a challenging... Is it... Well, do you think that's a challenging view
for someone that's coming from that background? I think so. I mean,
it pains me 'cause it feels like I have this conversation every time
I go to a conference and talk with colleagues. But yeah,
it's just, it seems like over and over and over again,
you run into folks who are like, I just don't get it.
Like we shared the Excel file with them, like, it should be clear what's

(24:50):
happening here. And it's like, well, it is unfortunately we do have to
be... We have to, bring ourselves down to the level of rhetoric.
And actually how to win arguments 'cause, like we live in...
We're political animals. We live in organizations. We have to get... We
have to work towards the things that we want. Yeah. And I think
too, like what you said was you have to bring the story to

(25:11):
the number. So just because you present a number, they may not understand
what is the consequence of that number. And so I think it takes
the practitioners with the data they have to step in and say,
what does this mean for your problem? It takes that extra level of
understanding. And if you're really used to probably being really technical.
It's hard to probably step over that line and play the full game.

(25:33):
Yeah. Totally. Totally. I think about it like the blood-brain barrier,
it's like it doesn't go through that. So you have to like find
a way to jump across. Yeah. I don't know if that's a good
analogy or not. What an analogy. I don't know. Well, I mean it's like, it's
also fascinating 'cause it's like, I mean, for many of us,
we're in the quantitative space. And we work extensively with quantitative
data, but like, there are also like, there are many types of information

(25:56):
that we can get access to. There's qualitative data, there's user feedback,
there's voice of customer stuff. And you can see situations where, say, a rogue
UX architect gets one piece of feedback somewhere. That's right. And then
three months later it's like the CEO is like parroting this back and it's
like, what? Hold on. How did this get here? This is affecting like maybe
three, four people like a quarter. And yet this is now something where

(26:19):
it's like, okay, well now we have a scope of work to actually
go focus on this. And it's like, how did we get here?
And that's... Again, it's like that questions of scope and questions of
severity, like that stuff doesn't necessarily make itself apparent or
self-evident. Yeah. Well, and qualitative data can be so powerful because it
is a story in and of itself. If you watch a user struggle
with something in a user study or something, it's very powerful.

(26:40):
You're like, whoa, what is that? Like, Well, we should look into that.
But that's where I think as quantitative analysts, we can come back and
say, okay, now let's go look at our quantitative data set to see
what is the size of that, versus sort of like, well,
one person said it and we don't need to change the entire website
'cause it's literally like three people a quarter or whatever. So yeah.

(27:01):
I mean, again, it can be a virtuous cycle, which I feel like I've said, I
may have even said it when I've talked to y'all before, but like, it
can be a really beautiful feedback loop of, you can get those stories
and then go and validate that with data, and then what does that
uncover or make clear? And then how do you bring that back to
the customer and bring that back to other people to get that additional
information. So yeah, I can throw out more non sequiturs: consensus reality

(27:24):
is broken, we have to adapt to the times. But, yeah. Outstanding.
So good. Wow. Those responses couldn't have been more different from each
other and they gave me a lot to think about, especially Jessie's persona,
hot take. I mean, it's so obvious now that she said it,
but it's definitely one that I think the whole industry hasn't come around

(27:44):
to yet. Absolutely. Awesome. All right, and here's our next question.
Can you please tell us about a book or a podcast that has
nothing to do with data or analytics that's had a profound impact on
your career or the way you think about a problem?
Sure. So my name's Heather Gassman. Interestingly, as a little tidbit,

(28:05):
I am in a book club that reads like four books a month.
Whoa. With Wendy Greco, Adam's wife. But one of the books we read
was called The Measure, which actually sounds like it might be a little
analytical, it's not at all. But it was really interesting in how I
thought about my life because in this book, everyone starts receiving a

(28:28):
piece of paper and it's essentially a certain length.
When you're 21, you get this piece of paper and people are trying
to figure out what is this strange delivery of piece of paper and
what does it mean, and what they come to learn, and I'm giving a
little bit away, but is that it's the length of your life.
Ooh. So some people decide that they don't wanna open the box because

(28:54):
they turn 21 after this has been revealed as to what it is.
Some people definitely wanna open the box because then they're gonna decide
well, how do I spend my life if I only have this much
time, I need to live it to the fullest and others... So, you know
what other people's piece of paper sizes are as like a comparison.
And they actually did all these scientific treatments on the thing to figure

(29:18):
it out. And it really does come down to like, this is how
much time you have. So I just think it's a really,
really interesting concept and a really great book to talk to others about
in how would you choose, what would you do? Would you look in
your box and what would you have done differently if you knew? Do

(29:39):
you think like, I mean it's, we only have so much time in
this world. But it really helps you sort of like... Do you really
wanna spend your time optimizing those paid search keywords? Absolutely.
Absolutely. So I just think that's a, it's a good one to put
on your to read list, especially if you're in a book club.
I am in a book club and now I'm gonna have to look
it up and make. Does your book club do four books a month? No, we

(30:02):
do one a month. We're normal. Not everybody does the four,
but it's just the really nerdy ones, So yeah. And then what's your
second book? The second book is, there was just a chapter related,
and it was, I think I misread the topic when I came up
with these books, but there's a... The book called Wellness that's actually

(30:22):
set in Chicagoland. So that was kind of fun for me,
to see the locations of Chicagoland and maybe would be fun for your
fellow MeasureCamp attendees. That's like, what's the White, what's the
one about the Chicago fair and the serial killer? Oh, The Devil in the White
City. Devil in the White City. Yeah. That's another good one.
Erik Larson. Can't apply that to analytics, I don't think, but.

(30:43):
No, Probably not. But yeah, he does have a new one out.
But anyways, the Wellness book is a really long one, so not for
the faint of heart, but there's a whole chapter that I thought was
super interesting on the Facebook algorithm. So in the book, the character
has a really terrible relationship with his dad

(31:03):
and his dad starts getting all this Facebook stuff and essentially going
down a rabbit hole of believing all sorts of things are gonna happen
that really aren't true but... So it's total fiction, it's not at all
based on reality. Yeah, this is a fictional book, but actually I think
the author did quite a lot of research and it was interesting to
me. I mean, as all of us use Facebook and I personally have

(31:25):
spent a lot of time in my measurement career in the world of
search engines, and recommendations engines. So I thought that was a really
fun way to kind of build it into a story that helps the
common man, if you will, understand how Facebook decides to show one person
one thing and how you can get like, really caught up in fake news if you

(31:49):
will, if you're... 'Cause it keeps kind of feeding you more and more
of what you already looked at and already consumed. But it's kind of
interesting. Ah, Thanks so much for stopping by.
Hello, my name is Prolet Miteva and I
am currently what I call a world explorer,
as I am taking an extra-long sabbatical from working in analytics,

(32:13):
which I was in for over a decade. So I have a book
suggestion that had a very, very big impact on me personally.
And I will tell you also how it's relevant to analytics in kind
of like adjacent manner. So the book is, Die with Zero, and I
should have been more prepared. I did not get who the author is,

(32:33):
so just Google it, it's like, it's there.
It's available as an audio book if you're too lazy to actually read
it. And it made a very, very big impact on me personally more
from the perspective of kind of evaluating your life and your life choices
and how that should be affecting what you do right now in your
life. From the perspective of the high level, hey, like really what you

(32:58):
can do in your 20s, you probably cannot do in your 70s.
So if you go and decide to jump off a parachute,
you probably wanna do that in your 20s, 30s or early 40s and
not when you're in your 60s, 70s or 80s. So the book does
a very good job in really helping you evaluate what you're doing with
your life now and what you should do now versus later.

(33:24):
And how it made an impact on my analytics... Is the Die with Zero
like, no regrets, or no, you can't take... Zero things left on your list, maybe?
No, it's zero money. Zero, okay. Because it actually... You can't take it
with you. Okay. Yes, it's also very connected to the whole idea of
financial independence and where you should be, but also not that you should

(33:46):
just be saving, saving, saving forever. You should live now, you should
live for the moment, and you should be spending at different decades or
kind of like in a bundle of five to 10 years together.
How it's relevant to analytics, I would say, is, it also gave me
a very different perspective of work and working with people within the

(34:08):
company. So I was in the corporate world for
the longest time, basically pretty much my whole life,
and just reading the book, getting to the point where I was also
closer to my own financial independence gave me the perspective of just
like really not caring as much and not being as committed and kind

(34:30):
of giving a crap about what other people are thinking. So
actually in my last job in my analytics work,
a lot of what my team heard from me often was
kind of what are they gonna do, fire me?
And just from that perspective and being able to kind of push in

(34:52):
different directions, push for what I wanted, push for what I thought was
right for my team was very, very valuable. Basically becoming braver and
becoming more focused on things that I really liked, the things that I
knew that my team liked, and pushing in that direction and not just

(35:13):
blindly following. It's like, oh, I'm told to do this, so I should,
and I should execute exactly how I'm being told.
And pushing that boundary was really, really helpful because in a way it
opened up my career, my opportunities to actually speak up, and my opportunities

(35:33):
to teach my teams to speak up. My girlfriend and I have post-its on
our computers that say chutzpah, and chutzpah is about that, right?
It's about doing the brave thing that feels intuitive to you,
but sometimes the consequences might have held you back, or trying to think
about what those consequences are versus actually being brave to do the

(35:54):
right thing. I'm wondering when you read the book, 'cause I feel like
as long as I've known you, I can't imagine anyone actually telling you
what to do, or at least not doing it twice but.
Believe it or not, it got even worse after. Awesome. I like that. That
was a great, great answer. Moe, I'm gonna take this opportunity.

(36:14):
You've brought it up many times, but I'm gonna challenge you to actually
equate it to why it's useful for analytics. The Acquired podcast,
which is long form and talking about businesses, and you're sucked into
it, and you're constantly referencing it, can you
tie it to how it's... Data and analytics? Yeah. Which I think could
be the management or the management of teams and/or. Okay. So

(36:38):
the reason I like it is because it's almost like a book.
Because the podcast episode's like three to four hours. You are really delving
deep. I think what I get the most value out of it is seeing
about company cultures, which is something that's really important to me.
Like how do you drive high performance? How do you move fast and
not like burn everyone out? Like how do you... I guess what makes

(37:03):
great companies great, but then at the end there is also...
There's like obviously the stories of when companies IPO and I find that
really interesting, like the whole how they got investment when they go
to IPO, what was like... They know the stat about everything,
about how many shops they had, what their CapEx was like.

(37:23):
Like they just know every single detail. And so they go through that
and then at the end they analyze the company and they analyze,
like they have a whole framework for it, which someone else will know that
off the top of their head that I don't.
But what sets that company apart, like in terms of either their financial
performance or the factors that differentiate their brand from their competitors

(37:46):
and that sort of stuff. So I think the thing is,
as I'm growing in my career, the reality is, I definitely do less
data work, but you're actually having strategic discussions a lot more.
And so it's more about, what are ways of thinking that can help
me with those strategic discussions? I obviously still want to use data

(38:07):
to inform those decisions, but there are often
things we can adopt, right? Well, it seems like also, 'cause you're at
a... How much has Canva grown since you've been there? That's putting you
on the spot, but... 10x. 10x. Yeah. Exactly, 10x. It
was 500 people when I started. I haven't listened to as many episodes
of it, but they kind of go through the, I mean like the

(38:28):
Nike one, the NVIDIA one I listened to, where it's like,
they go through sort of the growth and evolution that like, sometimes I
feel like we treat, we think like right now, our company is gonna
be very similar tomorrow as they were yesterday as they are today.
But recognizing that there's an evolution. There's not one magical path.
There are decisions that don't work out. It seems like, I mean, I'm not...

(38:53):
I wasn't seeding that. As you were giving that answer, I was like,
oh yeah, but she's like living through one of the stories that very
much could be an acquired episode at some point down the road.
But where you kind of ended with the... What they do is
they talk through and think about the company and the operating environment

(39:15):
and the competitive environment and what they did and what worked and what
didn't. And I'm like that for analytics, like that's the way we should
be somewhat taking a longer view. Not 'cause if somebody needs their weekly
report, it's not saying, no, let's talk about the overall evolution.
But I did not know what your answer would be to that.

(39:38):
And I wasn't sure what a good one would be, but as you were talking,
I was like, that works as a hook. The great thing about a
MeasureCamp and all coming together, especially the Chicago MeasureCamp,
is that everyone seems to have brought their spouses.
So I got to have a good old chat with Julie earlier,
which was delightful. And I think when I described this about you,

(40:00):
she would probably agree in terms of being incredibly well read across what's
happening in the industry. Probably so much so that you don't have enough
time for all of the other stuff. In addition to all of the
crazy things you do, volunteering and running things and all of that stuff.
Parenting, spousaling. Parenting. Yeah. Okay. Yeah. All the things.

(40:20):
So you're a man that probably has strong views on
non data and analytics books and podcasts that have changed your career.
Yeah. This is one. I mean, I think I might have contributed to
this question. I had in a very short period of time read Stumbling
on Happiness. Sorry, Stumbling on Happiness. Stumbling on Happiness, which

(40:42):
I think is Dan Gilbert. And I think it's actually in like the
self help section. Of course it is. I read Blink by Malcolm Gladwell. And
I read Brain Rules by John Medina, who's like a neuroscientist. And for
the life of me, I think many of them said some of the
same things. They were all basically about the brain and how we don't...

(41:04):
Stumbling on Happiness, that we think something's going to make us happy
and then we get there and it's not. And Blink, our intuition
can be very trusted and amazing. We don't fully understand why,
but it also can not be. So like those
three, and I just read them at a time when I was starting
to dig into data visualization and becoming more aware of the need for

(41:26):
communication. And somehow I just read all of those and I was like,
oh, there's a whole aspect of the way that we bring things in. We're
trying to analyze how customers are gonna behave, but customers are human
beings and they're messy. There's not a magic formula, probably more so
on the when communicating effectively to stakeholders, trusting people's

(41:50):
intuition. The kicker is that I will remember anecdotes like one of them,
I think it's maybe Blink, where somebody sees somebody across the street
and their memory is super clear. They see, I don't know, Queen Elizabeth
or something. Clearly my brain doesn't remember enough of it and just swears
that she was wearing something. And there's like photo evidence that they
weren't. So the same sort of thing that goes on with unreliable eyewitnesses

(42:14):
in crime, but like the various anecdotes from those like crop up all
the time when it comes to, okay, you're trying to communicate this.
Don't just think that because here's this messy heat map and you figured
out what it means that you can just flash it up.
And it will work. So, and they're bigger. Like, I'm not comfortable,

(42:36):
like, buying a book in a self help section just 'cause I've got
whatever various forms of... So did you just buy it online then and
send it straight to Gladwell? I don't know. I'm not sure where I
got it. I had a physical. I mean, I think I actually did
buy it, but I couldn't find it 'cause somebody had recommended it.
And I was like, why isn't it in a section that I would
go to? And you're like the section that you probably should go to.
Probably the section that I should probably constantly spend a lot of time

(42:57):
in. But so, yeah, so it was kind of like a three,
'cause I just can't keep straight which ones were which. And it's interesting.
Gladwell, people either like him or don't like him. And I was
a big fan for years. But I'm starting to find him a little sycophantic on
some of his podcast stuff. So, but it's always a great...

(43:17):
I feel like a couple of those books I
have in Audible ready to go and haven't started them for various reasons.
Like, I do feel like you need to kind of be in the
right headspace too. Yeah. To your point about, in your answer on the
Acquired podcast, thinking of that as a book, that it goes for,
that's a good one, like that can be the intimidating thing, like, I don't wanna
start it 'cause it's like three or four hours. Like, yeah,

(43:39):
you can stop it and you can pick it up later.
Just like a book. So cool. So Michael, after listening to those responses,
do you feel like you understand Tim a little better?
Absolutely not. No. No, those are definitely some good, interesting read
ideas I'll have to add to my either listen or read list.

(44:00):
I especially wanna check out that recommendation from Prolet, Die with
Zero. That's a really novel concept. I'll have to look into that one
for sure. Yeah, for sure. So, thank you for joining us.
We would love to hear your thoughts on what function or feature you
get most excited about when you get to use it in things like
Google Sheets, Excel, Power BI, or anything you use in your day to

(44:21):
day work, and why. My name is Ken Williams. I'm the founder of
Dive Team. And my answer is hopefully one you haven't heard,
which is that I really get excited about
SQL orchestration tools. The biggest ones are DBT and Dataform. Yep. I've
heard, Moe, you talk about DBT before. We're like DBT's biggest user,

(44:45):
I think, at Canva. Yeah, I started in DBT. And then when Google
Cloud acquired Dataform, started using Dataform. And now I'm in it very
deep. And for me, the reason that I find it so exciting is
because there are problems that I find difficult to solve

(45:06):
with raw data because I spend so much time in my role working
across a lot of clients. I spend so much time doing the same
thing over and over again, writing queries to get data in a certain
format so that I can use it. And with Dataform, I just,
I do it once, I schedule it, I copy and paste it across
clients. It makes just getting data organized and set up in a templated

(45:30):
way that I can use it so easy. So
I have been very deep into using it with Google Analytics like a
lot of people. But what we're doing with the group that I work
with in my daily life is expanding it across lots of data sources.
So we've got this library we're building of off the shelf models that

(45:51):
we can just plug in. Oh, that's cool. Yeah. So if somebody's like,
I use meta ads and Google Ads and Google Search Console and Google
Analytics, it's like, well, obviously everybody does. So I've got all those
things, plug them in, five hours later, I've got like a really robust
data warehouse. And it's all possible. Damn. That's fucking genius. I told
you. This is the most interesting question of the day. Okay.

(46:12):
So what is the primary difference, do you find, between that and
DBT? Are there differences you've noticed or things that have made you prefer
it? Well, the day-to-day use of Dataform and DBT is super
similar. DBT has a few features that Dataform doesn't. It doesn't document
your models quite the same way. Although BigQuery has a lot of built

(46:35):
in documentation. I think that Google Cloud probably wants you to use that
instead of replicating DBT's models. And there are some little functions.
Like if you want incremental tables, it's kind of manual with Dataform.
It's really out of the box in DBT. You can do everything.
It's just a little different. Oh, I feel that's a sticking point.
I feel incremental tables is like... It is a whole thing.

(46:56):
The most important use case. It is there in Dataform, but you have
to write a little block of JavaScript. Okay, it's just a bit nigglier, like
it's harder to do. Yeah, exactly. And it's harder to do in like five
lines of JavaScript. It isn't that hard to do. But it's, you have
to know what you're doing. To me, though, the big difference is
with DBT. And the reason I switched is because with DBT, I was

(47:19):
almost always using DBT core, which means you have to spin up a
server to host it or run it locally. Whereas Dataform just runs in
BigQuery. So you don't have an application. And it's just one less thing
to monitor and that might go down at some point. So
almost everything I do is in BigQuery typically, unless a client really

(47:40):
insists that I use Snowflake or something else. So
it's just very convenient. If I'm already in BigQuery, I might as well
just spin up Dataform. So that's my answer.
Yeah, that's a good productivity one. And you don't find that you have
to customize it very often? Like the sources, your library is pretty
robust now where you really are like, I'm just pulling things off the shelf

(48:01):
or? I wouldn't say we're there yet. Okay. That's the hope. So we
have found that if we spend a lot of time in a data
source, it gets more mature over time. And so the way that we... Like Google
Analytics is really complicated. And there are a lot of people who have
built models and made them open source. And we've learned a lot from

(48:23):
different people. But we've kind of taken other people's ideas and put it
into a format for us that's very easy to edit the things that
need editing. So like different people track different custom dimensions.
So we've made that as easy as possible. Like we've pulled that out
in a separate thing. You don't have to get in and.

(48:43):
You can modify the core code when you want to change that.
It's like a separate thing. So stuff like that. But it's not,
it always gets you 80% of the way there. It doesn't get
you 100% of the way there. I mean, 80% is big.
80% is a lot. And in the world that we're talking about,
too, we used to charge lots of money for lots of time to

(49:06):
do lots of custom work. And we can do things super quick now.
Nice. Well, thanks for sharing the tips with us. You're welcome.
Cool. So yeah, my name is Adam Bowker. I'm a
director at Ricoh Digital Analytics. And one of the hacks that I've found
to be really useful is if you're dealing with
a mix on your team of more and less technical people, there's often a gulf
between people who are either end users or fairly technical end users that

(49:31):
might know some Excel or maybe a little bit of SQL but not
a lot, versus the very technical people in R or Python or SQL.
And often you have needs for the less technical people to still be
able to interact deeply with tables in your data warehouse or something.
And one way that we found that was really helpful for that is
to just use Google Sheets into BigQuery because you can just use those

(49:53):
as tables. So you can put line level data security on it.
You can do a little bit of that. But if you want,
end user marketers who are dealing with campaign codes or they have a
specific list of things they wanna analyze in some other table.
A lot of BI tools aren't good at integrating lists of 20 or
100 or 1000 values into something else. So it's a really easy way

(50:16):
to quickly get friendly user data into BigQuery in this case.
But I think there's probably other connectors too.
So it's just been a nice way to just get from a spreadsheet
to SQL and allow your SQL engineers to then do
your actual functionality. And is this something you're doing from the sheet
side to say, hey, push it into BigQuery... You can do it both

(50:38):
ways. You can do it both ways. Oh, okay. So you can have
a table or a query that just populates into sheets.
Or you can just say to BigQuery, this sheet is a table.
And then anything you change in the sheet will be updated in the
table. So you don't have to give them insert or update privileges for
them to be able to do that. And the bit that I actually
love the most about this is the opposite of what you've said.
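A minimal sketch of the Sheets-as-a-BigQuery-table setup Adam describes, using the google-cloud-bigquery Python client; the project, dataset, table name, and sheet URL below are placeholders, and the credentials need Google Drive scope in addition to BigQuery:

```python
from google.cloud import bigquery

# Placeholder project; credentials must include the Google Drive scope
# as well as BigQuery so the service can read the sheet.
client = bigquery.Client(project="my-project")

# Define an external table backed by a Google Sheet. BigQuery reads the
# sheet at query time, so edits in the sheet show up in SQL immediately,
# and the sheet's editors never need INSERT/UPDATE privileges in BigQuery.
external_config = bigquery.ExternalConfig("GOOGLE_SHEETS")
external_config.source_uris = [
    "https://docs.google.com/spreadsheets/d/YOUR_SHEET_ID"  # placeholder URL
]
external_config.autodetect = True
external_config.options.skip_leading_rows = 1  # header row in the sheet

table = bigquery.Table("my-project.marketing.campaign_codes")  # placeholder table
table.external_data_configuration = external_config
client.create_table(table, exists_ok=True)

# Analysts can now join the sheet like any other table, e.g.:
# SELECT * FROM `my-project.marketing.campaign_codes`
```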

(51:00):
Okay. So you've talked about basically making the data that's in the sheet
available to your engineers, right? I actually think it also provides a
gateway for your less technical data team to get an introduction to BigQuery
and to be like, oh, this is a familiar setting. This is what
it looks like here. Okay. And the funny thing is, like, I know

(51:22):
so many people that are like Excel gurus. And I'm like,
you can do SQL. Like, if you are very good at Excel,
SQL is the thing you should learn because it will be very intuitive
to you, I think. And that is creating a bridge for someone to
be then like, okay, I'm gonna learn BigQuery. I'm gonna start learning some
SQL, which I think is so cool. I found that to be generational

(51:43):
too. Like a lot of the people who are 30 or 40 and
up learned in Excel. All the newer people are learning R or Python.
So we found if you take the Excel concepts and bring them into
SQL, then that'll work for some people. HAVING clause, WHERE clause, that's
what this means. This is how you would even do a pivot table.
But if you know Python first, then that's not going to make any
sense. Yeah, totally. Totally. Yeah. So Brian Hawkins here. I'm head of

(52:07):
optimization technology at AtSwerve. To answer the question,
this is a feature related to GA4 where when they deprecated Optimize, their
internal testing solution, rather than put a new testing solution in place,
they opened up a new service, a new utility, a new API.
Where any third party testing solution can natively integrate with GA4.

(52:32):
Which is pretty cool. That's actually fucking cool. Yeah, very cool.
And so now everyone assumed you need Adobe Analytics to use Adobe Target.
And so what we built is, and it opened the door wide open
for me at MiaProva to support any testing solution with native APIs,

(52:53):
the ability to create audiences. And so what this service that Google does
is basically creates a special API where you can pass an event,
and then you can automatically create audiences within GA4 programmatically.
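The vendor integration Brian is describing is its own Google service, but the general idea of passing an event into GA4 programmatically can be sketched with the standard GA4 Measurement Protocol; the measurement ID, API secret, and event name below are placeholders, and the ID stitching and audience creation that a testing tool layers on top are left out:

```python
import json
import urllib.request

MEASUREMENT_ID = "G-XXXXXXX"    # placeholder GA4 measurement ID
API_SECRET = "your_api_secret"  # placeholder Measurement Protocol secret

def send_experiment_event(client_id: str, experiment_id: str, variant: str) -> None:
    """Send a hypothetical experiment-exposure event to GA4 via the Measurement Protocol."""
    url = ("https://www.google-analytics.com/mp/collect"
           f"?measurement_id={MEASUREMENT_ID}&api_secret={API_SECRET}")
    payload = {
        "client_id": client_id,  # should match the GA4 client ID used on the site
        "events": [{
            "name": "experiment_exposure",  # hypothetical event name
            "params": {"experiment_id": experiment_id, "variant": variant},
        }],
    }
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

# Example: record that client "555.123" saw variant B of a hypothetical test.
# send_experiment_event("555.123", "homepage_hero_test", "B")
```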
And so this is my favorite, most exciting thing, because now anyone that's

(53:13):
using GA4 can use Adobe Target, just out of the box, turnkey.
And it's very similar to what Adobe Target and Adobe Analytics have,
where there's like a native integration, basically stitching and aligning
visitor IDs. But it's also cool from my standpoint, because MiaProva has
always been, we've got lots and lots of Adobe customers, but now we
have several GA4 customers. And now, folks are coming to us talking to

(53:36):
us like AB Tasty or other testing solutions that's using GA4 as a native
reporting or so. And so it sounds like you're really taking advantage of
this feature. Do you feel that others in the industry know enough about
it or do you feel like it's not, like it's kind of... Is
it a known known or is it still a little unknown?
I think it's a little still unknown because it's really got opportunity

(53:59):
way beyond testing. So this was something that Google did with Optimizely,
with Convert, and I think AB Tasty. AB Tasty initially, before releasing
it open to everyone. So those three vendors had it, but a lot
of companies are still relying on the handoff from the testing solution
client side of this, whereas this integration service basically automates

(54:22):
audiences. As soon as it sees an event, it creates the audiences in
GA4 for analysis, and it flows into BigQuery. Oh, that's nice. Which
is really slick. Yeah. And so the Adobe Target community,
because it creates audiences automatically within GA4 and then in a BigQuery.
Those in the Adobe community can use those audiences in advertising,

(54:45):
which is an activation layer that the Adobe Target community hasn't
historically had. Amazing. Such a good one. Hey, everyone. I'm Krista Seiden
from KS Digital. So I'm going to talk about Google Analytics 4. And
the thing that I find most exciting about GA4, I probably say that
all the time. I love this. It's my favorite feature. But I think

(55:07):
the thing that I find most exciting is the ability to customize any
report you want and then customize your left nav to meet your business's
reporting needs. So you could never do that before in GA.
And now you can, for example, if the primary metric of an out
of the box report is not what your company looks at,
you can change that primary metric or you can make a whole new

(55:27):
report collection that is just a regional based or a team based type
report collection in your left hand nav. So I think it's really flexible,
really cool. So previously it was all like preset and you had to
choose like from boxed up solutions versus like now you literally make your
own kind of navigation. Exactly. Yeah. So you still have the out of

(55:50):
the box thing that's like set up when you install Google Analytics,
but you can totally change it and customize it and make it your
own or make it you know what's going to be the best solution
for your organization. Nice. That feels like you can probably make it a
lot lower pressure, too, for newer people to the tool. It's probably a lot
less overwhelming to be like, I have to look through the whole list

(56:10):
and know what I need to pull out. Yeah, totally. I think there's
positive and negative to it, right? You can do almost anything you want,
so you can make the reports that you want or need.
But first, you have to know that you can do that,
which not everybody knows. And then second, it's
not an insignificant amount of work to completely customize your account.
But if you're... Especially if you're a larger company and you want to

(56:32):
set up reports for various groups and whatnot, if you have that kind
of like admin function, you have somebody you're kind of responsible for
really setting it up and making it the most useful for your company,
you can totally do that. That's so cool. And you obviously are seeing
like what different companies are doing through your work. What are some
of the main things that people customize? Like what are you seeing happen?

(56:56):
Yeah, so I think one of the biggest complaints people have about GA4 is
that there are no views like we had in Universal Analytics.
So you used to be able to have like a view for your
European traffic, a view for your US traffic, a view for your product
team, a view for your marketing team. So we don't have that in
GA4 anymore. Everything is property based, but you can make collections

(57:16):
of reports that are filtered down or edited to just be for certain
use cases. So you can kind of back your way into that use
case to have certain areas of the UI that are dedicated to different
teams or different needs. So, my name is Fred Pike. I'm a managing
director at Northwoods, an agency in Milwaukee, and I lead the GA and GTM

(57:37):
practice area. And my favorite tip is about GTM, which is my fricking favorite
Google product ever, so hands down. So the thing... There are two things
I do. I know you said only one, but two is better than
one, right? Yeah. Yeah. And we also, we love an extra.
We love a twofer around that. We love a twofer in analytics. So, one
thing is every time I create a tag,

(57:57):
I create a custom parameter called tagname, and then include the name of
the tag. And the reason that's useful, you look at that, you look
skeptical. I'm like, it's been a while since I've been doing GTM, so
I'm like, why is he going with this? Yeah, okay. We're gonna cut
this guy. No, no, I think that's where you're going. Sorry. So the
reason it's useful is 'cause when you're debugging and you're looking at

(58:19):
an event name, you don't know where that's come from unless the tag
name is associated with it. So, so many times, I'm trying to figure
out where the heck did this tag come from, or did this event
come from? And if I see the tag name, I know it's mine. If
I don't see the tag name, I know it's some other source.
And there can be... There's like five or six different ways that events

(58:40):
can get sent to GA 4, but at least I can try to
figure out where that event came from. And
once I know that, then that helps me down the troubleshooting line. Yeah.
Big time. So the bad thing about the tag name is that there's no
way that I have found so far to do this automatically.
Yeah. So you have to remember, okay, I'm gonna add it.
I'm gonna copy the name of the tag and I'm gonna paste that

(59:03):
in my parameter field, my value field. And so if you
don't do that, then you're gonna get bad information 'cause it'll default
to the Google config tag if you don't give it the proper name.
So that's the downside of it. But if you're consistent, it's really helpful.
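(And if the property's BigQuery export is turned on, that consistency pays off in queries like the sketch below; the project, dataset, and the parameter key are placeholders for whatever was actually set up in GTM.)

    # Sketch: using the tagname custom parameter from the GA4 BigQuery
    # export to see which GTM tag sent which event. The dataset name and
    # the parameter key ("tag_name") are placeholders for your own setup.
    from google.cloud import bigquery

    client = bigquery.Client()

    sql = """
        SELECT
          event_name,
          (SELECT value.string_value
             FROM UNNEST(event_params)
            WHERE key = 'tag_name') AS tag_name,
          COUNT(*) AS events
        FROM `my-project.analytics_123456789.events_*`
        WHERE _TABLE_SUFFIX BETWEEN '20241001' AND '20241007'
        GROUP BY event_name, tag_name
        ORDER BY events DESC
    """

    for row in client.query(sql).result():
        # Rows with a NULL tag_name came from tags or sources that
        # never set the parameter, which is exactly the debugging signal.
        print(row.event_name, row.tag_name, row.events)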
Now, for the people like me who haven't used GTM or even GA

(59:24):
in a very long time, have I got it right that, so you're
putting it in the custom dimension 'cause the tag,
there could be multiple tags that fire for each event, or if I... Like,
I'm trying to figure out how you wouldn't know the tag's name and
how it relates to the event. So that's a great question.
You would think that the tag name would relate to the event name.

(59:46):
So like add to cart, and the tag name says add to cart.
But for add to cart, that's probably the case. But many,
many times people say, oh, the tag name is, we're going to do
the dog conversion or something, and the event name is something totally
different. Form submission or whatever. It's a real lack of taxonomical

(01:00:06):
control sometimes in creating implementations. And that happens a lot. And
especially if you walk into an implementation that's not your own,
right? Exactly. So you walk in as a consultant
and you start fixing stuff, and one of the first things you're doing
is trying to figure out how is this event even firing?
You can go back in and dig it out in GTM.
Yeah. And one of the things you can do in GA4 is in

(01:00:27):
the admin section, you can modify or create events based on other events
coming in, and I hate that. It works at the browser level.
It doesn't work on the server level. And so there's all types of
problems with that. But you can get such confusing tags in that modify and create events section. So I always, once I see that,

(01:00:50):
if I can't convince a client to get rid of it,
I put a GA4 admin event as a tag name. So at least
I know it's where it's coming from. So
that's one of the things. That's a nice tip. And I promised you two, right? All right. Oh yes. There you go. Lucky second. All right. Yeah.
Okay. So the second one is related to notes. And in tag manager,

(01:01:12):
you can add a note to anything, a variable, an event,
a trigger, a tag, whatever. Use it, fricking use it. So
tell people, tell yourself six months from now, what was I trying to
do with this? Or if you're copying from somebody, if you're copying something
from somebody's blog, include the link to that, so you know where that

(01:01:32):
came from. So again, six months from now,
you're not figuring out why am I doing this? Where the heck did
this come from? So that has been really helpful to me too.
That's awesome. Yeah. 'Cause yeah, a lot of times you'll be like,
I need to find an answer for this. And you find a good
solution on like some blog post, but then a couple months later you're like,
where did I find that? And you're Google searching all over again.

(01:01:54):
And that happened 'cause I took over a client years ago and one
of the notes said, Larry found this in a blog. And it's like,
oh my God, who the heck is Larry, and what blog? So it's like
they documented it, but it was worthless. So that taught me to back that
up. You can go down your list of usual suspects. There's not that
many good quality GTM blogs out there. I'm like, there's also not that

(01:02:16):
many Larrys at the company, surely. Well, unless it's, like, 50,000 people. The big hospital there? Okay. All right. There were a lot. It
was a new client. So I got to know Larry over the years,
but... Well, those were some great tips. Thanks so much for coming along.
Yeah. Thank you Fred. Absolutely. My pleasure. So those responses all just
happen to be very Google ecosystem heavy, but these are some really great
ideas, I think for folks to try out inside of their own work

(01:02:37):
and organizations. Yeah. Absolutely. All right, jumping into our last and
final hop on the mic question from MeasureCamp Chicago. All right, well
thank you for joining us. The question that we are interested in your
response to is, when interviewing an analytics candidate, what is your favorite
question to ask and why? My name is Adam Greco, and I'm a

(01:02:58):
product evangelist for Amplitude. I've been in the digital analytics space
for, oh, 25 years. Early employee of Omniture. Been a consultant,
been on lots of different sides. So. You've seen some shit.
Yeah. You should have seen me when I had a lot more hair.
So I kind of look at this question two ways. So you phrased it

(01:03:18):
as if you're interviewing someone, what is a question you ask them?
And I actually use the same question whether I'm interviewing someone or
if I am interviewing for a position. My big thing, and it's actually
something I just talked about at MeasureCamp, I did a session on how
to turn analytics from a cost center into a profit center.
And so the question that I like to ask an interviewee or if

(01:03:40):
I'm interviewing is to basically say, can you show me examples
of where you have turned data, not just into insights, but can you
point to specific things that have changed on the website or mobile app
that wouldn't have changed unless someone had figured that out through data?

(01:04:02):
So as an interviewer, I wanna understand, does a person who I'm interviewing
really have experience, not just running reports, but actually saying, here's
a way I could show you that I found data, I figured out
an insight. We decided this is a change to make, and six months
later here was the cost that we either saved the company or the
incremental revenue we drove. And if I'm interviewing for a company,

(01:04:25):
which at my age I don't do as much, I wanna ask the
team that I'm interviewing into, are you a cost center or a profit
center? And if you say you're a profit center and everyone loves our
analytics team, then show me examples of where you have done the same
thing. I love that. I love that. That's nice. One of my first
roles in digital analytics, we were like begging, borrowing and stealing resources

(01:04:46):
from other teams to run experiments and things like that. So one of
the ways that we really got the UX designers and some of the
graphic designers really excited is to add onto their portfolio some of the outcome numbers. Some of the, like, but what did this do for the business? And so it was like, sure, that was beautiful imagery, those fonts, chef's kiss, however, what did it do for the
business? And that's what actually got them really excited to partner with

(01:05:08):
us. So I think that question where they're asking on either side really
inspires something that's very insightful about the experience you or that candidate is about to have at that organization. Yeah. And then I like that
you bring it to the interview when you're looking for a job because
that says so much about the company you're thinking about joining,
as opposed to maybe other companies you might have an option of working

(01:05:30):
with, which is a big determining factor in kind of the quality of
life you'll have inside of that company, 'cause it's... Yeah, exactly.
And when I worked at Salesforce, I'll tell you the team that I
took over, they weren't super happy at first, because they were just running
reports. And I basically either once a month or once every two weeks,

(01:05:50):
I said, listen, I know this is gonna sound crazy, but I actually
want you to work at home and this is way before COVID.
I want you to work at home for one day and I want
you to just turn everything off and I want you to go use
salesforce.com, find something that you think stinks, and then go into the
data and see if your theory is supported by data. And then I

(01:06:10):
want you to go work with a couple of designers or a couple
of people in the company and say, what would we do to fix this problem?
And every month I had like five or six people on the team.
I got five or six really interesting ideas.
I would go to my boss, his name was Kendall, he was the CMO.
And I would run them by him and we would try a couple
of these and to tell you the look on an analyst's face when

(01:06:33):
the CMO says, I wanna try your idea.
And then we try it. And if that turned out to either save
us a bunch of money or make us a bunch of money,
I used to joke with my wife, I said, that employee of mine
is locked in for a year because at what other company are they able
to say that they had a direct impact? They could say I impacted
salesforce.com, and that is such a powerful thing to do for employees.

(01:06:57):
And if you are an employee at a company and your company...
If you can't prove that everything that you do every day is leading
to the bottom line or helping the company in some way,
why do you wanna work there? And so go find another job and
find a company that values you enough to really give you that empowerment.
And I think that's what makes the digital analytics field really exciting

(01:07:21):
is there is an opportunity to have an impact, but I think we
get a little bit lazy sometimes and it's hard and I don't think
we try hard enough. And I always tell people like, try harder.
If your company's not doing this, go to another company or make a
change. Push them to do this. And if your boss doesn't get it,
then there's other bosses who will. Hi, my name is Sam Burge, and

(01:07:43):
I'm senior manager of data science and analytics. So a little bit of
background. We have done a lot of college hires in the past and
a lot of times there's a lot of nerves in it.
There's a lot of pre prep and training. So they go through a
very specific format to answer a question. So over the years,
I've actually picked two questions, if that's okay. Yes. We'll allow it.
No, thank you. No. Because one is more at the beginning of the

(01:08:08):
interview to break the ice, which is like, tell me something you're really
proud of. It can be in your school life, it can be in
your work life, philanthropy, a project that got you really excited and
can you just walk me through that? Nice. I like it. So just
opening up and like... Getting them comfortable. Getting them comfortable,
getting them to talk about something that is really great.
And then if they continue to interview and I'm like on the fence, the

(01:08:30):
interview is a little stilted, you know they're going through a process,
you're not getting all the detail, and I hit them with a curveball. And
I either ask them about Halloween or I ask them about like one
of their favorite sports or holidays. So what's the actual question you
asked about Halloween? So for example, somebody I asked before, I was like,

(01:08:52):
so tell me how do you celebrate Halloween, and do you enjoy dressing
up, going in the office or outside? Do you like trick or treat
with the family? And I feel like it kind of defuses the moment
for a second, and they get excited and they talk about this or
I ask them like, what's your favorite holiday and what do you really
love about it? Like what makes this the time of year that you
enjoy the most? And I take it away and then I try one

(01:09:15):
more on point question about the job. And then I usually have the
lay of the land if it was nerves or something else,
or if it was really just... Not a good fit. Yeah, not a
good fit. I love that. I love the first question, 'cause a lot
of times in an interview process, I'm really looking for
what I call, like, watching people spark.
So like what pops? And so if I can get that moment in

(01:09:38):
that interview, I'm like, okay, I finally learned something about you
and it's so valuable. So that first question is I think a really
great one for that. Great way to open. Right. And you learn so
much. So if they pick a project right, and they walk you through
it and you can see that they're taking certain steps or the way
they talk about their private life and that spark comes out,
but you can also tell is that the right fit for this position?

(01:10:01):
It sounds a little horrible but like, oh, is it more technical?
Is it more strategic? That's right. Which role did they take on?
What did they do? So that's why I really like it and people
get excited and they feel a little more comfortable I feel like.
Yeah. I love to see people share their passion
and sometimes when they've worked really hard on something, it really
comes through. Yeah. And it's lovely. And you learn so much,

(01:10:21):
like it's so much fun and or you find common ground and you
can really start getting into a really great place of a conversation.
Yeah. I like that. It's good culture fit stuff comes outta that too.
Yeah. Well the Halloween question is also really good for culture fit,
right? We dress up in this office, so. I was asked, I actually
stole this. I was asked in an office before, 'cause that's what they

(01:10:42):
did. They went all out on Halloween and everybody's listening, who knows
which company that was, but yeah, they went all out on Halloween.
They're like, can you come with us? We have won multiple years in
a row. Very fun. Oh wow. Very fun. I like it. No pressure.
No pressure at all. But then when we went remote, I adjusted it and I
asked about the holiday or something else to just like get that feel

(01:11:05):
because that question defused me in that interview and I've used it since
then. And it's like a good figuring out at the end if this
is the right thing. Love. My name is Ying Liu. I am the
senior digital analytics manager for Adobe's Experience League website.
So I typically ask this question at the end, 'cause usually you go

(01:11:26):
through and introduce yourself or you do challenges, etcetera. That's normal. But what I really wanna get out of the candidate is how do you improve your analytical skills? So typically, that's how you can ask an open-ended question, and then you can tell if this person has a fixed mindset or a growth mindset. I like that. I like that

(01:11:48):
a lot. I've asked similar questions over the years too. I'll ask people
like, how do you gain new information or build your skill sets in
the analytics space? Yes. I remember I hired an analyst just 'cause they
mentioned... I was just gonna say that. Sorry. Although you're like, check,
check, check. I was like, perfect. You're in the right head space if
you're... Yeah. Everything else. I'll do it. So what are some of the
things that someone could say like, oh I take Coursera courses or I

(01:12:13):
go to conferences. Like what are the things that when you hear it,
you like really light up, like you get really excited, 'cause you can
tell that they're very eager and proactive and like really passionate about
this industry. Oh yeah, a great tick for me is if people listen
to podcasts like, the Analytics Power Hour. Oh, I've heard of that. There you go. Right. A podcast. Please tell us more about... Yeah. That's one really good

(01:12:34):
thing about it. And also people go to conferences such as MeasureCamp Chicago
as we are here today. Or if they go
other things like, I don't know, Adobe Summit, they go to Superweek
in Budapest. Nice. So those are the opportunities you can network with people
and also grow your industry knowledge. As we are digital analytics professionals,

(01:12:56):
things change so much and so quickly here, so you need to really
absorb all the opportunities you have to upskill yourself.
I have noticed, I wonder if, in your experience, you've seen this at all,
is there are some people who are like not as outgoing or don't
learn in the same ways. And so like I've always tried to leave
open sort of like, okay, well maybe you don't wanna go to a

(01:13:17):
conference 'cause that's not your style or like you're a senior, kind of
more introverted or whatever. But like do talk about the ways that you
do that. But yeah, like people read books or take courses or
collaborate on like projects with others or take on little projects.
Like those are the things I often look for, too. Yeah, 100%.
Yeah. And you mentioned about the blog before, right? And also even the

(01:13:41):
traditional media, like read a book. We still refer to Web Analytics 2.0
that's 10-plus years old, but still the Bible in the industry, right? So if that person says, yep, I read books and I try to upskill myself through that channel, that's totally fine. Yeah. Yeah. Yeah,
or if they get fired up on Measure Slack, that's
another good one. Participating in communities like that is really

(01:14:03):
helpful. Yeah. I like that. And also depends on your industry knowledge,
right? Sure. For instance, if you are more towards Google Analytics,
perhaps you read more about that or you go through Google trainings or
if you use Adobe products, you go to Experience League or you use
Adobe training materials on the website. So those are great examples candidates

(01:14:24):
can show their desire to learn. I like that. I like the way that,
'cause there's like... And to your point Michael, there's like something
for everyone, but like show us you're in it, right? Like what's your
thing? And so yeah, it's a good question. I might pick that one up permanently
as well. I remember 'cause when I started out I was like,
well I want everyone to be like me. And then as time went on it was
like, well Michael, unfortunately that's just not gonna happen. So

(01:14:46):
maybe you should be open to other perspectives. Yeah, that's good.
And the other thing we didn't talk about is social media as well, right?
Oh yeah. So for instance, if you follow people on Twitter/X, or if
I see the candidate has mutual connections in the industry, so you can
see how many connections they have on LinkedIn, for example. Yeah. How many

(01:15:07):
of those are mutual connections that you know really well in the industry?
So that can be also a good indicator, not necessarily a question
you ask during the interview process. But it's a great indicator that this
candidate is eager to learn, which is what I looked for the most.
Love it. Awesome. Oh, those were so good. Which one do you think
you're gonna steal when running your next interview?

(01:15:29):
If it wasn't obvious, it's absolutely gonna be the Halloween question.
I think that that's like a great way to open it up.
I know. It was really good. All right, well we had so much
fun asking these questions, but we also had a great time doing the
live show recap at the happy hour. So we'll transition over to that.
Hi everybody. Welcome. It's the Analytics Power Hour.

(01:15:51):
This is MeasureCamp Chicago. Thank you so much for having us.
I wanna introduce you to my co-hosts. Moe Kiss, all the way
from Australia, right? And Tim Wilson, all the way from Columbus,
Ohio. Julie Hoyer from Cleveland, which rocks. Val Kroll from right here

(01:16:14):
in Chicago. And I'm Michael Helbling. Alright, we also wanna give a shout out here to all of the sponsors that made today possible. That is Tealium, Amplitude, and, of course, Ken Riverside and the whole Four For Productions crew, doing great things for our podcast. Alright, we just

(01:16:38):
wanna share a couple of thoughts from today. We had a great time
hanging out everybody, but what were some of the highlights of today for
all of you? Highlight for today, I mean, I didn't get to go
to as many sessions, I would have. Actually, one of the highlights is
like the needing to in your hands as to which session to go
to, 'cause every single slot had really tough choices, but I did go

(01:17:02):
to John Luvitz's custom GPT, which I've seen him talk about before.
I'm intrigued with what he's done. It was fun, kind of engaging,
building a custom GPT on the fly, so
that's gotten me that much closer to actually trying that out.
And then I'll say... Because Josh Silverbauer, I got your last name right,

(01:17:23):
didn't I? I just had one of those moments and I'm like, shit, did I just...
The parody song writing about analytics, which was, like, 30 minutes and we had more songs than we could even get through, so to me, it felt like the spirit of what an afternoon session should be, so that was another great one. Nice. That's a good one. Fun. Go

(01:17:48):
down the road, I guess. Alright, so I guess... Because Michael really,
really guiding the flow of the... I did my job.
So I went to a couple of sessions. One, I think he's still
your Alexia session on third party cookie strategy.
Just kidding, I was like, your closing slide. So no, that wasn't his

(01:18:10):
actual talk. It was server side, the good, bad, the ugly,
and I've never seen a presentation that kinda walked through all the different
players and the pros and the cons, and it led to a really
good discussion, so kudos on that one. And there was another session that
I didn't go to, but I ended up talking to multiple people about... Are we
going to do server side tagging on the analyticshour.io site, or are you running
with that now? Yeah, yeah, yeah. It's on my to do list. Cool. Awesome. So

(01:18:31):
to be fair, I learned everything I need to know in your session, so I'm all set. Good. Bing, bang, boom. So I didn't actually go to
this one. It was by Florent doing vendor evaluation without losing your
head, and there was a pro tip in that one that I thought
was really cool. We're all familiar with RACI, responsible, accountable,
consulted, informed, but he added a B to the top for budget and

(01:18:51):
to be thinking about the person who sets the budget. And I was
like, That's fucking gold. So, I can't wait to do that.
Did you go to it? I didn't... I was doing Mike's session at
the time, so. But you have... Where did you get the pro tip?
Yeah, I got a hot tip from your wife.
Or those... It was good. I was like... There's a lot going through

(01:19:14):
here that I should say that probably does not need to be on
the mic. Julie almost did a session today, I don't know if you guys knew, but the board... Thank you to everyone who filled up the boards. If the board didn't get filled up by the end of the second session, she has the card to prove it, she was pulling it out, and it was gonna be called... Married to Tim Wilson for 30 years,
ask me anything. I would like to point out... I'll go ahead and give you

(01:19:39):
the feedback now. Honey, we're recording this, you're not on the mic, so...
Married to Wilson for 30 years, ask her anything. So the topic was,
I've been married to Tim Wilson for 30 years, ask me anything.
And so, the description was, if you know, you know.
Next year it might be, well, I was married to Tim for 30 years. Oh my God.

(01:20:03):
So good. So good. So for me... Look, I have a really bad
habit. I have been to many, many MeasureCamps. Actually, I help organize the one in Sydney, there's a whole crew of us organizing it, coming up. No sales pitches. We do events. Anyway, I love MeasureCamp,
but I actually just like, this is kind of why I'm part of
the podcast, love hanging out and not always going to sessions.

(01:20:28):
You will normally find me just hitting people up, mainly asking for career
advice or hitting people up with my work problems, but to be honest,
just the nicest bit is, it felt like a reunion. There are so many people
far and wide that I have met across the industry or spoken to
and never met, and it has honestly just been incredible. And as someone

(01:20:50):
who has been to lots of MeasureCamps, Chicago's first year? Shit hot. You
guys nailed it. Such a great vibe. So yeah, just loved all the
breakout sessions, and with the chats and drinks, then we can really talk
about the good stuff. Can you translate that... Sorry. Shit hot is good

(01:21:10):
in Aussie. Like, real good. Yeah, yeah, we flip it. Yeah.
Okay, good. Alright, so can you guys hear me okay? Yeah. Okay. So I
got to go to two sessions in between our recording, and I'm really happy with the ones I got to attend. There were so many I
wanted to try to get to. The first one I went to was

(01:21:33):
human centered design for AI with Gina Grant, and it was so good.
I feel like we talk about ethics and AI a lot and the
biases that we know are there, but I felt like this presentation was
so good at giving a tactical way of trying to get there,
we always talk about the outcome of being ethical and not biased and
understanding, and making sure we use it responsibly, and

(01:21:55):
starting with the human centered design, it was understand, ideate, synthesize,
prototype, implement, and it was all about reflecting on your assumptions
ahead of time, saying, what was the outcome you wanted, who do you
need to design it for and think of the people that you didn't
say you wanted to design it for, and would they still be involved,
talking about how it's an iterative process. And I just thought the examples

(01:22:15):
that were given too were so great. There were projects that she did
with students, some of them were really cool apps for lawyers and
things, and it was just really inspiring what they were able to do
with AI and the way that they actually approached thinking through designing
these AI tools and agents and things like that. So I thought it

(01:22:35):
was a really, really great and inspiring session. And then the other one
that I was able to go to and loved was the emperor has
no clothes on, which I... I wanna do that way. It was wonderful.
It's one of those talks too, I felt like you have that niggling
feeling at the back of your head when you talk about those topics,
and then when she started, it was like, Oh yes, that's exactly how

(01:22:56):
I would describe it. It's kind of all this wishy washy and nothing's
really clear, but it's really important. And it seems so obvious now,
but when she said, cookies don't equal tracking, it was very much like
a light bulb moment. I'm like, I can't wait to use that.
That is so simple. So good. So that was the other one that
I loved. You took notes? I did. Did you take them on your
phone? Yeah. Fucking millennials. How do you do that?

(01:23:20):
I don't know. I listened to it and typed. Yeah. I have two things.
But Mike... Two things. Hold your fire Tim. Okay. There's many generations
for you to be mad at. So first, I got to attend Moe's session
today, which I really enjoyed, and especially there was a section there,

(01:23:41):
where Moe talked about how important it is to deliver feedback, and deliver it directly, as a people leader, and I thought that was just a
really great and underrated point. And so I really appreciated hearing that.
The other thing I will say is, it's about the people in this
room and kind of tagging on to what you said, Moe. There's so
many people in this room that I've met before. And

(01:24:03):
so many people I met today for the first time, but I've interacted
with on Measure Slack or on Twitter or LinkedIn or whatever.
And I don't know if other industries have this kind of stuff,
but it's too late, I'm not switching, I'm staying in this industry.
It's so much fun to be part of
communities like this and get a chance to do that. And I'm immensely

(01:24:24):
grateful, and thank you to MeasureCamp Chicago for putting such an amazing
event together where we could all be here and do that.
And it was so cool to see... I so didn't expect to see so many people from so many different places come to Chicago.
I guess it's an easy flight, so we'll see you again here next
year. And of course, no show would be complete without a huge thank

(01:24:47):
you to Josh Crowhurst, our producer, who's in Hong Kong, but we'll get
him here eventually and thank him for all he does for the show.
And all he's about to do for all... Yeah. That's right. No shortage of good
outtakes. And of course, I think I speak for all of my co

(01:25:07):
hosts when I tell all of you in Chicago, eat deep dish pizza, keep
analyzing. Thanks for listening. Let's keep the conversation going with
your comments, suggestions and questions on Twitter at Analytics Hour, on
the web, at analyticshour.io, our LinkedIn group and the Measure Chat Slack

(01:25:29):
group. Music for the podcast by Josh Crowhurst. So smart guys wanted to fit in, so they made up a term called analytics. Analytics don't work.
Do the analytics say, Go For No matter who's going for it,
so if you and I want to feel the analytics that go for,
it's the stupidest laziest, lamest thing I've ever heard for reasoning in

(01:25:52):
competition. Enjoy the rest of Measure Camp and try to share accurate information.
Do we have time for me to grab some water? Sure. Are you
sure? Yeah, but why don't I get one of the girls to get

(01:26:27):
it for you. Okay. Thank you. May I have your attention?
Thank you. I love attention. All right. We're just going to do a
really quick thing and thank everybody for holding on through all the little

(01:26:49):
hiccups we've had with the audio. It's totally normal for podcasts to do
this. And... If everybody could come back tomorrow same time, we'll have
this... Yeah. We'll redo this whole thing.