
August 20, 2024 52 mins

Broadly writ, we're all in the business of data work in some form, right? It's almost like we're all swimming around in a big data lake, and our peers are swimming around in it, too, and so are our business partners. There might be some HiPPOs and some SLOTHs splashing around in the shallow end, and the contours of the lake keep changing. Is lifeguarding…or writing SQL…or prompt engineering to get AI to write SQL…or identifying business problems a job or a skill? Does it matter? Aren't we all just trying to get to the Insights Water Slide? Katie Bauer, Head of Data at GlossGenius and thought-provoker at Wrong But Useful, joined Michael, Julie, and Val for a much less metaphorically tortured exploration of the ever-shifting landscape in which the modern data professional operates. Or swims. Or sinks? For complete show notes, including links to items mentioned in this episode and a transcript of the show, visit the show page.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Before we start the show, we have a special announcement. This fall,
the Analytics Power Hour crew is headed to MeasureCamp Chicago. That's right.
Even your co-host from all the way in Australia will be there
on Saturday, September 7, to join in all the unconference MeasureCamp fun.
I'm so excited that we're all going to be together. Well,

(00:21):
except we'll be missing Josh, but we'll have him there in spirit.
But I'm curious. I've never been to a MeasureCamp. What's it like?
What's it like? Okay, well, I've been to one of them in Europe,
and I've been to, I think, all of the ones that have been
in person in the US. And to me, the most iconic feature is
that the schedule is created on the day of the event,

(00:41):
and everyone who attends is encouraged to actually lead a session based
on whatever they're finding most interesting or most useful, or even maybe
what's kind of vexing them the most of late. So, it's really all
about an exchange of ideas and having some really in depth and rich
discussions with your peers. I've also been to quite a few,
and I've also helped with planning the one we run in Sydney.

(01:04):
And the truth is, it's just phenomenal. It's better than Christmas Day,
honestly. And one of my favorite parts of MeasureCamp is that they're held
on a Saturday, so it doesn't interfere with your work. And the tickets
are always free. Yeah. And I loved my experience at MeasureCamp Austin earlier
this year. I mean, it was so accessible to everybody, and it was
so fun. Okay, so what are we going to be doing there?

(01:26):
So we're going to be doing a couple of things. The first is,
we're going to have a room booked for us all day long,
where you can stop by and visit a couple of the co-hosts
and talk about what you've been talking about throughout the day or maybe
one of the sessions you're presenting. And we're also going to have a
couple of questions posted up on the board day of, and you can
come in and give us your answer to those prompts. And then,

(01:48):
at the end of the day, during the happy hour, we're also going
to do a short live show. Will there be shots?
So, mark your calendars for Saturday, September 7, at 9:00 AM at the
Leo Burnett Building, downtown Chicago, right on the river and just a couple
of blocks from Michigan Avenue. Get your free tickets now by heading to

(02:08):
Bit.ly/APHChicago and start thinking now about what you might like to present
or talk about. Awesome. We're headed to Chicago, but now let's start the
show. Welcome to the Analytics Power Hour. Analytics topics covered conversationally

(02:30):
and sometimes with explicit language. Hi, everybody. Welcome. It's the Analytics
Power Hour. This is episode 252. You know, some of the hardest parts of
the job in analytics are figuring out sort of how we fit into
the bigger picture and interact with the people and teams that are supported
by the data and analytics functions in a business. You know,

(02:54):
beyond the various hard skills that make up a great analyst or analytics
engineer, there's sort of a hidden navigation that has to occur to achieve
the outcomes that we all want to create, you know, positive impact on
the business. This episode, we might end up with more questions than answers,
but it for sure affects all of us to the extent that we

(03:14):
work within organizations and with other people. Speaking of other people,
let me introduce my co-hosts, Julie Hoyer, analytics lead at Further. Welcome.
Great to be on the show with you. And Val Kroll,
head of delivery at Facts and Feelings. Hi, friends. And I'm Michael Helbling,
managing partner, Stacked Analytics. But we needed a guest for this conversation.

(03:36):
And who better than one of our favorite guests from before,
Katie Bauer. She is the head of data at GlossGenius. She previously held
data science leadership roles at Twitter and Reddit, and today she is our
guest again. Welcome back to the show, Katie. Thank you. I'm glad to
be here. I think the punch card that you gave me,
if I get a third one, I get a free ice cream.

(03:57):
That's right. I'll be angling to get on again. Well, there has been
a lot of discussion over the years of the five-timers jacket à
la SNL. So, you know, you're now in the running for that as
well. Something to shoot for. Yeah. So, I mean, the sky's the limit,
really. But here's the reason. You keep writing these articles that we keep
passing around, and then we're like, we need to get Katie back on

(04:19):
the show because she thinks about this stuff the right way.
And so maybe as a starting point, let's just jump into a little
bit of a conversation around what you were writing about recently,
about the skills and the jobs in analytics and how you're feeling about
it right now, because I think that was pretty poignant. Yeah.
So this is in reference to a Substack post that I wrote recently

(04:40):
that's really a culmination of both seeing a bunch of thought leadership
types of posts that have been going out in the past couple of
months about why do data teams exist? Do they need to exist as
separate roles? Should they be disbanded, et cetera. Or like, is data a
job? Or is it a skill that someone else should have?

(05:01):
And it's also in reference to just conversations I've been having with friends,
with people I've worked with previously, and all sorts of different data
professionals about what actually is going to happen to the job of data.
And AI is really top of mind for people right now as you
see all these AI-to-SQL companies coming out that people get kind

(05:21):
of perturbed by, and I see them myself and I'm like,
well, my job's not writing SQL, so I don't really care.
I do a lot more than that. And that in conjunction with all
of the kind of doom and gloom posts that have been coming out
recently about whether data needs to be a job, I just eventually got
to a point where I had thought of what I wanted to say

(05:42):
on this, which is just, I do think it's a job.
And jobs are usually an assemblage of skills that get applied to a
particular situation. So this was kind of me working through it and really
just thinking about things that had actually happened to me that were examples
of someone thinks of my job as writing a SQL query or building

(06:03):
a model for them or something. But really, it's much more than that.
And they fixate on the specific deliverable because that's the thing they
interact with. But there's a lot that happens before that point comes up.
One of the things that you said in that article too,
that I loved, you mentioned how everyone needs to use data in their
position, but it's interesting that most of the time

(06:26):
they can't generate it themselves. To your point, right? Like there's someone
they always have to rely on to get the data. And so if
those skills took too much time of their original position and were spun
out to this job, I do find it shocking that so many people
are trying to claim then that the job will go away and that
it's not necessary. Like, where do you think that actually comes from though?

(06:48):
Yeah, I mean, I think with a lot of the thought leadership types
of conversations that happen in the data world, partly it's that it's data
people saying this, and they're kind of assuming that the cross functional
people that they work with enjoy the data work as much as they
do. And because it's getting easier for them to do, they're like,
oh, well, other people are going to be doing it at some point

(07:10):
too. And maybe it also comes from you start having cross functional people
working in your tools. The company that I work for now is a
very data hungry and data savvy company overall. And there are people working
in tools that I would never expect them to have worked in previously,
partly because the barrier for entry is so much lower, but partly because

(07:31):
I think it's just considered more expected. But I still find myself constantly
having to go in and advise around it or get them started or,
I don't know. Sometimes I almost feel a bit like
a lifeguard where someone's swimming in the data lake and you have to
help them when they get too far deep in because they maybe can't

(07:53):
swim against the current or they've just gotten a little in over their
head. Do you feel like that's a good thing, though? Because I feel
like half the time we talk about feeling like we're bogged down by
these requests for different breakdowns, different numbers, build a dashboard,
kind of like the light, the shallow end. Like, let's keep that analogy
going, the shallow end of the pool, and we want to be swimming in
the deep end. So do you think that maybe some people are being

(08:16):
too negative about it and that people being able to self serve a
little bit is actually going to free us up to do more of
the work we'd rather be doing? Yeah, I mean, I don't think it's
good or bad necessarily. It kind of depends on how you interact with
it. But I do think you're right that this frees up our time
to get to the meatier things. Like, maybe another version of this is

(08:39):
every SaaS tool you buy these days has a dashboard feature in it,
and the salespeople will be like, wow, isn't it great? You don't have
to talk to a data person to, like, learn about this tool.
And like that's true, but the questions never end there. And figuring out
how to stitch that all together, tell a bigger story, that is really
valuable work. It may be painful or annoying for us to do,

(08:59):
but treating it as something that's an opportunity to influence, an opportunity
to say how things are working, whether that's how we expect or not,
like, that's a huge opportunity. And partly, I think the problem is data
teams are not insisting on being involved in valuable things. A lot of
times we want to have requirements dictated to us because it's a lot

(09:22):
easier to just do what someone asks us to do, and it's harder
to know what to do in this difficult situation. And like,
this is a thought I'm having in real time. I haven't thought this
in depth, but it's almost like you're getting a little bit of a
switcheroo with cross functional people wading into data tools. It's like
they're kind of going into this thing that's maybe a little bit better
specified sometimes, and abdicating their responsibility for knowing how

(09:47):
things are supposed to work and what the data is supposed to be
reflecting. Because they're going back to the shallow end, no one's in the
deep end. Suddenly, like, someone's out there doing something that they
shouldn't be, or this metaphor's maybe getting overextended. But, like,
you needed someone to be over there, like, steering and guiding or making

(10:07):
decisions about what needs to happen. And if people want to go into
the shallow end and splash around, and they are leaving that other stuff
on the table, I think it's an opportunity for data professionals to get
into that space and drive conversations that need to be driven.
I love the way you put that. The data professionals need to be
insisting to be a part of those conversations more. I'm curious,

(10:29):
and this is for anyone, if we're talking about other people on these
other stakeholder teams dabbling in our role in data, do you think anyone
on the business side would ever say, like, "Whoa, whoa, you're getting too
businessy on me. Stop asking all those questions about how things work around
here." I just... When you were talking, I'm thinking, like, I don't know

(10:52):
if it would happen in reverse. It would be healthy. Yeah,
no, I mean, that's a great point. Like, why shouldn't we wade into
territory where we're not explicitly invited? Like, I don't think people
are gonna shoo us away. Yeah, I've definitely been told to stay in
my lane before, but that's because probably my approach was poor.
It sounds like a you problem, Michael. I don't know. Yeah,

(11:14):
no, I'm fully willing to admit that. Yeah. Well, I mean,
I guess, like, a version of it might be you provide someone a
recommendation and they're like, "I don't need that from you, but thank
you." It's time to step away from the show for a quick word
about Piwik PRO. Tim, tell us about it. Well, Piwik PRO has really

(11:35):
exploded in popularity and keeps adding new functionality. They sure have.
They've got an easy to use interface, a full set of features with
capabilities like custom reports, enhanced eCommerce tracking, and a customer
data platform. We love running Piwik PRO's free plan on the podcast website,
but they also have a paid plan that adds scale and some additional

(11:55):
features. Yeah, head over to Piwik.PRO and check them out for yourself.
You can get started with their free plan. That's Piwik.PRO. And now let's
get back to the show. One of the other things I loved in
your piece, Katie, is where you're talking about data literacy. And it was
so well put, and I never thought about it that way,

(12:16):
about how it could be mistaken for raw intelligence and how it can
make people, just even the premise of how that's framed can make people
feel intimidated or even stupid. But would love to hear you talk a
little bit about that. That's been one of the ideas or the solutions
of how we can bring some of this together. It's really common,
you know, to see programs being spun up to conquer it that way.

(12:38):
But we'd love to hear you just talk a little bit more about
what you've seen in that space and how effective or not that's been.
Yeah, I mean, this one is a big ball of wax,
but I guess this is from another piece that I wrote.
I think you're referencing the word "the SLOTH," with the SLOTH being kind
of a tortured backronym on my part, but it's a Statistical,

(12:59):
Logical, and Overthinking Hesitater. And that's proposed as an alternative
to the HIPPO, who's someone who just kind of does whatever they want,
doesn't care about the data. The SLOTH is someone who is so obsessed
with the data that it's actually a problem. There's a lot of different
versions of it, but one that I feel like I see a lot
is someone who, they know they're supposed to use data, but they're kind

(13:21):
of uncomfortable doing it. And they would never tell you that because the
expectation is that everyone is going to use data, and that's just how
business works. And if you're not able to think analytically, then maybe
you're just not that smart or not cut out for business.
And like, part of this comes from people telling me in private that
I've worked with, like, "I don't feel comfortable doing this. I would never

(13:42):
admit that I can't use the BI tool in public because it would

(13:45):
be embarrassing, but I can't. And I need help on this."
And it also comes from, like, I've had this happen across many different
jobs where some cross functional leader that I partner with comes to me
and says, "I want my team to be more quantitative. I want them
to use more data." And then I'm like, "Okay, great. How?"
And they can't actually tell me. Like, they want numbers and things.
Numbers and things. Yeah. We need more numbers.

(14:11):
Yeah, like, they don't know what the numbers should actually do.
It's almost like they want the numbers to help them make a case,
which maybe this is kind of like a HIPPO where they just kind
of want to do something, but they need numbers to justify it. Yes. So
they're trying to drag you in and help you, or help themselves make
a point, that maybe it doesn't have anything to do with you.
Maybe you don't have context. Maybe you don't even actually agree with it.

(14:33):
But they just want a number so that they can say that they
looked into it and they did the homework on it. Yeah.
It's almost like the data will tell me what to do if I
just somehow ask it the right thing. Usually... That's very common.
That's never... Usually, you have to think about what to do and then
let the data inform how you steer it, not give you all your
ideas. Yeah, well, and, like, a bigger problem related to that,

(14:57):
too. Like, the term "data science" maybe is falling out of fashion,
but it's one that I really like because science has theories and hypotheses,
and, like, there's, like, an actual model of the world that you're trying
to build. It's not just, "Here's a spreadsheet of all these observations
of our telescope." Like, you wouldn't do that. You would have a question

(15:18):
about, like, okay, like, what objects are in the sky that you're looking
at with this telescope, and what does it tell us? To study data,
what questions do we have? But a lot of times, people have this
idea that if they just have a dashboard that they can slice and
dice in all the right ways, eventually something's going to come up and
it'll be obvious what to do. And obviously, I don't need to tell

(15:38):
any of you that that is kind of a troublesome path.
So, I keep getting stuck on the part of your article because I
hadn't thought about this before, how you kept saying data was being asked
of people in their roles, and it became too much. And so they
spun it out and said, "It's taking up too much time and effort.
I need someone else to own it." But I find it so interesting
because everything we've just been talking about, like, the data professional,

(16:00):
doesn't always feel like we're empowered to own the numbers. We are requested
and asked and told, like, "Go fetch XYZ," most of the time,
right? Like, how come it is not more common to be looked at
as a partner? And then thinking back to, like, the pool,
right? Like, why aren't we allowed to dip our toes in the business
side as much? It's so hard to get that context. And we're kind
of siloed off, like, "Look at the numbers, but you don't get to

(16:22):
be in the conversations that are driving the, like, why they even want
those." And if we had that, we could be so much more helpful
to them. We'd probably feel less cynical about our roles, you know,
like, it's just this crazy, like, cycle I keep thinking through.
So, like, how did we get there? Why is it like that?
Have you found a place where that is less the case and there's

(16:43):
a more ideal way of working? Yeah, I mean, I don't know that
anyone's totally figured this out. But a thing that I tell my team
a lot, like, this is a company value we have at GlossGenius, which
is to strive for excellence and expect it. And that's something I really

(16:58):
hone in on with my team: it's our job to look at what's
happening and say, "Is this excellent? Like, is this actually good enough?"
And data is often a way to answer that question, or to ask
that question. Like, it's maybe the start of a conversation where we're
not the ultimate decision makers, but we can certainly create the conversations
that need to happen. We can tell people when we don't believe answers.

(17:20):
We can be a thorn in someone's side until they actually figure out
something that needs to be figured out. I think that's one way to
start. Like, truly, I think one of the biggest issues with data teams
is that we do not act like partners. We want people to tell
us the questions. And even the way that people ask follow up questions
when they get asked to do something can be very passive.

(17:41):
There can be a lot of, someone asks you for something and you're
like, "Okay, well, what's the value of this?" You shouldn't have to ask
them that. You should be trying to figure it out. I mean,
you can ask them that, of course, but a team that is a
partner to another team should already have context. You should be engaged
with the metrics of the business broadly. You should understand the strategy.

(18:04):
Your priorities should be the same priorities as the company generally.
And that is something that helps you step forward and be more of
an active player. It's wanting to achieve the results that the company needs
to achieve. And your role in it might be to drive others towards
them or help them figure out what's going wrong or what they need

(18:25):
to do to get ahead. Even if you're not the person who makes
the decision, ultimately. It's sort of like being an opinionated and trusted
advisor. So, can I ask you a thought experiment that I have?
Some conversation took me down this path, and I would love your thoughts.
How come the data professionals can't be the decision makers? Like,
if we're the ones that are supposed to dig around, look at the

(18:46):
data, understand how to use it to answer these important business questions,
like, what would happen if, I think, Val you had put it at
the time, like we were given the keys to the castle to say,
like, "Yeah, I'm gonna be the one that gets to have the power
to say, go left, go right, or like, we're gonna do this,
we're not gonna do that, because the data told me." Is that feasible?
Could it ever happen? Should it happen? Should it not?

(19:10):
Yeah, I mean, I think it does happen sometimes, and it just gets
called something else. Like, it might happen in an operations role.
It might get called growth. There's definitely a school of thought people
have that eventually, if you reach a certain point in your career as
a data professional where you want to call the shots, you need to
go to a different role. And maybe you don't call yourself data anymore,
even though you're doing the same things. I don't necessarily agree with

(19:33):
it, but I do think maybe a reason why people aren't gravitating towards
it as much is that it's kind of a scary thing to sign
up for. Because part of calling the shots means you can also be
at fault if something doesn't go well. And not everyone wants that.
Even if they may want to be the authority, they want to call
the shots. They don't want to be the one who's held accountable when

(19:54):
it doesn't go through. That's something more and more data professionals
should be comfortable with, I think. And there are probably areas where
it's more feasible than others, and it's probably something that's very
quantitative, like maybe it's running a lift testing program in a marketing
organization or driving pricing strategy or something. There are definitely
a lot of areas that I encourage people who work in data,

(20:14):
if they want to be decision makers, to sign up to drive an
initiative, and that means signing up for the consequences as well.
Yeah, that accountability piece, some people are really drawn to it,
but it can be intimidating. I know where your thought experiment is coming
from, Julie, because especially spending time, you know, more recently, myself
as well, on the consulting side, supporting an extension of an analytics

(20:36):
team. Starting a call by saying, "So, did
they do what we recommended or did they just do whatever they wanted?"
And like, that's just, "You know what, I should go back and check."
And you're like, "What? Like, how do you not already know?"
Yeah, very common. Yeah, well, and like, that impulse being missing is

(20:57):
an issue with a lot of data teams. Like, you should follow through.
You should not consider what you did a success unless you actually see
the result: you made a recommendation and they followed it.
And ideally, the recommendation also went well, but you can only ask for
so much at one time. And I think, Julie, one of the other
observations I have about that is when you take on a role that's

(21:20):
not a data role, like a leadership role or an operational role or
something like that, for me, it was also sort of a journey of
figuring out how I bring my data skills into that role effectively.
And it was actually pretty cool as I got into it to understand
how they helped me. And so as I made that transition,
it was actually like a benefit and actually that helped me perform better

(21:42):
than other people in certain areas because I was able to break down
the numbers and understand the data much better and understand why the data
was the way it was and explain that much better because I had
that experience. So, I do think there's quite a lot of talent as
being a data professional that serves you extremely well in other roles,
should you choose to take them on. So two cents there. So,

(22:07):
I know that I accidentally already jumped us to talking about your SLOTH
article, which is also amazing, but could we talk about each of them
a little bit? Because when I was reading that, I mean,
I fully need to forward that to my therapist because you helped me
process and identify why I had friction with so many people who I
thought were my evangelists internally. Like, I thought they were like the

(22:29):
people I needed to get buy in from when I worked internally to
make things happen. But I was always like, when I get an email
from them, just get so frustrated. But I was like, "That's why."
And so, yeah, lots of faces are coming to mind as you were
sharing them. So, I would love if we could talk through a couple
of those and some of the key markers and even some of the
ways that you recommend data professionals work better with those SLOTHs.

(22:52):
Yeah, sure. It's funny that faces came to mind for you because I
definitely had specific people in mind. Has anyone reached out to you and
been like, "I'm sorry, am I the distrustful SLOTH that you were writing
about?" I mean, actually, unironically, yes, someone did ask me,
and they weren't the person. But I was like, "No, no,

(23:14):
you're fine." Yeah. And you can always just deny it. Be like,
"Of course not." Yeah, I mean, this is a work of fiction.
Any resemblance to real events is purely coincidental, but nothing's purely
coincidental. But anyway, the three that I outlined in the post were the

(23:36):
uncomfortable SLOTH, which is kind of what I was describing earlier,
where this is a person who, they're expected to use data,
but they don't necessarily feel prepared to do so. They're really recognizable
by analysis paralysis or maybe uncomfortableness or unfamiliarity with specific
tools that they would be expected to use, particularly BI tools.

(23:57):
These are people who, they're gonna send you digging. They're gonna expect
what we were talking about earlier, where they eventually slice the data
in the right way, and then suddenly, like, the shafts of light come
out of the clouds and angels sing. You have your answer,
what to do. These are people that, like, a really ungenerous way to
think about it is they're kind of outsourcing their thinking and their decision

(24:18):
making to you as a data professional because they just don't know how
to engage with data whatsoever, and they won't tell you that.
And they'll use that position of authority that they have to just keep
having you dig. To actually deal with them, I would never recommend calling
them out. You don't want to be like, "You actually are a SLOTH,
and you're really uncomfortable, and you don't know anything." A much better

(24:42):
way... I mean, calling someone a SLOTH is probably never a good idea,
but at any rate, I don't think you should ever antagonize someone.
What I would recommend is just being really explicit about like,
"These are my assumptions. This is why we did this this way.
This is exactly what I mean. Here is exactly what I recommend,

(25:03):
and here are my exact next steps." This may feel frustrating to do
because it kind of feels like, well, you asked me for all this
stuff, why don't you know what to do? But this is kind of
what I was describing earlier in terms of, like, you can just tell
people what to do, and that is a form of organizational influence.
And maybe they don't have to say that you told them what to
do, but they're doing what you say. It's not optimal, but it is

(25:26):
still a form of organizational influence. It's not that bad. And in general,
I would say, like, lead by example in working with an uncomfortable SLOTH
of just like, admit when you don't know something. Admit when something
is complicated in data and try to build their comfort with it over
time. A second type is a distrustful SLOTH, and this one is a

(25:47):
little more insidious. This is a lot more of like someone is trying
to weaponize data against someone else. It might be a manager going after
one of their direct reports. It might be two rival stakeholders who you
work with both of them that are trying to get the dirt on
each other. It could also be someone who has a legitimate concern where
they think some team is underperforming, but they're not necessarily doing

(26:11):
it because they care about the business. They're doing it to get one
over on this other person. This one can be kind of hard to
recognize because it's very reasonable to ask questions about like, "This
thing is not working the way that I would expect it to.
Can you help me figure out what's going on?" There might be boundaries
outside of whatever systems this stakeholder works with that you as a data

(26:32):
person are aware of and can just help them connect the dots on
these systems that aren't automatically connected. A lot more of what you
should look out for in looking for a distrustful SLOTH is kind of
the context around the ask where, like, is this other person being brought
into the conversation that they're seemingly trying to get dirt on?
Are they considering a reasonable possibility that the person asking, would

(26:55):
they accept responsibility for this, or are they dead set on it being
this other person? Like, it's really more about what are their intentions
in asking this, which kind of requires a little bit of follow up
and context and understanding of the stakeholder relationships to get at
this. But it's important to dig because you really do not want to
get involved in fighting someone else's fights for them. These things never

(27:18):
go well, and it's just not fun to get caught in the middle
of something. And I guess in terms of dealing with them,
it's really about getting the surrounding context and making sure you really
understand where this is coming from and what they're going to use it
for and making sure that it is actually about business outcomes and not
just a spat between two people. And then the final type of SLOTH

(27:39):
that I identify in this article is what I call the dreaming SLOTH,
which is basically someone... This one is very relevant now in the AI
hype cycle, where it might be someone is really convinced that data has
some kind of magical properties that they've seen it used really well in
this one particular way, or they did this one thing at their previous

(28:00):
company, or they know this technique is really good and the data professional
is the one that is going to help them do it.
And for this one, you kind of need to be careful because it
can be really exciting to have someone come to you and be like,
"I think we could build a huge business around data selling or whatever
their pet thing is, or AI." And the thing to be careful with

(28:20):
on this one is really recognizing immediately, is this tied to anything
else we're doing? Is this a person who has credible organizational authority
to drive an initiative like this? Or are they just like kind of
coming up with a cool idea and trying to get you to go
do all the hard work of figuring it out? Like the example I
give in the article that I think is illustrative here is,

(28:42):
let's say you want to have a data selling business associated with your
company. If your vice president of business development comes to you and
suggests that, it's more plausible that they would actually be able to make
something happen there than just some random PM that you helped design an
A/B test for. Like that person, maybe they can really think through it,
but they don't have any authority to actually give you or grant you

(29:03):
resources to put behind you. So once they make the case,
what's going to happen? Probably nothing. Companies usually don't just let
a random person drive a huge bottoms up new business line,
and that's the kind of thing that you need to think about with
a dreaming SLOTH is, is this actually something that makes any sense for
the business? And in general, I would encourage everyone to be skeptical

(29:25):
when you start getting pulled into something that feels sort of too good
to be true that's data related, because usually it's not going to happen
and it might be able to, but probably a lot of other things
need to happen first. And you shouldn't spend a ton of your time
working on something like this until it's clear that it's actually connected
to what your company is trying to do and actually going to be

(29:46):
valuable work. I find myself really drawn to those dreaming SLOTHs. I actually
think I've reported to several over my years because a lot of them
weren't analytics professionals. And yeah, I've found myself, like, mesmerized and
gotten so distracted by some of the ideas that they had and just
would eat it up, and I was like, oh, yeah. Like,
that was why I was working Saturdays in the office, because

(30:10):
it wasn't tied to the actual work that we were doing there.
But yeah, that was a really good description. I definitely thought of a
lot of people within that one. And well intended. Right? Like,
that one's not like the distrustful SLOTH. Like, they're just big idea people
who can get really excited and want you to be excited about,
you know, the role they think data can play. Yeah. So that was
a good one. It's really cool to kind of try to picture yourself

(30:34):
too. So because I looked at that and I was like,
okay, so I waste most of my time with the uncomfortable SLOTH because
I just believe I can help them somehow.
And I'm most likely to be the dreaming SLOTH because that's my personality
is just sort of like, oh, there must be something we can think
up that's cool here. And I have to bring it back in.

(30:55):
So it was cool to try to think about, well, you know,
which one of these are you and which one are you likely to
lose cycles on? So that was mine. It's like a personality type for analytics
people. What SLOTH are you? Oh my God. What SLOTH do you waste
most of your time with? The uncomfortable SLOTH, we had an acronym for

(31:15):
someone at a previous company that was basically,
"Could we also see that cut by...?" which I feel like is like the prototype
for the uncomfortable SLOTH? Like, that face. Yeah. No, I do think there
are certain, like, phrases or analysis techniques that become memes within
companies, where maybe it'll be like, oh, like, can we see this by

(31:35):
new and existing users or whatever? Like, it'll be something that was really
important once, and then it becomes important for everyone. Right. It's
like, why are we? Why? And then the distrustful SLOTH will literally rip
data teams into multiple parts. And that's why you have, like,
little spin off data teams in different departments. And they'll be like,
well, I can't get what I want over here, so I'm going to
hire my own analyst and build a little Tableau instance or something so

(31:59):
I can get the data I want. And yeah, it's brutal.
It could be terrible for the data for that stuff to be unchecked.
Well, at least we have it all figured out. Yeah, well,
like I said, it's a show, it's not necessarily gonna be all answers.
Going back to the big picture, we will have job security.

(32:20):
Yeah, that's right. These skills are very intimidating. Let's see you handle
that, AI. Right. We got a little Skunkworks project over here.
We'd love your help. Like, we're joking a little bit by saying AI
can't do that but I think some of the value of the data
professional is steering people away from things that aren't a good idea
as well. Like having some big fight about who actually drove this incremental

(32:43):
new user population. Usually it doesn't really matter that much because
the company is doing better. And if a data team refuses to incentivize
that fight, it is valuable. You're keeping people focused on the right things,
and that should be said and appreciated. Keeping people focused on

(33:04):
the right things is a really important way for data teams to drive
the conversation and insist upon being a part of value.
If you are getting dragged into things that are a waste of time,
it's probably a waste of other people's time too. So how do we
land on the right level of depth that we want to be able
to expect from those business partners? So I think we started to touch

(33:25):
upon it a little bit with the data literacy. So if it's not
training on those specific hard skills necessarily, like, if that's not
what's most important to us, what would be the data literacy renamed and
rebranded 2024 2.0 to help drive more fruitful discussions then? Yeah,

(33:48):
I don't know if I have a name for it, but I think
the most important thing is to clarify expectations about where data is
useful and have a clear purpose for introducing data into situations.
And by this, I mean like a lowercase data, not like data as
in a data team, although those are related. Like if you want to

(34:08):
use data to help you discover new opportunities, for example, like,
how does that actually happen? Like what physically do we want to happen
in that case? Is it that we want someone to look at things
that are related to products that we're already building or audiences that
we're already marketing to? Is it that we want someone to go do
some crazy experimental spike and figure it out? Like that kind of thing

(34:32):
is helpful to specify what you actually want. And it also comes up
a lot in things like impact measurement, where for data people,
I feel like we always want to be really accurate and precise and
get into the cool methods that help us quantify impact of something.
But before you do that, it's probably worth actually taking time to think

(34:53):
about the requirements of the situation. Where it might be, do we need
a really quick go/no-go answer? We don't need a really precise
answer on something like that sometimes. Sometimes we can just launch something,
see that it doesn't tank in the first couple of days, and put whatever guardrails you
want, whatever level of rigor you want on that, but don't automatically
default to a ton of rigor. I guess what I'm describing is you
have a situation like a problem that you're trying to solve,

(35:16):
and then you go and you apply data to that, which may be
how you actually use techniques. It may be the specific data sources you
want to integrate. It may just be talking about the process and the
expectations overall, but you should have a purpose and it should be something
that's actually going to help you operate better or drive an outcome for
the company. And that's where you should start. I don't think it should

(35:38):
start with, like, can you read a bar chart? That's important,
but I don't think the thing that actually makes a data team
valuable is that we know bar charts. It's helping people to focus on
the right things. It sounds like the new training that would be best
for our stakeholders, if we're the data team and we're giving a training

(35:59):
to our stakeholders, it's almost feeling like, or saying, we want to train
you to know, in general what we can do for you and how
to identify where to bring us in. That's kind of how it feels
like it's going, is we're saying, do we need them to be able
to do the data skills on their own? Do we want to train
them in those, or do we want to train them in slightly better

(36:20):
partnering with us? That's kind of how it sounded. Yeah. I mean,
my general inclination, I'm like, not neutral in this, of course.
My general inclination is better partnering with us rather than people having
to go and learn all these very specialized skills, which tools make it easier
to approximate doing them but not to do them well. But even more
so than training people about when to ask for these things,

(36:43):
I think we should spend more time learning what other people are doing
and inserting ourselves in the conversations of like, "Hey, it seems like
you're trying to do this. Here is a way that we can use
the data that we have for this." Or, "Here's the way that we
can use the tools that we have for this." It's being problem forward
rather than giving them a menu to pick from. Oh, the menu.
The menu is the bane of my existence.

(37:04):
I hate that. I'm curious how, because this... I mean, that makes so
much sense in the problem forward framing. I wrote that down. Loved that.
How do you coach your team then, to work like this inside of
your organization? Because it feels like a pretty, unfortunately, novel approach.
Because we're kind of all nodding our heads like, "Yeah, it makes so
much sense. Like, ingrain yourself and be more problem forward." But what

(37:25):
are some of the guidance that you give or ways that you instruct
your team to be more effective in that way? Yeah, I mean,
one place where it starts for us is in how we even talk
about our goals and the way that we operate in the company.
We always talk about an outcome. It's not like "improved data quality."
It's "reduce overhead of queuing the eventing in the product or whatever,"

(37:46):
so that engineers have more time to go and build XYZ thing.
It's talking to people about like, "Here is the value of the thing
that we are doing for you." Because if I go to people and
I'm like, "You need to register your schemas." They're going to be like,
"Why? That feels like it's slowing us down for no reason."
But if we explain, like a technical project like that, in the context
of all the benefits that they get and all the things that they

(38:07):
are not going to have to think about, they understand it a lot
better. So, one big piece of this is coaching the team to speak
about what they're doing in a way that is really in the language
of the people that they're doing it for. Otherwise, the people may not
understand it. And another piece of it is, just as we're working through
things, talking about like, "Okay, is this actually going to move them towards

(38:28):
their goals?" It's a lot of refusing to do work that maybe feels
low value. People will always try to get you to pull data for
them or to do an analysis that maybe proves that what they're working
on is actually valuable, even though it wasn't in the AB test that
they ran. And what you actually need to do in that situation is

(38:50):
be like, "Hey, we are doing these other things and they are contributing
to whatever metric, whatever company objective, and we will stop doing that
if you want us to do this other thing, which we don't think
is a good idea. So, you can show us that that is higher
priority, but until we have a reason to believe that it is higher
priority, we're not going to deprioritize what we're already doing." Also,

(39:13):
spending a lot of time building context for the business, that's another
really important thing too. And even people who are maybe further back in
the kind of traditional data pipeline model, like data engineers and analytics
engineers, I expect them to know things about the business and I talk
to them about the performance of the company and why things are in
the place that they are, and encourage them to think about what they

(39:34):
are doing and contributing to that so that they have the context to
be able to make calls like that about like "This task that I've
been asked to do is probably not super valuable." So, if there's something
that is more obvious in its ROI, more plausible in its ROI,
I'm going to do that. That's great. Help them build that intuition.
How much does it matter, the broader organization and its posture towards

(39:57):
that, in terms of the data team and the data professional?
So, in other words, you know, you're describing that for the company that
you're working in, and I think probably you're spearheading a lot of that.
But how, when you joined that organization, I'm sorry, I don't want to
pick on your company. I'm just saying, like, generally speaking, you know,
from your experience, where has that worked well and what markers of the

(40:19):
organization are good identifiers for this? Okay. This is going to be a
good partnership. Yeah, I mean, that's a great question. And I think it
comes down a lot to, how willing are they to have a conversation
with you? Yeah. People being receptive to feedback, understanding that you're
not just a vending machine, and really demonstrating to them you care about

(40:41):
the outcomes. And when you disagree with them, that it's about you working
towards the same goals as them, that makes them feel like you are
more on their team. And mistakes I've made in the past have usually
been about not convincing people that I was on their team,
and asking them to do things. And they're like, "Go away,
you're annoying. Why are you asking me for this?" But if I'm committing

(41:01):
to hitting the same goals as a team, or people on my team
are committing to hitting the same goals, that changes the conversation
a lot. It makes it a lot more of a partnership where we're
both trying to accomplish the same thing and may disagree and we should
have a discussion about it. And sometimes we will disagree and commit,
but we also expect other people to disagree and commit sometimes.

(41:23):
That's good. When you say that too, it makes me think of being
in consulting. It really is a big deal, that first impression you have
with a client on a project, because to your point, if you aren't
able to early on establish, I think, that you're there to be a
partner with them, to like help them achieve these outcomes, you very quickly
fall into what we call an order taker, and

(41:45):
it's really hard to dig yourself out of that. And then if you
try to push back, even in a respectful way, right? Of like,
"Well, if we prioritize this, then we can't prioritize this other thing,"
they usually don't take too kindly to it, and you usually don't have
the ammo, to your point, to say, "We suggest prioritizing this other thing
because it ladders up to what actually matters to your leadership and the
bigger goals of the business." You don't get that. You kind of get

(42:08):
siphoned off into, "We're going to send you guys requests. It's going to
come in as tickets, or we're going to have our weekly meetings.
We're going to toss some things over the fence, you're going to toss
some things back and you might get to peek through every once in
a while." So, it's such a good point that you bring up: that
beginning, initial relationship is important. Yeah. Another thing
I value a lot is people on my team having standing relationships with

(42:31):
different cross functional stakeholders because it does make it more of
an ongoing partnership where it's not just like the only point of contact
that they have is that a Jira ticket shows up at one point. They
attend their meetings, they weigh in, they participate in their retros and
their goal setting processes. I think that's really important for data teams

(42:53):
if you can manage it from a bandwidth perspective, to be involved in
a lot of other team rituals because you are literally a part of
that team in that case. That's exactly the question I was going to
ask. I would love to get specific about the ways that you
demonstrate that you care about the outcomes. That phrase really stuck with
me when you said that, Katie. So, participating in those team ceremonies

(43:14):
and really being ingrained, even if you're a centralized team, even if you're
not distributed or you don't have a dotted line, it's really putting yourself
out there. I think one of the things that I've always felt,
but I'm feeling even more strongly about, is that the responsibility in
this relationship is really on the data professional. And maybe some people

(43:35):
will think that that's unfair, like why does it have to be my
job to push into these conversations? But there's so much benefit like that,
I just don't, I'm not seeing any downside to this, to show and
to demonstrate how focused you are on their outcomes and to really be
ingrained. And that's only going to help you be more invested and excited
about your work because every second you're spending isn't being an order

(43:57):
taker or a vending machine, which I love that one too.
It's doing really meaningful work. It probably is more inspiring to spend
your time doing things like that versus adding another filter to a dashboard
and throwing it over the wall. So, yeah. So, this is a call
to analysts out there. It's on your shoulders whether you like it or
not. I mean, the one thing I'll say on that is that I

(44:20):
don't think it's entirely on their shoulders to drive everything, like it
does take two people to have a partnership, but you can at least
start it. Like, the best way to make a friend is to act
like a friend. So, showing people what you want to be treated like
is a good way to actually start making that happen. This is great.
All right. We do have to start to wrap up, though.

(44:42):
So good. Katie, thank you so much for coming back on the show.
We really appreciate it. All right. One thing we love to do is
go around the horn, share something we think is interesting. We call it
a last call. Katie, you're our guest. Do you have a last call
you'd like to share? Yeah. One thing that I mentioned earlier,
a good thing for people to think about is what are quantitative things

(45:02):
related to data and how can you get more involved? A book I
really recommend reading related to this is called The Strategy and Tactics
of Pricing. Pricing is something that can be very quantitative, but it's
very cross functional in a way that probably is familiar to a lot
of data professionals. That book is kind of a theoretical overview of pricing,
as well as a lot of specific things that you can do and

(45:24):
analyses that you can do to figure out how much a customer would
want to pay for something. It's really interesting even if you're not actually
working on pricing, just as a way of learning more about how businesses
work. That sounds interesting. Awesome. I know I wrote that down.
I was like, right up my alley. All right, Val, what about you?
What's your last call? So my last call today is actually an episode

(45:46):
of Lenny's Podcast, which is a very product-focused podcast. You can find
it anywhere you find podcasts. I love watching the videos, though,
because he has such amazing guests and they get into such fun,
animated discussions. But one that I listened to or watched recently was
when Claire Vo was on. I have been a huge Claire Vo fangirl

(46:06):
ever since I saw her speak on stage when she was CEO of
Experiment Engine. But this one was all about, like, bending the universe
in your favor. And she talked a lot about her career and the
choices that she's made. But there was one quote that really stuck with
me because she's been, she currently is and has been the CPO of
many organizations. And she said, "People often think that I get hired into

(46:26):
companies because I'm supposed to teach them how to operate like a big
company. And in fact, I'm hired to remind them that they can operate
like a startup." And I just was like, so genius because she's worked
at a lot of these larger organizations like Optimizely and LaunchDarkly.
And so I thought that was really interesting. And like, she went into
the details of the values of thinking small and the ways to organize

(46:47):
this team. So anyways, I found it inspiring and interesting and fun as
always like the Lenny's Podcast, so definite good listen. Nice. All right,
Julie, what about you? All right, mine is an article that was about
a fun new use of AI. Maybe not fun and new, but at least to
me, I didn't know they were using AI in this way.
And so I found it interesting. It is an article called The Sperm

(47:09):
Whale Phonetic Alphabet Revealed by AI. It's an article on BBC and it's
kind of crazy. So they've been studying these whales for years and years
and years, and they have all these recordings of them. And I guess
humans were able to identify 21 patterns. They call them codas.
And then when they started using AI, AI was actually able to identify

(47:31):
156 distinct patterns of these clicks. I mean, they're literally just clicking.
I was listening to a recording of it. And so they're using AI,
though, to try to discover their actual language. And they've been shocked
to see that they may actually have much more sophisticated language than
we ever thought, which was just really cool. Oh my God.
Douglas Adams was right. Makes the book Moby Dick really different. Yeah.

(47:57):
I want the follow-up piece to be like, "So then we took
that and are able to talk back to them using
AI to write the sentences." That's what I want the piece to be. So,
so crazy. We'll see. I'll keep an eye out for it. That's amazing.
What about you, Michael? Well, I'm going to recommend a book too.
It's something actually, I think I've had as a last call before,

(48:18):
but it's been probably five or six years, so I feel like you
could do a refresher. And it's because I recommended this to
someone recently, and then they texted me this week and said how much
they were getting out of it. It's a book called The Effective Executive
by Peter Drucker, which is a really old book. I think it was
written in 1967. So if you read it, just throw out the parts

(48:39):
that don't make any sense in our world anymore. But there are really
great things in it, and it's actually sort of germane to what we've been talking
about in terms of thinking about how do you as an analytics professional,
kind of elevate and think about the business as a whole?
It's a very short book, but really helpful in aligning thinking.
So that'll be my last call today. All right. As you've been listening,

(49:02):
you're probably like, hey, how do you find out about this stuff?
Well, first off, we would love for you to go subscribe to Katie's
Substack, because it's where we find all this information, and then we...
So we're going to include that in the show notes. So we'll include
the links to that. So please subscribe to that because then you can
be in the loop and one of the cool kids like us,

(49:23):
and we'd love to hear from you. And we're always on the Measure
chat Slack group or our LinkedIn page, or via email contact@analyticshour.io.
So feel free to reach out. And of course, we want to give
a huge shout out to Josh Crowhurst, our producer, who is behind the
scenes making all this possible. And also maybe just a little shout out

(49:45):
to Tim Wilson for being a little bit behind the scenes,
helping produce this show this time as well. So thank you to both
of you. And I think that's it. Katie, thank you. Yeah,
you're in the two-timers club, which I think, as of this moment,
and I can't look into the future, is the most anyone's ever been
on the show. So the next time you're on, you're gonna hit that

(50:07):
third-timers club. Yeah, that's right. Three punches, free ice cream.
Your loyalty card is in the mail. Anyway, but we are very thankful.
Thank you so much for taking the time. And, I mean,
we just always get a lot out of it. And I always am
telling people, Katie Bauer is the one who thinks about this stuff better

(50:28):
than anyone else right now. You've got to read her stuff.
So it's cool for me. Thank you. That's a very, very nice thing
to say. Well, it's coming from me. So let's, you know,
it's not what you think it is. No,
no. I mean, it's heartfelt in that, at least in that regard,
I just, I really enjoy the way that you approach the problems in

(50:50):
analytics and our industry. So thank you. And I know as you're sitting
there trying to figure out how to partner with your org and be
problem forward, remember, I know I speak for both of my co-hosts,
Val and Julie, when I say keep analyzing.
Thanks for listening. Let's keep the conversation going with your comments,

(51:12):
suggestions, and questions on Twitter at @analyticshour, on the web at analyticshour.io,
our LinkedIn group, and the Measure chat Slack group. Music for the podcast
by Josh Crowhurst. So, smart guys wanted to fit in, so they made
up a term called analytics. Analytics don't work.

(51:33):
Do the analytics say go for it, no matter who's going for it?
So if you and I were on the field, the analytics say go
for it. It's the stupidest, laziest, lamest thing I've ever heard for reasoning
in competition. And I'm Ken Riverside. No, I'm just kidding.
I'm Michael Helbling. Sorry. That's an alter ego we're working on.

(51:57):
We're just shopping it. Fourth Floor Productions. Fourth Floor Productions is
really big in the Chicago podcasting business, but I'm actually Michael
Helbling. Rock flag and data work is definitely a job.