Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Hi, Sandy Estrada here. Welcome to this week's episode of How I Met Your Data.
This week, we have a long one. It's a bit of a double feature.
Anjali and I are going to talk about a couple of conferences that we've attended
over the last few weeks. And we also have a special guest, Rachel Workman.
She is the head of data at SoundCommerce.
And I am so excited to share Rachel with you. I've known Rachel for seven years.
(00:22):
She is lightning in a bottle.
She just has a very unique perspective. She has worked as an engineer.
She's worked on the business side. She's worked as a management consultant,
and now she's back on the engineering
side as head of data for SoundCommerce. So let's get into it.
Music.
(00:56):
How was your week? I was on vacation last week. Prior to that,
I went to a couple of conferences, one of which you attended as well.
A little bit of travel and meetings, etc.
It's been a whirlwind couple of weeks. Yeah, I mean, it feels like a bit of a whirlwind as well.
You were out last week, as you mentioned. I'm out next week.
So just kind of that run up to being out for the week has been kind of mind-boggling
(01:22):
a little bit, a little bit of mental gymnastics.
But I'd rather say that as opposed to telling you I had nothing to try to close out before I left.
Well, you can time go by before I go on vacation. Yeah.
But my seat's super warm.
Spoiler alert, when you get back, 10 times worse than the prep to leave.
(01:45):
I'll tell you that much. It's been hoey for me. Yeah, it's the luxury tax for going away.
You prep so much before you leave the office.
And then when you get back, it's like you pay the penalty for all the stuff
that didn't get done, plus all the stuff that went awry, plus all the stuff
that needs to get done in the future. But it's okay.
It's worth connection and time with my family, going somewhere new,
(02:09):
all of that. I'll take it. Yeah, same here. Same here. Absolutely.
Let's talk about the conferences we both went to. So I went to FIMA Boston.
It's the financial services data management conference. It's been running for a while now.
I really enjoyed it. I actually met the conference producers leading up to the
conference because I was going to emcee one of the tracks and run a couple of panels.
(02:36):
So I moderated a couple of panels during the event. And what really got my attention
was the agenda and the way they set it up.
There's a lot of roundtable discussions, a lot of opportunity for folks to engage
and interact and learn from one another and network.
It was probably one of the first conferences I've been to in a very long time
where I really was able to meet folks from financial services that were really
(03:00):
in the trenches trying to solve problems with data.
One interesting takeaway from that event came from one of the sessions I went to,
this boardroom session, which is more of a workshop-type session on Gen AI.
I was astonished. The speaker, the person moderating the session,
asked the room, who's working actively on Gen AI? No one raised their hands. Interesting.
(03:21):
Yeah. Yeah, exactly. I was in shock. I was like, wait a minute,
wait a minute, wait a minute.
The moderator did a great job of outlining all the different areas we've seen
in terms of use cases for Gen AI.
Content summarization, being able to search content easily, being able to aggregate
trends and data, being able to code migrations, those kinds of things,
(03:42):
code facilitation use cases.
There were some others, but no one in the room.
Across all those use cases, there wasn't a single hand in that room.
And there must have been 50 people in there. That's really shocking.
You know, we talked a little bit a couple of weeks back about these shadow Gen AI
organizations that are spinning up kind of under the covers at clients that we work with.
(04:04):
So it's surprising to hear that. Were there any key themes or drivers around why?
What I walked away with was that data teams are not involved because the folks
in the room were part of governance, data management, right?
They're talking about it. They're trying to prepare for it, but it's potentially
(04:25):
happening in other pockets within the organization.
And it's funny because I saw a LinkedIn post this morning about who should own
AI, and your buddy Malcolm Walker responded, saying that no one should own AI.
AI is going to be in the work stream.
It's going to be in, you know, in the applications you purchase.
It's going to be built into your work stream. One person shouldn't own it,
(04:47):
but people should own data, right?
Like I thought that was interesting and I agree with that perspective.
So I walked away with all of that, and it kind of informed my view today:
I have a feeling the folks in the room are just not involved in the solutioning for Gen AI.
It's a different group. Totally makes sense to me as I think about it.
But yeah, it was a great event. Lots of talk around governance.
(05:08):
Yeah, I was really busy between the panels. Data democratization, that was the topic.
So the panelists on that were incredible. And we ended up recording with FIMA
two separate podcasts on two separate topics with that panel.
So that was a lot of fun as well, but it was a busy two days.
Sounds like it. Sounds like it. And then you immediately hopped a plane,
(05:28):
met me in New York with the next conference.
Right. So we met at Data Universe, two-day conference there.
It was their first time running the event in the United States.
The same production company has a sister event in London.
Big Data London, I believe. Correct. So they run Big Data London and Data Universe
is their sister event here in the United States.
This was the first time. It was an interesting space, wasn't it? Yeah, yeah.
(05:52):
I mean, I've been to the Jacob Javits Convention Center before for the big auto show.
So to go there for something that's a little bit more professionally driven was interesting.
I think what was unique compared to some of the other conferences that I've
been to over the last year, two years, was the fact that there were so many,
(06:13):
like all the stages were in one place.
So it wasn't separated by different rooms or different floors for presentations,
but you were given headphones at each of the stages.
I really enjoyed that. I would love to hear your thoughts on that multi-stage approach.
(06:33):
But I thought it allowed for a
little bit more fluidity between different
talks, as well as more networking.
I ran into a number of people that I've seen
before, spoken to before, and just felt that
that open concept allowed for a
lot more connection with people that
we knew as well as new friends. Yeah, I agree
(06:55):
with the pros there. I absolutely agree there was
way more networking than I think other layouts allow
for, because you spend so much time just walking from
floor to floor, area to area. Having it
laid out the way they did, in that open forum, was really nice, because
everyone was going back to the center, where all the couches
and tables were, to have a conversation, if not talking to the
(07:17):
booths. I've seen more people at booths there than I did at other places, because
it was right there in your face; you couldn't get away from it. I really enjoyed
that. I think the downside, having been on one of the panels
as a panelist, the downside that I felt was the headphones.
It did not really allow for interactivity in terms of the conversation you're trying to have.
(07:38):
I think the only feedback I would give them is provide audience microphones
for the Q&A at the end versus having the iPad question seem so anonymous.
And I think the format in terms of the short hit time frame for the presentations
could be a little challenging as well, because I think that the presentations
ran about 20 minutes, which was really quick.
(07:59):
Felt like we were barely scratching the surface,
whereas a little bit more time probably would have allowed for more depth in
some of those conversations.
Yeah, no, I totally agree. But there were a lot of great speakers there.
I met the head of AI at Airbnb and heard her story. And that was like super impressive.
It was funny because I met her on the couch before the event.
(08:22):
I got there super early that day. It was the second day of the event.
I got there very early, grabbed some coffee.
There was maybe me and one other person there. And then all of a sudden,
I'm looking around, it was all women. All the women showed up early. It was fascinating.
So we sat on the couch and chit-chatted. I met a number of very interesting people there.
So really, really cool layout for networking for sure. For sure.
(08:43):
Looking forward to see how that evolves next year. Yeah, for sure.
I mean, they had a very kind of wide-ranging set of agenda topics as well.
But kind of as a counterpoint to your FEMA experience, there were a lot of sessions
on AI and Gen AI at this particular conference. I think I know why.
(09:06):
I hate to say this because I appreciate it. I get a lot out of it.
But I think some of these conference coordinators need to keep an eye out.
There are a lot of data influencers out there these days. I mean,
when I see data influencers creating companies to literally become data influencers,
I get concerned because now you're just talking to talk.
(09:27):
That's concerning to me in terms of a marketplace where technology is moving
really fast to see that happening.
Because the reality is, the messages that are being put out there,
like Gen AI is going to solve all the world's problems, are tough.
It's tough to hear that, because I know on the ground that is not happening
and there's still very large problems to solve.
(09:48):
Gen AI is not going to be the magic bullet for us. So yeah,
I think that was kind of the disappointing aspect there.
But I'm hoping that conference companies, organizations do what FIMA did.
Like literally, if you're not part of an organization, a company and have a
case study, you don't get to be in the room.
I want organizations doing that more and more so that more of these real life
(10:11):
use cases actually get on the table.
Yeah, exactly. Make it real. Make it something that we can then look at and
say, how do we embed this in our organization?
How do we replicate that level of success? Similar fashion. Right.
And then they wonder why clients aren't there. Well, clients aren't learning from other clients.
(10:32):
You want the local, you know, CPG company or the manufacturing company,
they're not going to show up to an event where they don't see themselves on stage.
You want to see other organizations solving the problems that you're trying
to solve and you want to learn from them.
And if you're just hearing from, you know, talking heads or product vendors
or even I'll even say consulting firms, that's a challenge. Yeah,
(10:53):
I'm gonna eat my words on that later.
But for now, that's where that's where I'm staying.
You're cringing over there. I love it. You're like, Oh, no, no,
because my mind's now racing going, Okay, so who are we bringing with us to
really talk about a meaty success story?
So we aren't that consultancy that's just, yeah, talking about what we could
(11:15):
do, but really anchoring on what we have done, and what benefit it's brought
to an organization. Yeah.
And, you know, I say that kind of tongue in cheek because obviously I was on
a panel. That entire panel was consultants.
We kind of turned the conversation on its side where the moderator acted like
a client and we're trying to convince this client what this new terminology
(11:37):
was of data intelligence.
So it was a great conversation. I definitely got a lot of feedback from it.
It seemed like, you know, with newer topics where there aren't too many
case studies, where there aren't too many folks who actually have done it,
you want a different perspective. Yeah, that makes sense.
But to do that the whole time, for the entire conference, that's a challenge. Yeah, I agree.
(11:59):
That's why I called out the Airbnb one because it was fascinating.
It was a real use case. It was about blockchain and Gen AI, a topic I had not even thought of.
So that was super fascinating to me to hear that case study and to understand the application.
Awesome. So, we're a couple of minutes away from Rachel Workman.
(12:20):
Hey, Rachel. Hey, that was fun.
That works. I was like, oh my goodness, I'm going to fail on the very first step, the technology.
No, no. That was my sending you the wrong link trick. It works every time.
You're like, we'll start with panic and then it'll be all uphill from there.
(12:40):
Exactly. A little bit of panic helps.
You get your heart pumping. You've got to be ready to go and excited to be here.
Hopefully that did it for you.
Sure. How are you today, Anjali? Pretty good. Pretty good. We're leaving for vacation tomorrow.
So I'm starting to get into vacation mindset.
So you get to that edge where you're like, I'm going to be free soon.
(13:05):
You're never really free.
See, I'm not gonna have to worry about my existentialness at all.
You guys share that trait.
Well, Sandy and I were actually talking about the luxury tax that you pay when
you get back from vacation, right?
It's almost like you're already tired from this amazing vacation that you had,
(13:27):
but now all of a sudden there's this pile that you just have to start working through.
And it's just almost like you're paying a penalty for having gone away for a little bit. You are.
Like, I agree with you completely. It's not almost; you are. Because the
work doesn't go away, and, you know, people will cover for you on the really,
really critical stuff, but they don't have time.
(13:49):
Nobody has time to pick up your work, so
all you've done is condense your work into a smaller time period. Right.
I'm still a fan of vacation. Right, but the work didn't go anywhere. That's
hilarious. I'm definitely a fan of vacation; I take way too much. So, Rachel, I actually
(14:09):
did a little digging to figure out when we met. I was wondering that. Yeah,
it was February 2017, in case you were wondering.
So it's been a while. Seven years? Yeah, seven years now. And we engaged for about two months.
I think I would have guessed longer, but I can't believe we only engaged for like two months.
Two months. It felt so much longer.
(14:34):
I don't know if that's because of me or because of the product or because of
the situation at the organization, which shall we name?
Nameless. Yeah, it's really all like you were the bright spot in that project.
I felt like you were one of the lone voices of sanity.
You know what it is when you're in a project and the words come out of your
(14:54):
mouth and you look around and you're like, no, nobody has an idea what I'm talking about.
Everybody says things and it all sounds like it's like Peanuts characters.
It's like you're like, nobody's making any sense. And then somebody makes sense.
And you're like, this person makes sense. And it leaves
this imprint in your brain. So I felt very much like that was the experience.
Well, I appreciate that. I appreciate you. And quite frankly,
I was surprised too. I was like, oh, it's two months. You left an impression on me, obviously.
(15:18):
I think you and I have been reaching out to one another over the years.
So it's mutual was my point.
But at the time, you were the head of operations, the head of customer success.
It seems like you had multiple roles there. Yeah, 2017.
I think I was still over North American customer success and services.
So I've got to know you for the past seven years. Maybe we can spend a little
(15:40):
time educating Anjali and our listeners in terms of your past and who you are. Sure.
So I'd love to. Rachel Workman. I currently am the VP of data at a Series
A data platform startup company, SoundCommerce.
I'm immersed in this field every single day. I'm doing cool things.
My company built a tool that basically makes building pipelines
(16:04):
more accessible and flexible.
So we're playing in the whole low-code, no-code space:
more flexibility in how you shape data, pulling semantic modeling in-stream
as opposed to on data at rest in the data warehouse,
landing that data in maybe a more usable format. It comes down to time to value and
cost avoidance on some of the data processing.
(16:24):
So super cool things to be working on in the data space.
It is my obsession, my passion, and my life in that space.
But as you know, because you met me during that time, my path here was anything
but conventional. If I look back, I started out, I started out on the right path.
Well, how biased am I? One of my first jobs in grad school was as a database programmer
(16:46):
for a high-volume shipping system that, you know, built databases through SQL Server.
I think that might be where my love of this might have been born.
You want the whole dinosaur story? We also built in VB.NET. Yeah, that
places me in time. But it wasn't too long after that that I moved into,
I wanted to see the business side, understand really more about why we were
(17:06):
building the things we were building.
And I moved into management consulting and then it's services leadership.
And I spent the preponderance of my career on the business side,
solving things from a business standpoint and running P&Ls and all of those things.
It was always tertiary to data and analytics as in every company I worked for,
I always worked for software companies, had data products or analytics product.
(17:28):
And so it was never too far, but definitely not on the technical side.
And then it really wasn't until age started to take over, and you start asking yourself
questions like, am I really doing the things I love, that I gravitated back towards data.
It was right around when you and I met that I was getting my data science master's.
Big old leap after that, which was, well, let's go see who's going to buy into
(17:49):
the fact that I can run on that side fully.
And luckily, a great startup, Outlier AI, did and let me spend a couple years
mired in the modern technologies and really hands-on stuff and my time preparing
data from all different kinds of industries and companies for time series modeling.
(18:10):
And I haven't really looked back since. That's my origin story.
Such an interesting story. And I definitely want to dive into kind of your focus
today and talking about data.
But before we get into that, I'm just kind of curious, like what keeps you busy
outside of the exciting world of data?
(18:30):
Yes. So I am the mom to an 11-year-old boy.
So that is the primary thing that keeps me busy.
You know, as most parents know, your life revolves around things like common
core math and mastering YouTube videos of common core math so that you can help
with math worksheets and,
(18:51):
you know, practicing spelling words of words that you thought you knew how to spell,
but you look at them and go, maybe I don't know how to spell.
How many M's are in commemorate? Right. So a lot of my time is taken up with those things.
But when I am not basically revolving my life around him, I have two dogs, love them.
My one dog's my jogging partner. I was going to say running partner,
(19:15):
but I think my days of running are past. Definitely jogging territory now.
I like to hike and I still love, as I have since probably the day I picked up
my first book, Fantasy and Science Fiction.
So read a lot of that. Well, read and listen to audiobooks because I could do that while I'm jogging.
Combining multiple loves all at once, right? You got to be efficient, right?
(19:39):
Well, being a busy mom, it's the efficiency, right? Right. I'm curious.
I'm really into science fiction, mostly consuming it on media television.
But is there a science fiction book that you're reading right now that you would
recommend or a recent one?
So recommending, I would recommend everything I read. Generally,
I've actually mastered the art of putting a book down that I don't like.
(20:03):
That took me a long time, but I will. If I don't like it, I'll put it down.
So anything I read, I would recommend.
But I definitely have interesting tastes. So right now I'm reading the Crossbreed
series from Danica Dark.
So I tend to gravitate towards strong female characters, usually in some type
(20:24):
of urban fantasy type novels.
So she's definitely somebody I've read a lot, and I think she does a good job.
And I have to mention Shannon Mayer. I won't go on forever because,
I mean, a list of authors would be hundreds long.
But I think she would be a favorite of mine.
And also, again, really strong female characters.
But she does the Forty Proof series, which is,
(20:47):
you know, her heroine is, you know, past 40 and has to learn how to,
like, be a hero and not be 20 and tough anymore. So maybe that's closer to my heart.
That sounds fascinating. We'd have to check out the 40-plus heroine stories.
I'll take some inspiration, I think. But so, Rachel, you talked about kind of
(21:12):
your journey and your career
path and how you went from data to the business and then back to data.
And in that transition, were there any surprises now that you're back on the data side?
Yeah, I think that some of the biggest surprises that I had going from side
to side like that is how much of a gulf there is,
(21:35):
like how big the chasm is, and how, with all the maturing of business strategies
and all of that, we really haven't made tremendous strides in closing that chasm.
Or as much as I think maybe we have the opportunity to.
Having worked on both sides, not only are the languages just,
(21:58):
and I'm not talking about programming languages or anything,
but the language you speak, the vernacular you speak, the concepts that you
ideate on and really rally around are just very different.
But the biases are super strong.
You see it surface in memes and stuff on LinkedIn,
and they all make us laugh.
(22:19):
Right? So we know there's truth in them. Like where somebody's like,
hey, can you throw that data to
me real quick, and, you know, stuff like that, and you're like, oh yeah,
that's a week-long task that you just
think is going to take five minutes. And vice versa, where,
you know, the business side is like, I really don't understand
how it could take you that long to get to this number. There's at least been
acceptance in the last, you know, half decade to decade that there's
(22:43):
no solve to this problem other than we all work together. But I
was just super surprised to see that we're still so far from that. Yeah, I mean,
as you said, there's no simple
solve to this problem besides collaboration and communication and openness.
And I mean, without that, you might as well just pack up and go home.
We spend all this time and, by the way, I'm a fan of some of the things that I'm mentioning,
(23:06):
like I'm a fan of thinking of things from conceptual or framework things like
data mesh and stuff like that, and different ways that you can build out organizations
and collaboration strategies and all this to aid in these things.
But we spend so much time thinking about those things, conceiving them,
articulating them, and breaking them down and stuff like that.
And we kind of skip over the whole like work together part of it.
(23:27):
But I feel like it's really weird because the business world is almost the antithesis of academic.
But at the same time, we take these colossal academic approaches to like,
let's define a whole new way of working as opposed to like, let's sit down and
solve this problem together.
Right. Which should be the new way of working, right? Let's collaborate and
(23:48):
drive that forward.
But I guess, you know, as you think about that chasm, right,
what I've experienced at least is that the wider the
chasm, the more friction between the data teams and the business.
So how are you navigating that friction and that chasm in your daily life?
(24:08):
Yeah, that's not a simple question at all, because that, you know,
the chasm is real, and the vocabularies are not the same.
And even the slant on the same vocabularies aren't the same.
So, and I'll give you an example. Data trust is super important to data teams
and data trust is super important to the business, right?
(24:31):
But if you're on the data team side and you're talking about data trust,
you're going to talk about things like fidelity to source or observability or
visible lineage or complete lineage or complete and transferable metadata.
And you're going to think about these things and you're going to think about
the pieces and parts that give you trust that the data is accurate to your specifications. But
(24:55):
You may be okay with things like 1% variances in certain things,
right? And stuff like that.
And you're going to maybe think of things like mathematically or statistically,
but you're going to be okay with parameters like that.
If you immerse yourself fully on the business side, you walk into a conversation
that has similar things.
You know, people are going to want things like clear definition of metrics,
(25:15):
right? You know, how did you build this? What's the math?
You know, show me the math, show me the path. Like, where did it come from?
Show me, you know, how do you create that?
And what's the math behind it? But I've seen whole conversations just completely
implode when somebody says, oh, it's within the tolerance, the error tolerance.
And the business side is like, what?
Like, what's missing? Why is it missing?
(25:37):
Where is it going? And how do I know that it's not something really important
that's missing? Now everything's suspect, right?
And it's that difference in attitude that seems to cause a lot of the challenges.
In navigating that, I have found some success, or at least learned
some things, about trying to
avoid different trigger words that come off as kind of flippant like that.
It's important if there's a rattle in your engine and you're driving your car
(26:01):
down the road and you have no idea how engines work, you're going to be pretty scared.
But if you're a mechanic, you know better. Like, I'm driving along
with my stepdad, and he's like, that's the heat shield. Don't worry about that.
You're fine. And I'm like, yeah, really? Am I? Is it going to blow up?
Because I don't feel fine. That's a weird noise. And those types of really
relatable things are what's happening in this type of dynamic between these teams.
(26:23):
Again, to the thing that we seem to throw away super easily,
which is the human connection of really trying to understand each other a little bit better.
Yeah, I find that if you throw the jargon out the door and meet somebody where
they're at and try to understand how do you think about this and what are the
things that matter to you as an individual.
(26:43):
It's hard as a group, but I think if that individual is a representative of
the group, their peers, you can kind of try to get an understanding from at
least that individual in terms of how they think, what is it that they're worried about?
What is their tolerance level in business terms?
And try to set that all aside ahead of time so that you kind of know where you're
going to go as you engage with them and as you move forward.
(27:05):
And I find that a lot of people miss that step. They go in and they immediately
engage, assuming that they're on the same page, assuming that they understand
what's important to one another.
And that's a complete misstep for most people who are in this world.
And that's why the chasm just gets bigger and wider.
Because everyone's making assumptions about the other side of it.
Yeah, the spectacular miscommunications that come from those assumptions.
(27:30):
One of the biggest ones that I have seen, and again,
you know my bias from being on the business side so much of my career and now
being completely immersed in the data side,
is the assumption that the business side somehow doesn't understand maybe the
math of the situation as well.
It's the rare individual on the business side that sits down and really likes
(27:55):
to actually work out the math of a gradient regression.
Those are things that maybe only a few people like to do. That doesn't mean that they can't.
Once they sit down and talk to one another, they're surprised that there's people
on the business side that can work out supply chain forecasts in their head
and really understand very complicated concepts at a very detailed level.
(28:19):
And there's a lot more commonality there than you think there is.
It's just coming from different angles. Yeah, absolutely.
It was funny because I actually met someone who once said that companies
don't understand basic economics.
And I'm sitting there going, wait, most people I meet who are on the business
side either have economics background or an engineering background,
(28:43):
like C-level executives I've met who have engineering backgrounds and they're business executives.
And I'm sitting there going, do not dismiss people because of their title.
They know more than people think.
And I also think that reality is most of these concepts can be distilled down to very basic terms.
Yeah. Yeah. And I think part of it is also around language, right? Language matters.
(29:06):
And so one of the things that we've seen happening quite frequently is this movement of resources.
So as one priority comes up, we're moving our people from, you know,
from priority one to priority two, and they're taking the language and learnings
that made them successful initially to this new role,
(29:27):
new priority, new shiny object.
So are you encountering that today?
Yes. I think that I would have answered
that question a little bit differently before Gen AI became a thing.
Now it's something that keeps me awake at night. There's always been a bit
of the whole shiny-new-thing phenomenon.
And I think there's a couple of things that personally in my life have changed
(29:50):
that maybe put me in closer into a group that does that more,
you know, which is the startup culture and Silicon Valley.
There's things you can't ignore. People give money to shiny new things,
you know, more than they give money to boring old things.
And it is what it is. But there's a real cost to that.
Context switching has a cost.
(30:11):
It is a mental tax on the people who are context switching, and it has an efficiency
cost, and it has a tech debt cost
on things that are basically cut short so that you can move on to that new thing.
And all those things need to be talked about and honored within the context
of what we've been talking about, which is that gulf between the business side and the data side,
(30:36):
making sure that we really use our voices and try to articulate the cost of
making those moves and the risk and find the right words.
I see it as like a tug of war. Never give up. Just always keep pulling and don't
lose your ground on those things.
Because there's a cost to allowing that to happen, especially if you hold innate knowledge
(31:02):
of what can possibly go wrong.
And that's why, with Gen AI, I kind of hold a scar that I think every single
one of us who works in data holds; every single day we watched this whole LLM thing unfold
going, what about the data?
And all of us, it was almost like watching a slow-moving accident unfold.
(31:22):
We all knew, but there was so much excitement and so much momentum.
And I feel personally, in my circle, we just never found the words.
As we all have found out in business, "I told you so" never matters.
By the time you get to "I told you so," you've lost the battle, right?
If I look at that from a lessons learned standpoint of how we can learn to work
(31:43):
together better, it's about finding those words to articulate without boring the
people who are excited. You move the resources to the shiny new thing,
and you might have a super cool prototype that is flashy and everybody loves.
And if the objective of that was to get, like, an investment
or something, that's fine.
But if your purpose was to productionalize it, to get a business goal and business
(32:08):
value out of it, then you moved resources from the thing that was building the underpinnings
of it. And now you've made your path longer.
And being able to have those discussions is super important.
Yeah, I was just gonna say, I find that to be the case regardless of whether
it's Gen AI or any other new tech that comes along, or a concept or issue
(32:31):
that an organization is trying to address.
Orgs still have not gotten wise to the fact that you still have to run your operations.
And if you're going to do R&D innovation, that should be
a completely different arm of your business. Even in a data org, it should be separate,
because then you can make those calls and say, all right, we were going to innovate
(32:53):
on X. Here was the investment we made.
If we're going to shelve that, we already have a sunk cost on that investment with that team.
Do we spin up a secondary team to run this secondary investment so that we don't
lose sight of the sunk cost we already have and its, you know, proposed value?
Get in there, iterate real fast.
You do have to R&D it. There's a concept that I'm sort of obsessed with on the
(33:15):
side. It's called metric trees.
One of the things that I find very useful in a well-constructed metric tree
is that idea of isolating the
portions of the business and the proportionality of the business impact.
You can isolate the business impact at the right level.
You can't drink two consecutive cups of coffee without somebody asking you about
(33:38):
the value you're driving or articulating the value you're driving or quantifying
the value you're driving.
Finding ways to make yourself more immune to being led off track into shiny things
that might be super cool, but are going to take a lot of time and impact a very
small amount of the business seems like a survival skill these days. Yeah.
So, in order to do that, you've been using this idea of creating kind of
(34:00):
a visual metric tree to really quantify the impact of the ask.
I actually just wrote down metric trees because my brain just started firing
on all cylinders as you said that, because there's always a challenge in terms
of being able to articulate ROI.
I've done some workshops on this. And the first thing I always ask is,
do you really understand what drives the business?
(34:22):
Every ask out there has to help whatever drives the business.
And if it doesn't naturally impact it and you're not able to quantify it, then you have no ROI.
You can understand whether it influences some of that, but even the impact of
the influence can be measured.
I love that idea of metric trees, having that within an organization.
And that's an opportunity for these data catalog companies to actually create
(34:46):
it as part of the catalog.
Data catalogs are actually pretty flat. There's no "this metric impacts that
metric, which impacts that definition," et cetera. Right.
And it is that relationship that's key. And it's why I get fascinated by it
and really excited by it.
That you can create an ROI story about anything.
You've got a whole concept of what are you trying to optimize?
(35:08):
What are you trying to maximize? What are you trying to minimize?
It's all interrelated and impacted by business strategy.
You know, am I trying to, you know, increase my brand awareness?
Am I trying to increase my low- or high-intent conversions?
Like it's all in the context of business.
And if you look at the metrics as, I like the word you used,
(35:29):
flat, you've lost the context, because you can be focusing on minimizing or maximizing
a metric that actually is like three levels deep from a more meaningful metric.
And you're focusing all your effort on, let's say, you know,
minimizing something that's supposed to minimize costs and it's responsible
for, you know, less than 5% of your overall cost.
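The metric tree Rachel describes can be sketched as a simple weighted rollup: each node records what share of its parent metric it drives, so a deep metric's true share of the top-line number falls out automatically. This is a minimal sketch; the metric names and weights are hypothetical, not from the conversation.

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    weight: float = 1.0              # share of the parent metric this node drives
    children: list = field(default_factory=list)

    def add(self, child: "Metric") -> "Metric":
        self.children.append(child)
        return child

    def impact_on_root(self, path_weight: float = 1.0):
        """Yield (metric name, fraction of the root metric it accounts for)."""
        share = path_weight * self.weight
        yield self.name, share
        for child in self.children:
            yield from child.impact_on_root(share)

# Hypothetical tree: fulfillment drives 60% of total cost,
# and packaging drives 8% of fulfillment cost.
cost = Metric("total_cost")
fulfillment = cost.add(Metric("fulfillment_cost", weight=0.60))
fulfillment.add(Metric("packaging_cost", weight=0.08))

impacts = dict(cost.impact_on_root())
# packaging_cost sits two levels below the root but drives
# under 5% of total cost (0.60 * 0.08)
```

Mapping it out like this makes the proportionality explicit: optimizing `packaging_cost` to zero could never move more than about 5% of the top-line number.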
(35:51):
Until you map it out and really start to drive those relationships,
you lose the context. The concept's been around for a long time,
the whole balanced scorecard.
Let's understand how our business actually runs and what are all the drivers
to this part of our business.
And that could lead to not only just for the sake of understanding the value
of the data, the impact and the proportionality, as you said,
(36:12):
of the value of those things, but also how we report on it and how we look at it.
There's so much value in that and nobody does it.
And then you can also almost circle around it because you have different parts
of the organization that all feed into the same number.
So you can kind of start looking at things differently in that respect as well.
These parts of the business work in isolation.
(36:34):
And then somebody at the top who needs data from all different parties is sitting
there going, well, I don't understand why I can't get this number.
Let's go look at the metric tree. So instead of telling you how data comes into a warehouse,
now I'm going back to that metric tree and telling you it's because I have 18
different metrics that impact that number, and I can't get six of them.
Right. Yeah, I can't get six of those. Or you show somebody at that level that
(36:58):
you're talking about, you know, you take a C-level person, they can then do
kind of that mental math and be like, okay,
there's some key areas of concern here in the pipes that are causing me to not
be able to get this number.
How does that help me contextualize my priorities? Exactly.
Because we hear time and again, you know, I'm asked what the value of my data
is, and organizations struggle to articulate that. So I think this is absolutely fascinating.
(37:23):
I'd love to kind of pick at it. How do we, you know, roll that out to our teams
to ensure that they are looking at developing those metrics,
developing those metric trees, and then actually communicating along those lines? Yes.
So I think I've hinted at it. It's kind of where I'd stake my flag,
you know, on almost like back-to-basics type of stuff,
(37:46):
which is: dispense with a lot of the lofty ivory tower stuff and give people
real projects with clear boundaries and clear objectives to work on together.
What I'm really talking about is the fundamental team building and trust building,
the type of synergy that comes out of real teams solving real problems together.
You sit down in a room together with a whiteboard and you spend a couple weeks
(38:09):
really solving a tough problem.
There's some real teamwork and trust and shared experiences that come out of
that. They're going to share information.
They're going to learn to trust one another in the situations where they don't
understand one another.
You need that because you can't take somebody who's entrenched on the data side
(38:29):
and teach them everything about business.
I'm sure, given enough time and their inclination, you could.
You don't have that type of time and money in business and vice versa.
You can't take somebody on the business side and you don't have the time to
make them a fully skilled data practitioner.
You have to learn to trust others when you don't understand them, and those real-world
practicum
(38:49):
activities seem to be really key. And I would tell any data leader out there
to find those opportunities and rally around them.
Yeah. And I would add to that as well, making it okay to take a risk and potentially
fail in that exercise. That's key. That's super key.
But I heard the saying, you don't succeed or fail, you succeed or you learn.
(39:14):
And I forget where it came from. I think it was either some commercial or some
like athlete or something said it, but it really is true.
If you look at it the right way, when you fail, that's it, you learn.
And really engendering that within our teams is really core.
Yeah, I think it was actually a Tiger Woods commercial where, at the end,
his dad says, and what did you learn from that?
(39:37):
So I know we're coming up on time. So just one last question for you, Rachel.
If you can have dinner with anybody from history over time and discuss the future
of data, who would it be and what would you ask them?
This is one of the hardest questions that anybody ever asked.
I think this might be a little weird of an answer because it really,
(39:59):
in my opinion, doesn't have anything to do with data, except for in a tangential way.
But I would pick somebody like Sophie Germain or Nellie Bly.
There are people in history, little known, that I'm usually obsessed with.
Obviously, I'm a woman, and so I'm like, oh, what about the woman's voice,
the lost woman's voice over time?
I would want to sit down with them, and I wouldn't ask them anything about data.
(40:23):
I mean, I would ask them, what do you do when it gets too hard?
Because being in a technology space as it is, is tough.
That's tough. And being in a space that is dominated
by people other than women, that's tough. And this isn't, you know, just me speaking women-centric.
This has to do with a lot of people who are underrepresented in this field.
(40:45):
We've talked about so many tough concepts and challenges about getting people
to work together and challenges about getting ideas across and all those things,
right? They're all tough as it is.
I think I mentioned earlier, I see a lot of things like a tug of war where if
you let up for a second, you've lost ground that you now have to spend capital to make up.
(41:06):
And there are days when you look around, you're like, do I have it in me to
try to find the words that I can't find?
Do I have it in me to try to really dig to understand that other position?
Because I know that's what's getting in the way of this. And you look back and
there's just these amazing people that did amazing things.
(41:26):
Sophie Germain, she had no support, no representation, no role models.
She completely dominated a mathematical space and basically was able to bring to
fruition a math theorem that many had tried and failed at.
Nellie Bly picked one of the most unrepresented groups of the time,
like mental patients, knew they would be treated unfairly. She was a journalist.
(41:48):
She could have lived her life quite happily, I'm sure, reporting on other things.
She went undercover in a mental hospital where she knew people were being treated
abhorrently and used that to drive amazing and lasting change in the world.
Those people could really tell you what was tough.
I'm not minimizing anybody's struggles, but I'm curious, what did they think
(42:09):
about when the days were hard? Or did they just not think about it? Was that their secret?
I want to know that. And it might not be the best dinner conversation.
But I would love to have that conversation.
I think that'd be a fascinating conversation. I love those figures.
I love their stories and what they
overcame to really drive that long-term improvement in people's lives.
(42:34):
And so much of what we see in data isn't the technology, but it's the people,
the people surrounding the data and the people working with the data.
I love that. Data is going to change. It's going to get bigger.
It's going to get different.
It's going to have more privacy rules. It's going to have less privacy rules.
But the people are going to stay, you know, the consistent part,
(42:55):
you know, up until the generative AI LLMs are, you know, embedded in the cyborgs
and those take over, you know, next week. Then we'll all be talking about it. Exactly.
Well, Rachel, thank you so much for coming on to the podcast with us.
We really appreciate it. And thank you both.
I really appreciated the opportunity. And this was by far the best.
(43:18):
Music.