Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Welcome back. Sandy here, and I have some good news and some bad news to share about this episode.
The good news is we have a guest joining us. The bad news is he's a vendor.
But all jokes aside, Anjali and I had a fantastic time chatting with James Anderson.
He's an ex-consultant and current director of cloud data architecture at Snowflake.
James and I go way back, and I've always enjoyed our spirited debates,
(00:22):
which is exactly why Anjali and I decided to invite him to the podcast.
We couldn't think of a livelier guest to kick things off with.
Yes, we start by discussing Snowflake, of course, but then we move into other
topics like data mesh, changing dynamics of dashboard usage and data analytics.
And we have a quick debate on the future of data work. We hope you enjoy listening
(00:43):
to this discussion as much as we enjoyed having it. So let's get started.
Music.
(01:06):
Nice. You know when your recording's on and you have the little red flashing
light back in the studio back in the day? Uh-huh.
This is outrageous. And I like that
we're recording now, this part of this conversation. It makes me happy.
I hope this makes it into the podcast. You said I didn't hit the button earlier. That's what I'm...
(01:26):
All right. Are you mentally prepared? You want to get into this?
Am I mentally prepared? I was on a red-eye flight last night.
I'm never mentally prepared. So let's just... Let's roll.
Red Eye. Where did you come from? San Francisco. I've been flying around a lot. Back and forth to HQ.
So James, I've known you for a while now.
For a long while. I don't even know how long. But how about you? Almost 10 years.
(01:49):
I've been at Cervelo for 11. So that's kind of incredible.
Now that I think about it. I started working at Cervelo in November of 2014.
I didn't realize that for some reason. But why don't you introduce yourself
to Anjali and our listeners while you're at it? Sure.
Yeah. So my name is James Anderson. I used to work with Sandy and the team at
(02:09):
Cervelo for I think it was about two years I was there.
I then moved on to another consulting firm.
You know, I started my career at Cervelo as a Birst developer.
I was a front-end dashboard guy building very beautiful dashboards in Birst, as one does.
And then as I moved through my consulting career, I moved myself down the stack.
And by the time I left the consulting space, I was running the platform architecture
(02:34):
team for the firm I was at, focused on how do you build and deploy large-scale,
enterprise-wide data platforms,
and how do you fit that into a broader enterprise architecture.
I left consulting in 2020, right at the height of the pandemic, and moved over to
Snowflake because I had been doing almost exclusively Snowflake implementations
for at least the four years before that.
(02:54):
So I've been in and around the Snowflake ecosystem for a long time.
And I have been at Snowflake now for just shy of four years.
For the first couple of years, I ran a number of different sales engineering
teams, focused first on large accounts in the greater New England area.
And then for the last two years, I was running the sales engineering team
focused on large life sciences organizations.
(03:15):
So pharmaceutical, medical device, medical distribution, CROs.
And starting in February, I moved into our new data cloud architecture team, focused
on how do you fit Snowflake into a broader enterprise architecture.
That's my complete end-to-end life in the data space at this
point. What do you do outside of the data space?
(03:35):
What kind of hobbies or interests do you have? Or do you work so hard you have no hobbies or interests?
Well, in the last month or so, let's see, I've been on 10 planes in the last 20 days.
So I don't really have time for other things outside of that right this second.
In general, I have two young children whom I adore
immensely and spend most of my time entertaining or finding ways to entertain.
(03:58):
I also like to play golf like any good data executive likes to do.
Those are my main hobbies if I have time for those things.
And we'll see if that ever comes back, if I ever get more time for that.
So is that a newly formed team at Snowflake? Yeah.
So historically, when our customers had come to us and said,
hey, what is our recommended architectural approach?
(04:20):
Hey, should we run a data warehouse? Should we transition to being more of
a data lake type of construct?
What do you guys think about data fabric? Definitely getting a lot of people asking about data mesh.
In the past, our answer was, oh, you can do anything you want.
As an organization, we're really trying to get our customers to think about
Snowflake, not just as a cloud data warehouse or cloud data lake or something
(04:44):
of that nature, but really understand that this is a full-blown end-to-end data platform.
And so we've built an architectural framework that we call the data cloud architecture,
to help our customers who want to truly adopt Snowflake as a platform and get
better value out of the data that they have.
A lot of this framework is around how do you tie value to what you're building?
(05:07):
And how do you treat the things that you're building not as tools for decision
making, but instead as an asset that has a measurable ROI?
In the past, when a client said, hey, where does this fit in my larger architecture?
The answer would naturally be go to consultant X who's worked across a lot of technologies, right?
(05:29):
To help you pull that together. So now it seems like Snowflake's building a
team that has an opinion, which is fine.
I find it interesting to see how clients react to that, because you're still getting
the advice from the vendor at the end of the day.
Yeah, so one of the things that I tell customers from the jump
is that my team is certainly going to look at this from a Snowflake-first perspective,
(05:52):
but we're not going to look at it from a Snowflake-only perspective.
We are a little bit trying to provide some level of free consulting to help lay that foundation.
Now, we're not going to then turn around and do your whole implementation for
free for you all the way through, right?
We're going to rely on our partners like Cervelo to actually take this sort
of architectural vision and implement it and put it in place.
(06:14):
Because there's a lot to it. It's not just the technical underpinnings, right?
There's a shift in how you think, and there's a change management process that
has to be taken into account.
So as you're talking about this, I guess because Snowflake made this shift,
it sounds like that's been a big trend, right?
Clients really trying to figure out not just what is this technology itself
(06:37):
in isolation, but how does it interplay with all the other things I'm trying to do in my stack?
The question you're trying to answer for them is, here are the other pieces
of my stack. Here are the other things I'm trying to achieve.
Help me put this wonderful modular puzzle together that has been created as
the quote-unquote modern stack.
(07:00):
It's modular, right? That's usually what people push for.
I know companies are moving in certain directions, but that's typically what people push for today.
Are there any other trends that you're seeing out there? I mean,
data mesh is the most popular conversation that we've had.
I mean, is it not the most popular conversation you've had in the last year
and a half? Like, really? No.
(07:20):
Really? Yeah. That's surprising, because the number of customers that have come
to me and said, Hey, I have this partner who's really pushing the data mesh paradigm.
What do you guys think? And how can Snowflake support this data mesh strategy?
Right? It's outrageous. I think last summer I had this conversation with more
than half of the customers that I supported.
(07:42):
Everybody likes this idea of domain ownership and putting the capabilities and
the infrastructure in the hands of the people who actually understand the data.
On the flip side, when it comes to a true data mesh, the idea of domain ownership
includes the idea that you have engineers inside of each domain who will
ingest, process, and create these data products.
(08:05):
And I have a whole beef on the term data product, which we can get into later.
But there's a level of sort of organizational change that goes into how you
staff. Yeah, or how you organize period, right.
So from our framework's perspective, from this sort of data cloud architecture
perspective, we're taking a lot of the positives of data mesh in terms of,
we want to enable that free flow of data and applications across your ecosystem,
(08:27):
we want to make it easier for you to collaborate with your business and with your partners,
and with your customers even, but not do it in a way where you have to actually
transform how you run your business.
And I understand that there are consulting firms out there who specialize in
business transformation.
So this is very appealing to them, because there's a huge project that comes behind this.
So I think that's one. The other is obviously generative AI and everything around
(08:49):
the whole generative AI space.
And I know when I was listening to your first episode, I was texting Sandy with
all sorts of opinions on this.
Well, that's how you ended up on the podcast. You were texting me right after
our first episode, and you were sending me these rant texts about Gen AI.
And I was like, wow, we got to get him on. He's ready to go.
I was live texting you as I was listening with my live feedback. Yes.
(09:12):
But would you expect me to do it any other way, Sandy? I mean,
come on. No, absolutely not. That's what I love about you, James.
I think when it comes to data mesh, yes, we do have clients who are thinking about it.
We don't push it as a consulting firm. And I think that's because we know what it takes.
And when I can't get a company to even think about how
to model data for consumption, why am I going to start pushing a concept that they're not ready for?
(09:34):
As a consultant, and I'm not even going to talk about our company,
but as a consultant, I personally have a challenge with that,
with the push of new concepts to clients when they are not mature enough to
get to that space. That's something that I holistically will not do.
I want people to be successful in what they're trying to achieve.
That's my goal. So yeah, I don't push it personally.
(09:56):
I think data products, I know you don't like that term. I don't like many terms
because people define things however they want to define them.
So I always say, you have your definition, I have mine. Let's just make sure
we all understand where we're coming from is kind of my goal in life.
But data mesh, yeah, we have clients who have talked about it.
And most of the time, they're looking at it as a tooling question.
(10:19):
What are the technical pieces I need to create a quote unquote data mesh? And I tell them,
that's nice, but that's not going to solve your problem. Well, yeah,
I mean, I agree with the maturity aspect, but there are companies out there
who are structured in this way.
An extremely large e-commerce retailer here in the Boston area is structured that
way, right? That's how their business is set up.
(10:41):
So fine, you know what? That makes sense.
Good for you. Good job. For everybody else, yikes.
So, my beef about data products: in my opinion, a data product is something that
you build in order to sell to the rest of your mesh.
You don't know that anybody's going to use it. You just think that it is.
But you have to like spend cycles building it in such a way that it feels like
(11:02):
an actual product, which in my mind actually lowers the ROI that you might get
out of that because your input costs are higher.
Your resource costs are higher. The technology costs that you're using in order
to make this thing look like a product are probably higher.
And you could probably do something with lower input costs and get the same
level of adoption across the board.
That's why we're really focused on this idea of assets and tying everything back to ROI.
(11:25):
Because then it's really about are you building the right things?
Like at my previous consulting firm, we did a project for a large hospital network
in the greater Massachusetts area.
And we built them a dashboard. There was a lot of data that came into
this, and it was a pretty hefty project and program that we did.
It was a bed management dashboard where they would be able to see sort of the
layout of how many beds were available and so on and so forth,
(11:48):
and some level of a light forecast of where the beds
would be, so that they could do their staffing in such a way that,
they told us, they saved something in the realm of $70 million in staffing costs.
And the fact that they like made an effort to track that number tells me that
they actually understood the ROI that they were going for.
(12:09):
Yeah, I think I look at it similarly, but differently than the approach you just
laid out, right? I guess, James, with the example you gave, right?
There was a specific ROI that we were targeting, a very pointed product that was being created.
I look at something like that. I'm like, all right, that's the goal of the business.
That's your business strategy, right?
And ultimately, there's a product that's tied to that.
(12:29):
But I look at all the data elements that are tied to that product.
And I think to myself, this concept of, because this is what happens in organizations,
right? They have this ROI item that they're trying to achieve.
And they're like, all right, we need to get all this work done for this one
use case. And then all the work that is done for that use case is only for that use case.
Everything around it gets orphaned. That big data product probably includes
(12:51):
20 others underneath it, pieces and components of that product that are very
specific to domains that other people can re-leverage.
So you need to distill down what you're actually trying to do into components
so that people can re-leverage that information.
And then the ROI goes way up. And not only for that one project that you just
finished on the bed management side, but the reuse of those analytics and those different
(13:15):
data products that made that larger one, right?
That's the way I look at it. But the problem is people don't think that way. Right.
Agreed. And this is why, again, we're focused on assets and
why we're focused on ROI, because to your point, there are 20 different other
things that are part of that, that go into building this
broader, larger asset.
And my issue with data mesh becomes that you end up actually building
(13:38):
in silos and then trying to share at the end. Yeah. Right.
Versus building in a collaborative fashion, which is ultimately how you're
going to drive the most efficiencies across the board.
And so part of our goal as a data cloud architecture team is to help to drive
those efficiencies and help reduce that and drive that collaboration by making
it clear who owns what and what's out there.
(14:00):
Anjali, from a data mesh perspective, have you been hearing anything around
that? I hear a little bit of chatter, but I think it's similar,
Sandy, to the approach that you laid out, in terms of
our clients aren't ready for
kind of what it takes to be successful with a data mesh type of approach.
(14:21):
And one of the key foundations for being successful with the data mesh is having
a highly governed approach to your data, setting up a standard to say,
in order to be considered a data product in our mesh,
you need to meet these handful of criteria and actually track against that.
(14:41):
In the consulting space, I always remember governance being the third rail of a proposal.
Don't put governance in there because nobody's going to buy the project or it's
going to be the first thing that gets pulled out.
MDM and governance were the two things that always got stripped out.
It's just like, I just don't even touch it. Those who toil in the governance
space on the consulting side, I feel the most sympathy for ever.
It's not- You're talking directly to Anjali. I know. That's why I'm talking.
(15:05):
I'm directing this at Anjali and telling her how bad I feel for her
in this particular scenario.
Thank you for that. You know, a lot of our clients are actually struggling with,
you know, with data challenges and have opened up their eyes to the need for
finding accountability and focusing on the confidence and fidelity of their data.
(15:30):
So I think that as we kind of look forward, maybe governance will no longer
be that forgotten child and really be the golden child of the family.
You know, back to the other topic that's been very popular around generative
AI and everybody needing an AI strategy.
So at our summit last year, Frank Slootman, our former CEO, on stage said, you can't have
(15:51):
an AI strategy without a good data strategy. It doesn't surprise me,
Anjali, that governance is less of a problem and people are looking at it more
because they're being asked to build an AI strategy.
And you can have the best LLMs and the best generative AI capabilities and the
best chat bots in the world.
But if your data quality isn't there, you're training models against trash.
(16:13):
So you're going to get trash back.
The good thing about Gen AI is that it's
highlighted the fact that good quality, sound
data and mastering of metadata are
critical to be able to do something like that. Right
now everybody wants it because it looks so cool, and I'd love
to chat with my data someday. Hopefully we'll get there.
But you can't get there if your data is out of quality. You can't get there if
(16:36):
the context around your data is aloof, right? This is the conversation I had
yesterday with the CEO. Everybody wants a chatbot, and I refuse
because they don't trust the data underneath it. And then they all go off and
build in their own little world.
And then there are four different definitions of ACV, or whatever it is that
they're looking at, and revenue, or, you know, four definitions of churn, and none of them are right.
(16:59):
And then when you ask them, well, how do you define that? I would say maybe
it's not that none of them are right. Maybe all of them are right.
It's just a matter of the intent behind it. What's my intent?
What am I trying to drive?
It's intent, it's context, and it's quality of the data asset itself that you're going after.
Ultimately, none of this is going to work if companies don't bring together
(17:20):
governance with the data teams and the AI teams and the digital teams into one
package and figure out how do we collaborate to solve these problems.
Because if everybody keeps working in isolation, A, they're only going to have
part of the building block.
No one's going to know how to use it. No one's going to trust it.
And it's just going to be a mess again, except on modern platforms.
(17:41):
Well, you keep saying modern platforms. And this is a conversation I've been having as well,
where people in the sort of mid 2010s started to migrate off of their on-premise
data centers into the cloud with the intent of, oh, we built all these silos on-prem.
We're going to move into the cloud and the silos are just going to magically disappear.
(18:02):
And they picked up and moved their silos. Yeah. And then they kept being siloed in the cloud. So
the modern stack doesn't always apply if you're not breaking down the
silos. And, you know, back to the whole governance aspect,
good governance is a double-edged sword here, right? There is such a thing as
too much governance, right? If you make it too hard to participate, everybody's
(18:24):
going to go off and build in their own silos, and shadow IT runs rampant.
My whole thing is, this modern stack has to be in conjunction with a modern
way of thinking, right? Modern ways of working too.
Yes. A hundred percent. Right. And you can't. Yeah. I mean, you're talking about
governance and I agree with you, right?
I look at this, and I know Anjali agrees with me as well: governance should
be focused on outcomes and impact that you're trying to achieve.
(18:47):
At the end of the day, it's an enablement factor, right? What am I trying to
enable? And then building a process that supports that use case or enablement that
you're trying to go forward with, whether it's mastering of data,
you know, reviewing quality and building a process around that,
or even security and masking policies, etc.
But if you're creating processes that restrain the business from doing things,
(19:11):
then you're doing governance for the sake of governance.
And you're actually, you know, not doing it for the sake of the outcome.
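To make that enablement point concrete, here is a minimal sketch of what a column-level masking policy can look like on Snowflake, applied from Python with Snowpark. The connection parameters, table, column, and role names are illustrative assumptions, not details from the conversation.

```python
from snowflake.snowpark import Session

# Placeholder connection parameters; fill in real account details.
session = Session.builder.configs({
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "role": "GOVERNANCE_ADMIN",   # hypothetical admin role
    "warehouse": "GOV_WH",        # hypothetical warehouse
    "database": "HR",             # hypothetical database
    "schema": "PEOPLE",           # hypothetical schema
}).create()

# Define the rule once: HR analysts see real emails, everyone else sees a mask.
session.sql("""
    CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('HR_ANALYST') THEN val
           ELSE '*** MASKED ***' END
""").collect()

# Attach the rule to the column so it travels with the data,
# instead of being re-implemented in every downstream tool.
session.sql("""
    ALTER TABLE employees MODIFY COLUMN email
    SET MASKING POLICY email_mask
""").collect()
```

The point is less the syntax than the posture: the restriction rides along with the data while people keep working with it, which is governance as an enabler rather than a gate.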
And I think people go one way or the other on that. And it's tough,
because the first conversation people typically have is,
yeah, what's the organization I need to have for governance?
It's just like, I don't want to answer that question. I actually don't care.
The question I want to answer is what do you need to solve for and put the right
(19:32):
process in place for that?
Yeah, I mean, ultimately, governance needs to be an enabler for innovation.
It cannot be the inhibitor to innovation.
It has to be flexible enough to allow for innovation.
Exactly. One question I had, you know, I've been reading a lot about some of
the bold moves that Snowflake has made around MLOps, LLMs, etc.
You guys have kind of taken a shift into the world of advanced analytics, data science, etc.
(19:58):
So I'm curious, what's the thinking behind that?
And how do you envision that shaping the future of data management?
Because historically, right, and I love this idea, by the way,
because I'm all about bringing these worlds together.
As I just said, these teams have to collaborate, they have to be working on
the same platform, not platform necessarily, but the same data sets,
(20:19):
right, the same data products.
So is it really just a shift in that direction of trying to bring these worlds together?
So I think there's a couple of different answers. And I don't
promise to speak for our product organization and for our leadership in that way.
But yeah, you know, we've made a lot of strategic acquisitions in the last 18 to 24 months, right?
(20:42):
Companies like Neeva, which is where Sridhar, our new CEO, came from, right?
And partnerships as well, where we're trying to, again, be a more robust
platform and bring these capabilities.
But at the end of the day, I have never worked for a company or even when I
was on the partner side, worked with a company who listens to their customers
as much as Snowflake does and makes decisions in that direction.
(21:05):
Many people in the sales organization who have been here for a long time can
point to specific features and then point to the specific customer who asked for it.
And this goes back to things like Snowpipe. You know, data sharing came out
of a very specific ask by one of our largest accounts.
We want to be able to make this data available to our customers.
This shift that you're seeing comes from a desire from our customers,
(21:27):
from some of our largest customers, to consolidate from a tooling perspective, right?
You know, you look at that slide that gets published every year of the tools
of the modern data stack, and it just keeps getting bigger, right?
It's just like every tool under
the sun all of a sudden fits under this concept of modern data stack.
And there was a scenario in sort of the early 2020s,
(21:49):
so in the last couple of years, where everybody was buying all these
bespoke tools to do this job, and IT spend was running rampant,
and then the pandemic hits, and then everybody's trying to save money.
And they're like, wait a second, why do we own Dataiku and DataRobot?
What is happening here? This doesn't make any sense. So from a Snowflake perspective,
(22:10):
right, for many of our largest customers who have consolidated and brought a
lot of their data estate
to be stored and processed using Snowflake, they wanted to say,
okay, wait, I want to be able to code in Python, right?
I want to be able to write data engineering pipelines in Python,
here is Snowpark, right?
So you have this ability now to code in the language that you feel more comfortable in
to build those declarative pipelines.
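For listeners who have not seen it, here is a minimal sketch of the kind of declarative Snowpark pipeline James is describing. The connection parameters, table, and column names are illustrative assumptions; nothing runs against Snowflake until an action such as save_as_table is called.

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

# Placeholder connection parameters; fill in real account details.
session = Session.builder.configs({
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "TRANSFORM_WH",   # hypothetical warehouse
    "database": "SALES",           # hypothetical database
    "schema": "RAW",               # hypothetical schema
}).create()

# Declarative transformation: these calls only build a query plan.
orders = session.table("ORDERS")   # hypothetical source table

daily_revenue = (
    orders
    .filter(col("STATUS") == "SHIPPED")
    .group_by(col("ORDER_DATE"))
    .agg(sum_(col("AMOUNT")).alias("REVENUE"))
)

# The action: Snowpark pushes the whole pipeline down to Snowflake as SQL
# and materializes a table other teams can build on.
daily_revenue.write.mode("overwrite").save_as_table("ANALYTICS.DAILY_REVENUE")
```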
(22:30):
But now my data science teams want to use Snowpark, and we're not necessarily
optimized, you know, our warehouses aren't necessarily optimized for those types of workloads.
Okay, here come Snowpark-optimized warehouses. And then, now, we've been
building applications. We love using Snowflake to power some of our analytical applications.
We want to be putting more of our applications and consolidating this into Snowflake.
(22:51):
Great. You guys build with containers.
Awesome. Snowpark Container Services. It's very much a feedback loop with our customers.
And yes, all of these things have the word snow in them. And we all have to be okay with that.
They will always. And we're getting over it. Yeah. I think about 1999 moving
forward in the on-prem world, let's say Oracle, right?
The big challenge back then was that Oracle went off and bought all these companies
(23:16):
and tried to integrate them.
And as a consultant, I remember saying to them, yeah, they bought all this stuff,
it's one tool, but you still have to self-integrate everything.
It's painful, right? It's painful to integrate all these platforms.
And what I find intriguing and interesting about where we are with cloud today
is that everyone's pushing for at least the idea of these modular solutions
(23:36):
where everyone has to have all of these, the best, and, I don't want vendor lock-in.
I want to be able to swap things out.
You're not going to do that. Like I look at it, I'm like, you're not going to do that.
You're just not. Vendor lock-in is my least favorite term.
Vendor lock-in is the funniest conversation I think I have because I'm just
like, you're trying to say the problem we had in 2000 was vendor lock-in.
(23:59):
That was not the problem.
The problem was technology was not advanced enough to allow you to change.
It's advanced enough now, regardless of what vendor you use,
as long as it's cloud-based. It's actually pretty easy to move.
I do love the idea of the consolidation that is starting to happen in the cloud
where companies are getting smart about, all right, how do we do this?
But they're trying to do it. Some do it well, smartly in terms of if I'm adding
(24:22):
capabilities, it's built through our platform.
Some people are acquiring tools and solutions and trying to integrate them in
the cloud. And then it gets a little clunky and scary.
We're now in a place where things are starting to get consolidated because of what you said.
Look at the map. There are a billion little dots and icons of companies
on this giant capability map, and people are overwhelmed and confused.
(24:44):
And I think Snowflake is and will always be a single SKU.
When we make acquisitions, and when we build out net new features and capabilities,
It's not something that you then have to turn around and pay for another license for.
You buy your bucket of credits, and then you use them as you see fit.
So back to your question about these decisions around LLMs and MLOps and
(25:05):
all that kind of stuff, right?
For us, it's all about how we are driving that
customer satisfaction and that NPS score, right? Like, we at Snowflake do not
have a customer success org; that does not exist at Snowflake.
I didn't even, you know, I've worked with you guys since you worked with us years ago.
And I didn't realize that this entire time. So, yeah.
(25:28):
And there's no customer success office. Like, the idea of a customer success team
not being there. Support,
yes; success, no. And that's an interesting differentiator,
actually. So I appreciate that.
I do want to pivot and give you the opportunity to continue the conversation
we had over text regarding Gen AI.
We have 15 minutes and I think it's important for us to cover this ground.
(25:51):
When we were going back and forth, we were talking about the change in work, right?
How is Gen AI actually going to impact work and maybe not just work for the
general masses, but work for data teams?
Yeah, so I want to go there with you. I'm curious about your opinion.
I'm reading back through our texts to make sure I'm tracking to what I said.
(26:13):
I think you made a comment about how people think that they're going to lose their jobs.
Yes. But this is also what people who had tied their whole life to Oracle or
Teradata, the DBAs of the world, felt when people started moving to the cloud.
They pushed back really hard because they felt threatened. It is okay.
It is okay to feel threatened.
(26:33):
It is okay to feel like your job might be at risk. It's
not okay to stop innovation because you don't want
to change yourself, right? I think the best advice I could
give to anybody in this space at this point is to be
flexible. In my own career, right, I started as a visualization guy.
I was a front-end BI analyst kind of person,
right? And as the cloud became more prevalent, the BI end of the spectrum
(26:56):
became less relevant, and for me it became important to diversify my skill set
and be able to get an understanding of how, ultimately, these BI products get fed with data.
So I would say to anybody who is feeling threatened by generative AI:
somebody's got to build the model. Somebody's got to tune the model.
(27:18):
Somebody has to deploy the model. Somebody has to own the application that then
the users interact with.
If you feel like your job is going to be eliminated because you're not building
ETL pipelines and you're not building dashboards anymore, then go take a course on LLMs.
Go take a course on GPUs. Like, I understand that these are extremely technical
constructs. And so it takes a lot to learn about them.
(27:40):
If your other option is to just be a curmudgeon and not let innovation happen,
because you're worried about your job, that's not ideal either.
These things will happen whether you get in the way or not.
I'm going to print "curmudgeon" t-shirts.
I totally agree with you.
And I kind of mentioned this during the first episode. When I was a front
(28:07):
end developer, I built BI solutions from scratch.
And the pain and agony of
seeing a what-you-see-is-what-you-get, drag-and-drop
WYSIWYG BI platform in 2003,
I recall when I first saw it from Oracle. Yeah, I almost had a heart attack because
(28:28):
it was pre-OBIEE, it was kind of the precursor to OBIEE, and
I still remember that I almost had a heart attack. I didn't let anybody know
that I was scared out of my mind, and I pivoted.
I think the thing here is for people to realize that somebody has to design these solutions,
(28:48):
you have to understand how these things are going to work. And you have to understand
data quality is part of the problem.
All the issues surrounding the work we do are still going to exist.
It's just that how we solve those issues is going to change. And you have to get
uncomfortable more often now than we did in the past.
I had a customer, he basically was like, so can I just train a model and get rid of ETL entirely?
(29:11):
And I was like, no. I was like, in theory, sure.
But is the cost of training that model going to be so great?
And oh, by the way, is it going to perform like crap? Yes.
So you're not going to get rid of ETL. And in the same way that,
you know, Sandy, you and I had this debate over text about dashboards, right?
I understand that the chat interface will make it easier for users to interact
(29:33):
with the data in a way that is more exploratory.
But what that's really going to get rid of is more of the Excel type of model
of exploring and slicing and dicing data in certain ways.
I pray to God that that happens. From your lips to God's ears, honestly.
But there is still going to be a space for dashboards.
It's not going to eliminate other ways of interacting with data,
(29:54):
except for obviously in the more of the exploratory style, right?
You're going to get a lot more people who are going to ask questions of data
themselves and not go ask a BI analyst to go pull a report for them to answer
one specific question. I think those days are gone.
I think those days are, they need to be gone.
I want everybody to move towards decision intelligence, where you're actually
(30:17):
supporting decisions people have to make in the workflow in which they're making them.
That's where I think this all needs to go towards. And the days of I go to a
dashboard need to go away.
I spent probably the first 10 years of my career doing nothing but front-end
solutioning. And now all I can think about is getting rid of it.
(30:37):
I'm just like, I'm so sick of it. I just want to help people make the right
decision at the right time, in the right workflow, et cetera.
I think that needs to start happening sooner rather than later,
but there's so many barriers to that because BI teams are still working in isolation
from the AI team, from the digital team.
I think I understand where you're coming from, but I think that the dashboard
(31:02):
paradigm is never going to go away, right?
And even if the dashboard paradigm shifts to being basically,
how do I embed KPIs into your workflow?
That still has a dashboard-like feel, right?
And part of our debate over text was around operational, right?
The operational dashboards will continue to exist, period.
That's just how it's going to work, right? So I get decision intelligence and
(31:26):
I get making that point. But like, when it comes to a user interacting with
data, you have to make sure that it fits in terms of their own maturity.
Putting it all in chatbots is good, except for the users who don't know how
to ask questions, who just want the specific answer to the specific question
(31:47):
that they ask themselves every day to do their job. It's that bad.
You don't know how to ask questions? Wow. I mean, come on.
You and I have spent enough time with enough clients in this world who ask the worst questions.
Let's be clear here. I plead the Fifth on that one. I am not subscribing to that, James.
(32:07):
Well, you still have clients to deal with. I get to be on the vendor side.
I don't have to do that. There are no dumb questions.
Well, I do tell people there's no such thing as a dumb question,
only a dumb answer. But it doesn't mean you know
how to ask the question, right?
I mean, that applies to generative AI as well, right?
I mean, there are dumb answers that come back from these models as well.
(32:29):
We have to be able to, one, accept you're going
to get a dumb answer, but to put guardrails in
place, the human in the loop, to make
the decision around whether or not that's the answer we can actually use
and trust. Yeah, I could not agree with that statement more. As much as
ML and this idea of prescriptive analytics, in terms of actually
(32:49):
letting the computer make decisions, there's a world where I understand
why that happens. So, like Google Maps, right? Nobody is sitting at the terminal,
seeing all the Google Maps requests come in and then telling them
which directions to go, right? A
computer is making that decision. I get that. And guess
(33:09):
what? Google Maps does it wrong a lot. And so
you, as the user, have to decide, I trust
this, or no, I really don't. So my family and I
went up to northern Maine for
the eclipse last week. It was beautiful, it
was a beautiful day, the coolest experience I've ever had, but
what should have been, at most, a five-
(33:30):
hour drive took eight hours to get home. Google Maps just
didn't work, and everybody ended up in the area at
Franconia where 93 and 3 come together, and they went
five miles in five hours. I
agree. I think we can find instances where
it's not going to work, when there's an event that is so massive
in terms of the number of people jumping on the highway, and the
(33:51):
information doesn't get back fast enough for it to update its
algorithm. Yeah, of course. But I
looked at the map and I said, I don't
like what this route is, and I had to go manually put in a point on the map to
force Google Maps to take me in that direction, right? So that's sort
of the manual intervention that you're talking about, Anjali, of, I as a user
(34:14):
have to be smart enough and say, this doesn't make any sense. Gen AI
might get rid of some level of IT jobs, in theory.
Yeah, I don't think the jobs are going to go away. I think they're just not
going to scale the way they've been scaling. Sure.
That's kind of where I'm at. The scalability is going down a bit.
Like you don't have to have thousands of data engineers if you're a data heavy company.
(34:36):
That's going to probably simmer down a bit. You'll be able to do a lot more
with less. So I don't see anything completely disappearing.
However, I will say, the logo for this podcast, a logo for a friend's Facebook group.
She was pinging me yesterday, just two days ago, asking for a logo.
Hey, you did this AI thing for your podcast logo. Can you go do one for me?
(34:57):
And it was like around the barrel, I think it's called. She's like,
it's a new Facebook group, but I'm trying to do it. I was like, fine.
So I literally opened up my phone.
I didn't even use my computer. I had chat GPT on my phone.
I asked it and within the third hit, it had it. Five minutes later, she's got a logo.
And so I feel bad for all the artists out there on Fiverr who were creating
logos for five bucks or a hundred bucks a pop. That industry is gone.
(35:20):
So apologies to them. But I think that's going to be the tough one for us.
I do have to wrap this up. I have a client call that I had scheduled after this,
unfortunately, so I got to run over there.
But James, James, what is your last piece of advice to anybody who's scared
to death in terms of, I'm a data engineer, I'm just starting my career.
(35:40):
What should I focus on? Learn Python.
There's a lot of people who will tell you, oh no, SQL will never go away.
Yeah. They're not correct.
From a declarative engineering perspective, Python is where people are moving.
And Python is super flexible. I said earlier, being flexible is the most important
thing that you can do for yourself.
And I think, at least in terms of what's happening in the next five years,
(36:00):
having an understanding of Python is going to allow you to do a lot of things
and help you be successful.
So if I had any advice for a budding data engineer, the first thing I would
tell you is to go learn Python, learn Python.
I love it. Well, thank you, James, for being on.
Of course. Thank you for having me. I mean, it was exactly what I expected.
I expected this from you. So I appreciate that.
(36:23):
And we probably need to have a round two in 10 years to figure out was I right
about operational dashboards?
Or was James right about operational dashboards?
I'm putting the bet down now. I'll Venmo the money to you and let's do it.
Done. Thank you. Well, have a great weekend. Bye, Anjali. I'm off.
Music.