
June 12, 2024 36 mins

Join Sandy Estrada and Anjali Bansal in this exciting episode of "How I Met Your Data" as they welcome Junaid Farooq, a seasoned expert in data governance and strategy currently leading enterprise data initiatives at First Citizens Bank.  Junaid shares his journey and insights on data democratization, data governance, and the impact of AI on data management.

Discover how First Citizens Bank navigated through significant acquisitions and the challenges they faced in integrating data and aligning on culture. Junaid also discusses the essential disciplines of data management and offers advice on staying versatile and continuously learning in the ever-evolving field of data.

Tune in for an unscripted, insightful conversation.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Hi, Sandy Estrada here. Welcome back to another episode of How I Met Your Data.
Anjali and I are thrilled to be joined this week by Junaid Farooq.
He is a seasoned expert in data governance and strategy.
He's currently leading enterprise data initiatives at First Citizens Bank.
I first met Junaid in April at FIMA Boston.
It was clear to me then that we had to have him on the podcast.

(00:22):
Junaid and I had so many conversations on data democratization,
data governance, data strategy. He left a lasting impression on me, to say the least.
So as we go into our seventh episode, I just want to take a moment to thank you, our listeners.
We've gotten a lot of feedback, some via email, some via LinkedIn,
some via one-on-one meetings that I've had with folks.

(00:43):
And we're just very thankful for you continuing to listen and continuing to
just kind of be there with us in these conversations.
And as we continue to have these conversations and record these episodes,
I am personally growing more and more excited about our decision to keep them
topic forward, but completely unscripted.
And the conversation that you're going to hear today with Junaid perfectly exemplifies

(01:05):
what we were looking for when we took this approach.
We cover so many topics, and I'm not going to go through them right now.
I'm going to let you discover those as the conversation unfolds.
So without further delay, here's...
Music.

(01:38):
So, Junaid, thank you for doing this again. Ever since I met you at FIMA in Boston.
I was immediately, I think I came up to you after our first day hanging out
together there and said, I would like to have you on my podcast.
Really enjoyed just kind of
your perspective on the topics that we were talking about at hand, right?
Data democratization, data governance, data strategy. I mean,

(01:59):
we covered a lot of different topics during the FIMA Boston event.
So thank you again for taking the time to be here with us today.
Yeah, thanks for having me. I similarly felt the same.
I just loved your energy, your passion for the topic.
And it's always fun meeting others who have the same energy and passion on the
topic and similar views.

(02:20):
Yeah, absolutely. I agree. I think your passion came through as well.
And I appreciate that. I always find that if I'm energized, it's because of
everybody else, not necessarily my energy.
I'm just feeding off other people's energy. That's a plus.
I always think about something we said once: at our experience level, we didn't study data.

(02:41):
We didn't choose this as a career. Many of us just wound up here.
If you wound up here, it's generally because you really enjoy the topic.
I have this joke. I say it like no one gets into the data game for the accolades and for the applause.
It is a tough, tough business and you have to really love it to stay in it.
If I have my data governance hat on, no one's really happy to see me, right?

(03:02):
If I have a different hat on in terms of data quality, our data is never good
enough. No one's ever come up to me and said, you're done, Junaid.
We've achieved our utopian state of data quality. It's a tough field and you
have to love it to be in it. Yeah, absolutely.
Maybe where we can start is a little introduction to you, if you could share
with our audience a little bit about yourself.

(03:23):
Sure. I'm currently at First Citizens Bank. I lead our enterprise data governance
function, and I also lead our enterprise customer data function.
My job's been very interesting. First Citizens Bank has had two major acquisitions in the last two years.
They acquired CIT Bank in January 2022, and they doubled in size when they did
that. They went from roughly $40 or $50 billion to $100 billion.

(03:46):
And that put them into a category that required some regulatory scrutiny.
So data governance and data management became a focus at that time.
And then last year, out of nowhere, First Citizens Bank also acquired Silicon Valley Bank.
And with that acquisition, they doubled again. So roughly $100 billion to over $200 billion.
And quite suddenly, they're a category four bank. And so work has been exciting.

(04:10):
You can imagine bringing three institutions together.
Even in a single institution, there are so many different views and opinions on data.
And even within the same institution, you sometimes struggle with lexicon and vocabulary.
And then when you start assimilating two other large institutions,
you can imagine the cultural dynamics, the varying views that everyone has on

(04:33):
data and approaches to data.
So it's been an interesting journey. Silicon Valley Bank was probably very modern,
very forward thinking, a little different than a more traditional bank.
Yeah. One of the benefits that I think we had in acquiring Silicon Valley Bank
is they had a very good data program in place and they were headed in the right
direction on a lot of things.
And so they were pointed in the right direction in terms of their data maturity journey.

(04:56):
And there were a lot of synergies that we found. There were some things that
Silicon Valley Bank was further along than we were. And then there were other
things where we were moving further along.
Bringing those efforts together, there's definitely a lot of synergies to be
found. But culture is very different.
And you know, as you and I said when we were talking about data strategy,
strategy is great, but as the saying goes, culture eats strategy for breakfast.

(05:17):
Aligning on culture, aligning on literacy, essential to anything data related.
So you have mentioned that you're leading data governance, a bit of a politician in that
respect, but also you're in charge of the customer data strategy as well.
How pervasive is AI? How much pressure is there within the bank in terms of

(05:39):
the things you're trying to do with AI at the moment?
Is there a ton of pressure around it or is it very measured because of the high
regulation being part of a large financial institution?
Yeah, I'd say the latter. First Citizens has a history of doing everything in a measured way.
And we are generally not the first to market in these spaces.
I think it's good to sort of observe what's happening and where things are heading.

(06:01):
Do you remember a few years ago, Sandy, like people were all about blockchain,
right? The way that we talk about Gen AI today, it was blockchain.
And there was going to be some component of blockchain in everything.
And like, where are we today with that, right? So definitely a very measured approach.
I do think any institution would be foolish not to at least do some exploratory

(06:25):
work in this space. My personal opinion is that this is definitely more real than blockchain.
And I do think this is going to be a game changer. We're still very much in
our infancy, but I do believe that this is not going to go away the way that other things have.
Yeah. I believe that Gen AI has put a highlight on AI just in general,

(06:47):
right? So even things that have been around for a while, right?
Everybody was hot on machine learning over the last five years.
And now it's even accelerated because a lot of organizational executives are
coming to individuals saying, what can I do that's different and novel?
You know, starting with Gen AI as the point of the conversation,
but then that evolves into other topics and areas that AI can help with,

(07:10):
which then brings along a lot of machine learning opportunities.
I, for one, find that there's still this challenge of,
yep, these are all great ideas and the technology can move quickly,
but most places still struggle getting their house in order in terms of ensuring
they have the right data and data quality.

(07:31):
Yeah, listen, I think that there are a few fundamental things. The term
Gen AI and the work around it sounds so exciting.
It's hard not to sort of get wrapped up in it.
But this work, these initiatives, these platforms and these technologies are
still dependent on the things that we've been working on for the last decade.

(07:52):
Good data governance, good data quality, excellent metadata strategy.
Like these are still foundational.
And I think that these are essential. Poor data quality will skew results, will create bias.
You know, it's interesting how we're so ready to try this and look at ways that
we can leverage it without really understanding the criteria needed for the foundation.

(08:17):
Like, you know, what does a good model look like? How do we know when we're done training a model?
That's a very basic question that I don't think I've heard a good answer for.
You know, I would say we know that the content that's produced out of Gen AI isn't 100% accurate.
And so, how accurate is it? That's another question that we can't
really quantify very easily.

(08:37):
We're still in that phase of asking questions that we still don't know the answer to.
I'd almost be astonished if people felt strongly that they knew the answers to
every question in this space.
What I'm finding to be the case is that even if you think you're getting
part of the answer, the answer changes just down the line because there's a new innovation.

(08:58):
There's a new way of handling it. I went to Snowflake's event last week and
they were showing their front end to Gen AI capabilities for many different models.
And a lot of the use cases they showcased, it had citations.
You can click on the citation and it will show you the document it found the
answer from and the section of the document and highlight it for you.

(09:21):
So in the past, people said, well, I can't cite where my model's answer is coming from.
And they showed that you can.
You can do that. There's opportunity for that kind of engagement with Gen AI,
which is, again, moving the needle forward and answering a question that people
couldn't answer before, right? Yeah.
It's definitely moving in that direction. And I was super excited just to see

(09:41):
that one capability that I hadn't seen anywhere else.
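To make the citation capability described above a bit more concrete, here is a minimal sketch of the general pattern: each retrieved passage carries its source document and section, and the application returns that metadata alongside the answer so a front end can link to and highlight the exact passage. The function names, documents, and retrieval logic are hypothetical illustrations, not Snowflake's actual API.

```python
# Minimal sketch of answer-with-citation retrieval (hypothetical names and data,
# not Snowflake's actual API). Each chunk keeps its source document and section
# so the answer can point back to exactly where it came from.

from dataclasses import dataclass

@dataclass
class Chunk:
    doc: str       # source document name
    section: str   # section heading within the document
    text: str      # passage text

CORPUS = [
    Chunk("credit_policy.pdf", "3.2 Limits", "Unsecured exposure is capped at 250,000 dollars per customer."),
    Chunk("credit_policy.pdf", "4.1 Review", "Credit limits are reviewed annually by the risk committee."),
    Chunk("kyc_manual.pdf", "2.0 Onboarding", "Identity must be verified before an account is opened."),
]

def retrieve(question: str, corpus: list[Chunk]) -> Chunk:
    """Toy retriever: score chunks by word overlap with the question."""
    q_words = set(question.lower().split())
    return max(corpus, key=lambda c: len(q_words & set(c.text.lower().split())))

def answer_with_citation(question: str) -> dict:
    best = retrieve(question, CORPUS)
    # A real system would pass best.text to a Gen AI model to draft the answer;
    # the sketch just echoes the passage to stay self-contained.
    return {
        "answer": best.text,
        "citation": {"document": best.doc, "section": best.section},
    }

print(answer_with_citation("What is the unsecured exposure cap per customer?"))
```

The design point is simply that provenance travels with the retrieved text, which is what makes the click-through-and-highlight experience possible.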
That's an interesting point. Like, how do you create a roadmap, right?
Even if it's like a couple quarters, right? And say, okay, I'm going to go do something.
And you'd be foolish to think that you'd ideate on something,
create a pilot, define an outcome, understand the dependencies,
and then go do it even within a year.

(10:01):
Like even that's kind of aggressive. But if the pace of change is faster
than that, then you're almost always playing from behind.
You know, I'd say it's sort of tangentially related to this.
I was talking to somebody about data, data programs, and how I've been in and
out of so many institutions, both as an FTE and a consultant.
I've never implemented anything the same way twice, whether it was like rolling

(10:23):
out a data catalog or putting in a data quality framework or even setting up governing bodies.
And I always caution others if they feel like they have a data governance playbook.
There are thematic things that you will follow.
And there are themes and principles that will guide you. But you will never
do something the same way twice, period.
You shouldn't, right? Every institution will have its own sort of nuanced way

(10:45):
of working. And Gen AI just makes that exponential.
And now you layer in technology and the pace of change in technology.
And I told this person that I was looking at work that I did three and a half
years ago. I thought I could go back and leverage some of that work for what we're doing.
In many cases, that work is obsolete today.
The way of thinking about data, even three and a half years ago,

(11:06):
is so different today and has become obsolete.
And so that's another question that's come up, which is if we endeavor to do
something and we create a plan to do it and it takes 12 months,
how do we adjust for the change?
How do we be good about seeing the next corner in the road?
Yeah. Well, one, it makes me laugh that we're talking about playbooks and really

(11:28):
not having a playbook because I really think of it more as a checklist.
What are the handful of things we need to get right?
And, you know, we need to adjust those criteria to meet the culture of the organization,
where they are in their maturity.
But I'm curious, Junaid, because you talked about multiple acquisitions over the last few years.
So as you're onboarding your acquired bank, I guess, are you changing or really

(11:53):
reflecting differences in the banks and the cultures and their expectations
during those integration points?
Listen, I think there's, again, like lots to be said about culture and the cultural challenges.
You know, when we think about integrating data, our biggest challenge in assimilating
the data is that there are just different approaches that the different institutions have taken.

(12:15):
Hi, Sandy here. Pardon the interruption. Junaid is about to introduce the concept
of a CDE, and I wanted to make sure everybody was on the same page in terms of what that is.
So a CDE is a critical data element.
So for example, in a banking context like First Citizens Bank,
a CDE might include something like customer account numbers,
transactional amounts, regulatory reporting fields.

(12:38):
Now, these elements are critical because errors or inconsistencies in these
types of elements can lead to significant operational issues, compliance risks,
or inaccurate insights.
You can imagine what that could lead to in the context of a bank.
So with that in mind, let's get back to our conversation with Junaid.
We have this notion of CDEs. That's a thematic thing that most people have.

(13:01):
What is the most critical data that we want to govern?
We have an additional layer of nuance, which is: let's tier our CDEs.
So you might have a CDE that is derived from other CDEs.
And now what is the difference between a tier one and a tier two?
Getting alignment on these things is sometimes very difficult because you don't know when to stop.

(13:22):
We had one particular case where we had a CDE where, if you followed it to the
end, it had something like 53 or 63 underlying elements.
And we're not going to increase our governance from one CDE to 63 times that amount, right?
It's just not feasible. And so the question and the debate becomes, what's the right
number? Where do we stop? And how do we tier?
And things like that. So it's the nuance where we find our challenges.
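As an illustration of the fan-out problem Junaid is describing, here is a minimal sketch of walking a derived CDE's lineage to count its upstream elements. The element names and the lineage map are invented for the example, not First Citizens' actual inventory.

```python
# Toy CDE lineage: a derived (tier one) element fans out into the upstream
# elements it is calculated from. Walking the graph shows how quickly the
# governance scope can grow, which is the "53 or 63" problem described above.
# All element names are hypothetical.

LINEAGE = {
    "regulatory_capital_ratio": ["tier1_capital", "risk_weighted_assets"],
    "tier1_capital": ["common_equity", "retained_earnings"],
    "risk_weighted_assets": ["loan_balance", "risk_weight", "collateral_value"],
    "loan_balance": ["principal", "accrued_interest"],
}

def upstream_elements(cde: str) -> set[str]:
    """Collect every element that feeds the given CDE, directly or indirectly."""
    seen: set[str] = set()
    stack = list(LINEAGE.get(cde, []))
    while stack:
        elem = stack.pop()
        if elem not in seen:
            seen.add(elem)
            stack.extend(LINEAGE.get(elem, []))
    return seen

deps = upstream_elements("regulatory_capital_ratio")
print(f"1 tier-one CDE -> {len(deps)} upstream elements: {sorted(deps)}")
```

The debate in the episode is exactly where to cut that walk off: which of the upstream elements earn tier-two treatment and which stay out of scope.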

(13:44):
Yeah. Just in talking to organizations about their critical data,
I think one of the first challenges that we run into is really aligning on a
definition of what critical truly means.
Yeah. And then from there, starting to align data that fits that criteria for criticality.
Yeah, it's like, again, it's like we know that thematically we should identify
critical data and then the definition of criticality will vary from organization to organization.

(14:09):
And what's critical to someone in reg reporting will be very different from what's
critical to someone who sits in marketing.
Yeah, I love the comment you made earlier where you're not going to take a playbook
and repeat it in the next org or even division of an organization, right?
And apply it because it is unique for the reasons Anjali stated and then some.

(14:30):
Every organization even is structured differently and you need to restructure
how you do governance because the org structure is inherently different.
They may have business and IT, they may have something in the middle.
They may have a product team, an engineering team and a business team,
right? There's no one size.
Yeah, there's absolutely no one-size-fits-all, but you have to find that middle ground
of having that checklist, having maybe some frameworks that you can reuse.

(14:53):
You know, parts of it, to ensure that you're covering all the bases, right?
At least that's the way I think about it. So, and I also agree with you that
things are outdated as soon as you put them on paper.
One thing that I believe is completely outdated is how we think about data quality
within organizations and how we address it.
So I wanted to maybe jump on that topic with you a little bit.

(15:15):
Given that you're in a highly regulated environment, you're probably doing things
a little bit of an old school way, is my guess. There's probably some legacy systems involved.
If the organization were to move on to AI, what do you think would be critical
to get right before you start that?
I think that, without a doubt, there are probably three disciplines

(15:36):
in data that I would say are essential.
And again, this is debatable. You could add four or five, right?
And I would say your metadata strategy, like your underlying metadata, is crucial,
in having a really well-thought-out, robust
data catalog. Without a doubt, good data quality. You need data that is fit for purpose.

(15:58):
And you need to be really clear on what the purpose is and what the criteria of good looks like.
So data quality is another discipline that I would call essential.
And then your governance and oversight function, the policies and procedures
that define how you do your metadata strategy and your data quality,
the usage of things like AI. Those are the three essentials.

(16:18):
Without a doubt, if you're not thinking about these three, you would run into
trouble with any initiative, especially a Gen AI initiative.
And then when I think about other fundamentals to a Gen AI initiative,
there are probably three things that I would tell people to be comfortable with.
Machine learning and data science.
You should be grounded in the basics and fundamentals in both of these.

(16:42):
I would advise people to learn programming.
Python, R, and these languages are very easy to learn.
I grew up, you know, I'm going to date myself a little bit. Like we studied C and C++ in college.
And the amount of debugging we did, the older listeners are, you know, commiserating with me.
These new languages you can honestly learn in a weekend. It's not very, very hard.

(17:03):
And then the other sort of concept or skill that I tell people to develop is
be comfortable with math.
That's a little bit harder for some. But these algorithms rely on some foundational
understanding of probability, statistics, some linear algebra.
I think those data disciplines are foundational.

(17:24):
And then really thinking about these other things and grounding yourself in
some basic knowledge is essential.
We can't talk about Gen AI without talking about all those other things.
Grounding yourself and getting experience in those is essential. And depending on your
level, there's a certain amount of effort or granularity you'd want to get into with those.
But even executives like senior leaders, I would say the same thing to ground

(17:46):
yourself in what it takes to write a Python script, how to leverage those libraries,
have an understanding of probability and statistics.
Those, I think, are the foundational sort of concepts.
I'm sure others would have so many other things, but those are the things that
come to my mind. Yeah, I would agree with that list.
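For anyone who wants a concrete taste of the fundamentals Junaid just listed, here is a minimal Python sketch using only the standard library; the quarterly default-rate figures are made up purely for illustration.

```python
# A small taste of the fundamentals mentioned above: a short Python script,
# a standard-library statistics call, and a basic bit of statistical reasoning.
# The loan default rates below are invented for illustration only.

import statistics

default_rates = [0.021, 0.018, 0.025, 0.019, 0.045, 0.022]  # hypothetical quarterly rates

mean_rate = statistics.mean(default_rates)
stdev_rate = statistics.stdev(default_rates)

# Flag any quarter more than one standard deviation above the mean,
# the kind of simple check a consumer of a model's output should be able to follow.
outliers = [r for r in default_rates if r > mean_rate + stdev_rate]

print(f"mean={mean_rate:.4f}, stdev={stdev_rate:.4f}, flagged={outliers}")
```

Nothing here goes beyond introductory statistics, which is roughly the level of grounding being argued for, whether you are hands-on keys or an executive reading the output.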
I also learned C++ in college.

(18:07):
And after that class, I never looked at it again. But I'm jealous of the younger
generation because of the amount of hours I probably spent looking for that rogue semicolon.
And right now you can go into Hugging Face and just copy or put it in there.
And it'll be like, oh, you're missing this here. And you're not.

(18:28):
Yeah, I still have PTSD whenever I have to use a semicolon in a regular session.
It was always the worst, right? You spend like literally 15, 20 minutes looking for it. Oh, yeah.
And then your colleague will find it in two seconds, right? It's just like, can you look at this?
And like two seconds later, they're pointing right at the line.
The new generation definitely has it so much better.

(18:50):
The amount of tools and technology and platforms that they have at their disposal
and free, the best part of it.
The amount of technology at our disposal today is just mind-blowing.
I also have another element to add to your list.
You mentioned executives being familiar with coding languages to some extent,

(19:12):
just to understand the efforts that it takes for a team to facilitate the solution of a problem.
I think that's really important. I would flip that on its head as well with
the teams who are developing, because oftentimes those teams are making decisions
in terms of what questions to ask of the data to find answers.
And they're lacking those econometric concepts that they need to be aware of to

(19:33):
ensure they're asking the right questions in their analysis in order to find
the right statistical model to go after.
Just because you have an answer to a question, it doesn't mean you're asking
the right question. That's a very, very good point.
And you're right. I think that there is still so much learning to be done at every level.
The people who are actually working hands-on keys probably have the biggest challenge.

(19:56):
And I would say for any data person, one of the challenges that I have isn't
those guys, isn't the team that has hands-on keys.
I feel like they're invested and on the journey, right?
It's the executive sponsorship, to be honest. And I feel like getting executives
and leaders who don't have a history in data on board is, to me, an age-old problem.

(20:20):
And I heard somebody say like the way that executives today have to have some
foundational understanding of finance, like they have to know earnings per share.
They have to know EBITDA. There are many of these CEOs, they're all grounded on certain mechanics.
I heard somebody say that, to be successful CEOs,
they're going to have to be equally grounded in the information

(20:40):
age when it comes to data literacy.
Yeah, I would agree with that. I would agree with that. I was at board readiness
events and there were a number of current board of directors and advisory board
members, individuals who are currently on multiple boards, including some public boards.
And one of the things that they highlighted, which I could not believe,
was the fact that most board members are actually illiterate in terms of digital

(21:05):
and data and technology concepts.
They're all over cyber because they have to be, right? That's part of risk.
But they're unable to think about, is the organization doing the right things
in order to ensure they're ahead of the curve, getting access to data or ensuring
they're getting value from their data? They're just not there yet.
And further, the CEO, they may delegate out to a CIO.

(21:27):
But how do you find the right C-level
executives for your team if you're unaware of what good looks like?
Yeah, you know, I'm hopeful that that change is coming and it's coming quickly.
The CDO office didn't exist, right?
Not in the way it does today. It was virtually non-existent 15 years ago.
And it really started catching on probably 10 years ago.

(21:50):
In the last five years is where you've seen probably a spike.
I haven't seen any statistics on this, but I'd imagine it'd be like a hockey
stick curve, where for most of the last 10 years you'd see this sort of low number of CDOs.
And so that's probably the first
phase of this, which is the advent of the office and the role of CDO.
And then I think what I've seen is that there's been often a misalignment of where the CDO sits.

(22:13):
And I think that's still a debate. We could probably do another podcast on that.
And so that's a whole other thing. So at least they exist.
And then I think the other challenge has been, what does the CDO do?
The average tenure in the CDO role is 24 months, some 22, some 26 months, right?
And so just as you're getting settled, there's a shift.

(22:33):
There's a lot of that that comes into play. I have the personal view that a
CDO should be a direct report to the CEO.
That's often not the case. If you can bring that CDO closer to the CEO and peers
with everybody else, that's how you would accelerate the data literacy objective
that exists at the C-suite. Right.
I agree with you wholeheartedly on that. I like to think of it as you're either

(22:56):
a capital-C CDO or you're not.
And the capital C is when you're reporting to the CEO.
And you have a seat at the table. You know, non-capital C is when you're in
a division or you're part of a business unit, because they have those as well.
I've seen a lot of organizations where I have the CDO of this part of our region or our division.

(23:20):
They really are in those positions. They don't own the technology stack.
They don't own the engineering team. They sometimes don't even own the
product side of it. And you're literally just brokering conversation and ideas
for strategy without any of the ability to actually execute on it.
That's a good point. And like, it begs the question, where should AI initiatives sit?
Should they sit with a CDO or should they sit with a CIO or a CTO, right?

(23:45):
Because it's actually a technology, but it's producing, Gen AI is a content
producer, so it's producing data and it requires data.
I've started seeing chief AI officer roles too, right? Right.
And maybe that's the answer. I'm not quite sure.
It's another one of those questions where I don't know that there's a clear
answer. Yeah, I could debate that all day.
I think it just depends, right? It depends on how the organization defines those two roles.

(24:11):
And if both roles are in charge of leveraging data to push business strategy
in the right direction, it's the same role.
I think that's a valid argument. That's the only way I look at it. It's the same role to me.
You basically undercut your chief data officer the moment you hire a chief AI
officer, because the AI officer is going to create that value within the workstream,

(24:34):
within the operations, within the business.
And a chief data officer then becomes a governance body immediately.
To me, it just depends on what you're looking to achieve with that role.
And maybe that's the evolution, the CDAO title, where the A, the analytics,
is then replaced with AI, perhaps. That's where we're headed.

(24:55):
I think it's going to be interesting to look back at this conversation in 24
months and see where the answers are, right?
And say, where is the industry headed?
That's one of the reasons I like talking about this topic, about how just new
it is and how we don't know the answers. I think it's one of the most exciting aspects of it.
Yeah, well, it might be actually interesting to reflect back on this conversation.

(25:17):
In about two months, I have an upcoming talk with a friend and a thought leader
in data and analytics in July, where we talk about embedding data and analytics
within an organization's data culture.
And really it comes down to the responsibility of the chief data officer.
Yet we haven't really seen our CDOs being as effective as we hope.

(25:40):
And now where does AI actually sit?
And our intention is to actually start solutioning some of the challenges that we've seen,
with a group of CDOs as part of this discussion. I'll share back what we learn later in the summer.
That would be interesting. You know, you got me thinking about something,
Anjali. So, you know, Sandy and I talked about data democratization a few months ago now, I think.

(26:02):
You just made me think of the concept of AI democratization.
You know, I'm a big proponent of data democratization. I think,
you know, there are noted challenges around governance and security and access.
And somebody asked us the question on our panel, other than data governance,
what's the best way to ensure appropriate use of data and data democratization?
And I said, I know you don't want to hear that the answer is data governance,

(26:23):
but the answer is data governance is what I said.
And I think that you want to put data out there. I'm of the belief that we
should view any institution as a place where everybody's a data person.
If you're an accountant and you're producing data, consuming data,
and you're producing an earnings per share report, you're a data person.
And so I have the belief that you should democratize data. Now the question
is, do we democratize AI?

(26:44):
And do we put AI in the hands of everybody? I'm convinced it's on the doorstep.
At the end of the day, look at enterprise applications,
at every single enterprise application company in the world.
There is no enterprise application company in the market today that isn't doing this,
and if unfortunately they're not, they won't be in the market for long.

(27:05):
But they are all infusing Gen AI capabilities into their platforms.
I haven't seen a demo in the last six months that does not include some aspect of that.
And if you work for a company and they're not doing this, find another job.
I'm telling you this right now, because they're going to be left behind.
Everyone has it. And if I think about AI governance within an enterprise,

(27:27):
one of the tasks that they have is understanding how that Gen AI is used within
an enterprise application that we now license or are looking to license, and ensuring
that it's not going to lead us down a bad path, right?
I mean, if you look at just some of the things that came up with HR functions,
for example, that was an issue. It was an issue on performance management, helping people figure

(27:49):
out who to promote, those kinds of things.
They were all issue areas that I think have evolved since those issues came
up, I don't know, it was a year ago or so.
But it just kind of shows every enterprise application is moving in that direction
because they want to make the use of the application as frictionless as possible
and user-friendly as possible.
And there's nothing more user-friendly than a chatbot. So that's where they're headed.

(28:10):
I mean, even Copilot on Microsoft, they're throwing it onto your desktop to
the point where I don't have to go navigate to a file.
I could just say, give me X, Y, Z, and a file opens up on my Microsoft computer,
right? So I have an Apple.
But anyway, you get my point? I think that's where we're headed.
I also have seen, we're going to have another episode where we talk about the

(28:30):
Snowflake conference, but I'll just add this one quick little thing I saw at
the conference, which was they were releasing all these capabilities.
And at one point they said, everyone should be able to build a chatbot.
And then he kept repeating it. And then he said, we're going to grab an audience member.
We're going to have them come up and build a chatbot. And in five minutes,
this person who isn't a coder, and I believed her, she was sitting somewhere

(28:52):
in the middle of the room, they called her name, she came up,
she did what they told her, she hit the wrong button a few times,
they messed up the demo for a second, but they built a chatbot off of PDFs.
And it took her five minutes, and it worked.
Pretty enticing demo. And I'm sitting there going, well, if anybody can do that,
imagine the possibilities of the ability to just really ensure that all the

(29:15):
knowledge within an organization is readily available to everybody within the organization.
That's amazing, right? And it adds to the excitement of what's out there, what the potential is.
It sounds amazing. And I think you're right.
Institutions that don't embrace this will likely get left behind.
Even if it's a measured approach, I think it'd be advisable to get grounded in the topic.

(29:38):
Absolutely. I avoided it when it first happened as well. Because like you,
we've all been burned by these ideas, right?
When I graduated from college, the dot-com bubble burst and took out my first job.
I was laid off because those jobs were gone overnight.
The market failed and all these startups turned into nothings, right?
I was part of a unicorn company and

(30:00):
that turned into, I think my stock was worth $2 by the time I was gone.
So it felt very much like that.
And I think we will see that wave of companies getting created,
entering semi-unicorn status, and then being thrown out the door because maybe
they were trying to build something that should be part of a different application, et cetera.

(30:21):
So that's definitely going to happen, I think, in this wave.
But what happened to me initially was, oh, here we go, another interesting flash in the pan thing.
Even Gartner was like, this is a shiny object a year ago, right?
And now we're all over it. I would say I have a slightly different view of
what will happen to all these tech companies.
My view is that I'm with you, that they sprout up and they create a bunch of very niche solutions.

(30:48):
But what I've observed is that they won't necessarily go out of business the
way that dot-coms did overnight.
I see them just getting assimilated, like being acquired by other larger companies.
And then I think that you'll see consolidation.
Sometimes I think that's why they start these companies: the plan is to get acquired.
Right. And some players in the marketplace, some cloud providers out there,

(31:09):
I'll keep them nameless for now, but they are playing smart in terms of enabling
a platform that people can build on. I've seen it in a number of instances.
Salesforce has always been one of those players, as an example.
They've always kind of been in this marketplace of build for Salesforce,
build an app on top of Salesforce.
And anybody who has a startup, and I'm not saying Salesforce specifically,

(31:34):
but they should look at the marketplace and say, should I make a bet on a provider
on whether it's AWS, Microsoft, Salesforce, Snowflake, Databricks, whatever it may be.
Should I make a bet on a provider and build my application to work specifically
for that platform so that it's already inherently integrated into that platform?
Because that provider may gobble you up. That's your exit strategy,

(31:56):
right? I'm going to make it so compelling for them. And I'm going to sell to
every single one of their clients.
And I'm going to ensure that I make a splash every single year with them from
a revenue standpoint so that they can't avoid me.
And at some point, they're going to acquire you, especially if it's on their
platform, it's fully integrated, it's kind of native to their ecosystem.
I think that's an easy play for them to say, hey, I'm going to go ahead and

(32:19):
acquire you and relieve you of this.
I'm with you. What would be your advice to someone who's either mid-career,
kind of later in their career, just trying to keep up with the madness?
You know, one piece of advice that I've given recently is, you know, I think that in
the last 10 years, there's been sort of these nuanced roles that people fit into.

(32:39):
You know, I'm of the belief that versatility is a key to success.
Having this skill to work across data disciplines is the advice that I would
give people coming up. I'll give like a basketball analogy.
So in basketball, you have these positions, you have a point guard and you have a center, right?
And historically very different, right? In terms of size and style of play.

(33:03):
But the most exciting basketball, and the teams I see as most
successful, is where there is this concept of positionless players,
people who can play at any position at any given time.
I think the future of like data professionals is someone who is so versatile
and can be sort of positionless and have enough skill working across all these

(33:26):
disciplines. That is the advice that I would give.
The other piece of advice is to expect continuous learning.
And if you aren't going to embrace it, this isn't for you.
I wholeheartedly agree with you primarily because I see myself as someone who
started in left field and ended up going through all these iterations.
My career was not a straight line.
It was a bunch of loop-de-loops to the point where the way I look at it is you

(33:53):
cannot influence change without understanding what is required of the other person's role.
So how are you going to influence them to primarily do the right thing if you
don't have enough understanding to push them in that direction,
to get what you need done?
I think that's going to be critical. And the only way to get there is to have
the knowledge of of how those individuals have to work and what they're trying

(34:15):
to achieve and what they're doing. Is that the right thing?
Or is there something else and you should be pushing them in the right direction?
Yeah, fair enough. I will say that I think I personally prefer a career that
was like a loop-de-loop versus a straight line as well.
So I think you're fortunate. And I think it probably adds to your skill set.
I'm very fortunate, yes.

(34:36):
I didn't appreciate it until later in life. So we completely went off script on this.
For those listeners out there, we were actually going to talk about how Gen
AI can solve data quality issues.
I still think it's an incredible topic. Yeah, I do think Gen AI is prime for
solving data quality issues.
You know, I'll say this, you know, I think I shared with you,

(34:57):
I started my career as an accountant. My first job was
as an intern reconciling accounting entries for inter-entity transactions.
What looked like an accounting problem was really a data problem. And the data problems
were missing data, format of data, just anomalies around data.
And if you think about it, Gen AI is perfect for these kinds of problems to

(35:19):
solve when you think about data cleansing.
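To ground the kind of cleansing problem Junaid is describing, here is a minimal sketch of a data quality pass over accounting-style records: rule-based checks flag missing values and format anomalies, and a deliberately hypothetical stub marks where a Gen AI model could be asked to propose a correction. The records, field names, and helper are all invented for the example.

```python
# Sketch of a data quality pass over accounting-style records: flag missing
# values and date-format anomalies, then hand the flagged rows to a
# (hypothetical) Gen AI helper for a suggested fix. All data here is invented.

import re

RECORDS = [
    {"entity": "FCB-001", "counterparty": "CIT-014", "amount": "1,250.00", "date": "2024-03-31"},
    {"entity": "FCB-001", "counterparty": "",        "amount": "980.50",   "date": "03/31/2024"},
    {"entity": "FCB-002", "counterparty": "SVB-207", "amount": None,       "date": "2024-03-31"},
]

DATE_FMT = re.compile(r"^\d{4}-\d{2}-\d{2}$")  # expected ISO date format

def find_issues(record: dict) -> list[str]:
    """Return a list of rule-based data quality issues for one record."""
    issues = []
    for field, value in record.items():
        if value in (None, ""):
            issues.append(f"missing {field}")
    if record.get("date") and not DATE_FMT.match(record["date"]):
        issues.append("date not in YYYY-MM-DD format")
    return issues

def suggest_fix_with_genai(record: dict, issues: list[str]) -> str:
    # Hypothetical stub: in practice this would prompt a Gen AI model with the
    # record and its detected issues and return the model's proposed correction.
    return f"Gen AI suggestion requested for: {issues}"

for rec in RECORDS:
    problems = find_issues(rec)
    if problems:
        print(rec["entity"], "->", suggest_fix_with_genai(rec, problems))
```

The division of labor is the point: deterministic rules find the anomalies, and the generative model is only asked to suggest fixes for the rows that were flagged, which keeps the results reviewable.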
But yeah, maybe another podcast hyper-focused on data quality and Gen AI,
data governance and Gen AI.
Yeah, absolutely. I would love it. I went in last week with that in mind,
and I actually ran around the conference floor looking at companies who are
solving data quality issues with ML, AI, Gen AI capabilities.

(35:43):
And there are many, many out there.
It was eye-opening to see that because it's addressing an old problem in new
and novel ways that removes some of the headaches of how to solve the problem.
Definitely interesting conversation to be had.
Thank you so much for having me. It was so wonderful just chatting without boundaries.
Yeah, as advertised, I told you that when we first chatted about the episode,

(36:04):
I said, hey, we're going to have a construct.
But if we go in another direction, we'll just take it there if the conversation is good.
And it was. So thank you again for your time, Junaid. And I hope you have a
fabulous weekend. Thank you, Sandy. Thank you, Anjali. Appreciate it.
Music.