
July 26, 2022 16 mins

In this episode, Sara Hanks (Senior Director Program Management) and Andreas Welsch discuss the top 3 learnings for new AI leaders. Sara shares learnings on becoming an AI program lead and provides valuable advice for listeners looking to move into a similar role in their business. 

Key topics: 
- Learn what to expect as an AI lead
- Find out the top challenges in that role
- Hear how to apply it to a new AI project 

Listen to the full episode to hear how you can:
- Identify early adopters and keep them engaged
- Think like a marketer and identify messages for your stakeholders
- Learn to prioritize which AI use cases to pursue

Watch this episode on YouTube: https://youtu.be/Z6J69rXGmmQ

Questions or suggestions? Send me a Text Message.

Support the show

***********
Disclaimer: Views are the participants’ own and do not represent those of any participant’s past, present, or future employers. Participation in this event is independent of any potential business relationship (past, present, or future) between the participants or between their employers.


Level up your AI Leadership game with the AI Leadership Handbook:
https://www.aileadershiphandbook.com

More details:
https://www.intelligence-briefing.com
All episodes:
https://www.intelligence-briefing.com/podcast
Get a weekly thought-provoking post in your inbox:
https://www.intelligence-briefing.com/newsletter


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Andreas Welsch (00:00):
Today we'll talk about the top three learnings to
help new AI leads have a successful start.
And who better to talk about it than someone who's
actually been in that role:
Sara Hanks.
Hey Sara, thanks for joining.

Sara Hanks (00:15):
It's great to be here, Andreas.

Andreas Welsch (00:17):
That's awesome.
I'm so glad we get to do this.
I know we've talked for a little while and so I'm really excited.
But hey, why don't you tell us a little bit about yourself?

Sara Hanks (00:27):
Yeah, sure.
So my name is Sara Hanks.
I currently work as a senior director of project management.
And if I were to break my career up, I would do it into two sections.
The first half of my career was really focused more on mechanical and manufacturing engineering.

(00:48):
But then for the last decade, I've really been a translator
between the business and technology.
And for a little over five years of that, I was a program manager in
our data lake organization.
And then I was also the senior director of data analytics for a
number of years.

Andreas Welsch (01:08):
That's awesome.
So it sounds like you've really seen quite a few things when it
comes to data and AI.
So that's awesome.
To our folks in the audience, if you're just joining the stream,
drop a comment in the chat.
What do you wish you knew when you got started in AI and
automation?
And if you're just starting, that applies as well, what you

(01:32):
would like to know.
But Sara, should we play a little game to kick things off?

Sara Hanks (01:37):
That sounds great.

Andreas Welsch (01:39):
Perfect.
So you see, I don't have my usual setup here.
But maybe let's still play a game that I like to call fill
in the blank.
Okay.
The way I usually run this is that I'll start a sentence.
I'd like you to complete it with the first thing that comes to
mind, and why.
Okay.
Fill in the blank.
And so to make it a little more interesting, I'll only give you

(02:01):
60 seconds for your answer.
And again, for those of you watching us live, drop your
answer in the chat as well, and why.
Sara, are you ready for What's the BUZZ?
Yes, I am.
That's awesome.
Great.
Let's see.
Data is just dot, dot, dot.

(02:24):
fill in the blank.
60 seconds.
Go.

Sara Hanks (02:27):
Data is just one of the most fundamental things that
you can do to solve any problem.
Quick sentence.

Andreas Welsch (02:39):
I have a feeling I need to tweak my questions.
Huh?
They're way under 60 seconds.

Sara Hanks (02:45):
To the point. No, I think anybody that knows me
understands that I have a significant passion for data.
Even back in my early days when I was in manufacturing as a
quality engineer, I was highly dependent on data and getting it
out of the systems to be able to solve problems.
And I think for me that's really where the benefit of data comes

(03:09):
from: it cuts through the opinions.
It can show you that things people might have perceived to
be true are not so correct.
And it helps justify opinions where you do believe, or you have
that intuition or that hunch, right?
It can back those things up as well.

Andreas Welsch (03:28):
So that's awesome.
Yeah.
I'm just looking at the chat here.
So from "data is problem-solving stuff" to what Aamir was
saying, that it can be a pain.
So I think we've all been there in different variations, right?
Working with data, with all the ups and downs and challenges.

(03:55):
Yeah.
But maybe let's pivot a little bit.
So I know you've been in this AI lead, program lead, even CoE
lead role before, and I'm sure many in the audience are eager
to learn from you and hear what's the number one
advice you would give to anyone that wants to move into that

(04:15):
kind of a role.
I'm really curious, what do you wish you knew when you started
in that role?

Sara Hanks (04:22):
So my number one advice for anybody moving into a
program management or an AI CoE leadership role is really to
identify those early adopters, but then also to keep
them engaged throughout.
An example for me: in 2015, it was the first time I was a data

(04:42):
lake program manager.
I had a small team of data scientists and product owners,
and one of our first use cases was around the commercial test of a
big engine. The engine would get a whole bunch of sensors,
and then each sensor was evaluated.
And if it was outside of its spec limit, the engine was

(05:03):
failed.
It was put off to the side for engineering to review the data
later.
And what we wanted to do was not replace that
completely, but to speed up the process by using machine
learning to create a recommendation for the engineer and to deliver that
recommendation in real time, so they didn't have to take the engine
out and it would preserve the ability to ship the

(05:27):
engine on time.
You would actually be able to get through that process faster.
And we had a ton of enthusiasm and early adopters out of the
gate, but one of the things that we failed to do along the way
was to keep the team engaged in decisions along the way.
We encountered a number of
technical hurdles, and throughout that process we should have been

(05:52):
pulsing our stakeholders and understanding: is this
hurdle significant to the point where maybe it's offsetting the
overall benefit of the project?
And unfortunately, by the time we got through everything, the
team had changed.
They were less enthusiastic.
Unfortunately, the project didn't get adopted like we had
originally hoped from the beginning.

(06:15):
And then the second part of your question is, what did I wish I
knew going in?
I think being a leader, you need to think like a marketer.
And what I mean by that is really understanding who your
stakeholders are and how you would define their persona.
How do they need to consume information?
And then how do you craft your messaging on what the project,

(06:38):
what AI is doing.
And then also, how does it need to be delivered so that it's
done in a way that they relate to it, but also focused on
things that are relevant to them?

Andreas Welsch (06:52):
So focus on early adopters, but also make
sure that you keep them engaged and that it's relevant.
Yeah, I remember seeing some of these things that you mentioned
in my previous roles as well.
It's one thing to get the project started, it's another
one to keep that flame burning or even growing.

(07:12):
Because cycles tend to be relatively long, since it's not
as straightforward a project.
Yeah.

Sara Hanks (07:19):
And people want early results and it's not
always reality.

Andreas Welsch (07:24):
So that's where I think these examples help, like the one you
mentioned around the engine, where it's tangible, right?
It adds business value.
It is something from real life.
It's not just something that somebody's dreamed up.
But there is a lot of opportunity there if you go

(07:44):
after these examples.
So I'm taking a look at the chat real quick.
Aamir asked, do you think ML/AI awareness is much higher
compared to, say, 10 years ago?
What do you think, in the enterprise or in business, from
your perspective?

Sara Hanks (08:04):
So I think if I were to go back 10 years, there
wasn't a whole lot of awareness of AI and ML at that point in
time.
I would say between 2015 and 2017, there was a lot of buzz
around AI and ML.
I think a lot of false hope too, and maybe some disappointment,

(08:27):
which I think then took us to a period of almost scar
tissue, right?
Like people were afraid of AI and ML, and I think it's starting
to make a good comeback, and I've seen
leaders that were not necessarily early adopters start
to incorporate that into their thinking and start to want to

(08:48):
explore that as far as their business strategies go.
That's just based on my observation.

Andreas Welsch (08:57):
It sounds boring, but I must agree with
you.
I'm seeing the same thing and, much like you, having been
through that hype cycle, I feel it's quite encouraging
now to see that it is picking up, that we are talking
about realistic and real examples.
And that we have moved beyond just the hopes and dreams.

(09:18):
So maybe then moving on to the next question.
I think you've touched on challenges.
If there's anything else you wanna add, maybe that might be a
good area, otherwise we can move on to what we had talked about
as our third question.

(09:39):
Anything else on challenges that comes to mind?

Sara Hanks (09:41):
I guess one other common challenge that I'd
like to hit on is that sometimes it's difficult when you're moving
into a new role to really prioritize those use cases and
also to identify when you need to either pivot or
stop.
And in the beginning, when it comes to
prioritization, I think it's really about understanding what's the problem

(10:04):
that you're trying to solve and then what does the data look like.
Because I think you can probably eliminate some of those early
requests if the data's not sufficient or it's not
clean, or you're gonna have to invest a significant amount of
time into making the data into a useful state.

(10:25):
I think it's about knowing what that looks like upfront and prioritizing
accordingly. The need to prioritize is a challenge, and
the way to overcome it would be to really evaluate the data.

Andreas Welsch (10:39):
Any suggestions for how somebody
coming new into the role could spot that?
Again, I remember it's not that black and white,
right?
You slide into it, and then you have a use case, and as you
untangle it and start working on it, you are likely to come across
these issues or challenges, but any red flags that come

(11:02):
to mind?

Sara Hanks (11:03):
So I think the first thing is to depend on your team.
Chances are, if you're coming new into a role, there'll be
somebody on the team that's got a data science background or
experience in the industry that you can certainly lean on.
But then, for red flags, I think one of the things that I've seen

(11:24):
is a lot of requests for very specific hypothesis
testing.
Is this specific thing the reason why this is trending?
And that is something that you can really look at with basic
statistics, as opposed to asking somebody to look at it from a
machine learning perspective, because you might be

(11:47):
overthinking and putting too much energy into the analysis itself.

Andreas Welsch (11:55):
That makes a lot of sense.
Going with some simpler ways, but definitely having an expert
make that assessment and give you that feedback.
Awesome.
So I know you've mentioned a bunch of things to watch out
for and some of the challenges.
But I'm also curious to hear how you have seen all of this

(12:16):
come together, maybe in a great example of a use case that the
team and yourself have worked on?

Sara Hanks (12:23):
If I was to really highlight where I've seen
all three of these in a successful project right out of
the gate, I have to give a shout-out to my colleague Milan.
He had a team that was similar to mine, but they were more
focused on research and development within engineering

(12:44):
and trying to show or prove to the business that AI could be valuable.
And I think he got this formula right on his first major AI
project.
It was a quality inspection using image recognition, and the
decision that the AI needed to make was framed in a way that it didn't

(13:06):
require tons and tons of labeled images to be able to build the
algorithm.
He was fantastic at picking a problem statement that was
relevant.
This was an issue that was 10-plus years old.
And then continuing to market the value to the different
stakeholders.
And then in terms of early adoption and keeping people

(13:27):
engaged, he had the right rhythm set up right out of the gate.
And I just want to give credit to that, because I think he really
did a nice job of pulling all three of those together within a
single use case.

Andreas Welsch (13:41):
Awesome.
Thank you.
So thank you so much for sharing.
I know we're coming up on time; what would be the three key
things you would want our audience to take away from
today's session?
What are the three key learnings?
Just in summary.

Sara Hanks (14:00):
The three key learnings for me, for someone
moving into an AI leadership role, would be: first, identify who
those early adopters are and keep them engaged.
Second, think like a marketer and identify messages that are
relevant to your individual stakeholders and meet them on

(14:22):
their playing field.
And the third is just learning how to prioritize which use
cases to go after and when to pivot or stop them if they're
not working.

Andreas Welsch (14:35):
That's awesome.
Thank you so much for summarizing.
I know we're getting close to the end of the show, so I'm
really excited that we had the opportunity to do this together
and that you were able to come on the show.
Quite frankly, I haven't found too many people in similar
roles, so that's why it's made it even more exciting for me

(14:56):
personally, and I hope also even more valuable for those of you
in the audience, to hear from you what you have to share:
key learnings, things to look out for, and how to set yourself up for
success.

Sara Hanks (15:11):
Yeah, I appreciate the conversation today and thank
you for inviting me to be on.

Andreas Welsch (15:17):
Awesome.