
September 11, 2025 46 mins

When is the right time to hire an AI leader, and what do they actually do once they’re in the door? In this episode, Galen Low sits down with Tim Fisher, VP of AI at Black & White Zebra, to unpack the real-world impact of AI leadership roles. Together they explore the tension between hype and practicality, the mix of skills needed to bridge tech, business, and people, and why AI leadership is less about flashy experiments and more about building trust, change readiness, and operational maturity.

Tim shares candid insights from his own path into the role, offering a grounded look at how organizations can approach AI without losing sight of their business goals—or their people. If you’ve ever wondered whether a VP of AI is a made-up job, or how AI leadership can actually smooth project delivery, this conversation is for you.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Galen Low (00:00):
What the heck does a VP of AI do?

Tim Fisher (00:02):
This title, I think, can mean a lot of things.
This role should be practical.
It should not be flashy. Like, I have business results in my head.
This like "do cool stuff with AI," which I think people think is like the job description. I don't think that's valuable to a CEO or a CFO.

Galen Low (00:18):
When is the right time or wrong time for an organization to hire an AI specialist like yourself into the leadership team?

Tim Fisher (00:25):
I think it might be too soon if leadership is still debating whether AI is a fad, if there's no data infrastructure, or if your culture punishes experimentation.

Galen Low (00:35):
What's the biggest challenge that you see in front of you?

Tim Fisher (00:38):
The biggest challenge is creating a culture of change, a culture of failing fast, and a culture of comfort with ambiguity.
Implementing AI in an organization is way more of a human thing than it is a technology thing.

Galen Low (00:57):
Welcome to The Digital Project Manager podcast — the show that helps delivery leaders work smarter, deliver faster, and lead better in the age of AI.
I'm Galen, and every week we dive into real-world strategies, new tools, proven frameworks, and the occasional war story from the project front lines.
Whether you're steering massive transformation projects, wrangling AI workflows, or just trying to keep the chaos under control, you're in the right place.

(01:20):
Let's get into it.
Today we are talking about AI leadership roles within digital-first organizations and how hiring an AI executive can impact the way projects and operations run — for better and for worse.
With me today is Tim Fisher, the new VP of AI at The Digital Project Manager's parent company, Black & White Zebra.
Tim is a deeply experienced media tech executive,

(01:43):
entrepreneur, data science geek, and hack engineer, hailing from companies like People Inc. (formerly Dotdash Meredith) and retail giant Target.
But more than any of those things, Tim is someone who likes to solve problems using people, technology, and a little bit of cleverness.
His role as VP of AI at Black & White Zebra is a new role within our media organization.

(02:03):
So we're gonna put Tim a bit on the hot seat and unpack what his role is all about and what other pieces need to be in place before an organization can unlock the value of this kind of role.
Tim, thanks for being here with me today.

Tim Fisher (02:15):
Thank you very much, Galen.

Galen Low (02:17):
And welcome aboard.
So, yeah, you know, a full disclosure: Tim and I work together.
He's a new hire, and as part of our hazing ritual, I dragged him onto the podcast.
But I thought it might be a really good lens, because there's a lot of organizations out there right now looking for AI leadership to drive their AI strategy, or create their AI strategy, or unwind and untangle their AI strategy.
And I think there's a lot of sort of murkiness

(02:37):
around what that means.
Some skepticism, some optimism, some zealots.
And I thought maybe we could sort of unpack it today.
I joke about the hazing thing, but as soon as we got on a call, I knew I needed to have you on the show, because you've admitted to me that you are as nerdy as I'd hoped you'd be.
And I know that my listeners want to hear the raw, sort of ego-free lowdown on the impact that an AI leader

(03:01):
can have on an organization.
And so far, you and I are pretty hit and miss when it comes to following an agenda versus just like going down interesting rabbit holes.
But here's the tentative roadmap that I've sketched out for us today.
To start us off, I wanted to get like one big burning question out of the way: that difficult, direct, and maybe even insulting question that everyone wants to know the answer to.
But then I'd like to zoom out from that.

(03:22):
I wanna talk about three things.
First, I wanted to talk about the skills that someone stepping into an AI exec or leadership role needs to have, as well as like when the right time is to hire for that role.
Then I'd like to talk about the impact of AI leadership and what that looks like for folks in the trenches who are like delivering projects and tackling day-to-day operations.
And then lastly, I'd like to zero in on the future, just

(03:44):
the future of AI-enhanced ways of working, what that looks like as it pertains to projects and just general operations, and maybe even what we need to be careful about.

Tim Fisher (03:54):
That sounds wonderful.

Galen Low (03:55):
So as I mentioned at the top, there's been a lot of talk in my project management communities recently about like the need for AI experts to be at the leadership table, in executive roles, in senior leadership roles.
But there's also a lot of people who think that like a chief AI officer or any kind of AI leadership role is actually not a real job, or like not yet anyways.
They say that it's a bit cart before the horse.

(04:16):
It's that hiring move that's maybe like a little bit more flashy than it is practical.
So I thought I'd be direct about it and ask you: what the heck does a VP of AI do?
What does that mean for digital project leaders, and how does that translate into AI-enhanced ways of working throughout an organization?

Tim Fisher (04:32):
Well, first of all, I'm not sure I've heard this "not a real job" thing.
So maybe this is just you, and this is part of the hazing thing, but.

Galen Low (04:39):
You come from a long line of AI executives.

Tim Fisher (04:43):
No, I'm kidding.
I've certainly heard that before, and I'll be honest: even to me, I think VP of AI sounds a little made up.
And I'll be honest, at my last job, I actually did make up the title that I had.
So, but look, this title, I think, can mean a lot of things.
It can be really tech-focused, it can be really product-focused, it can be business-focused, or it can be a hybrid of all of those, which is

(05:03):
how I would describe my role.
And this whole like cart-before-the-horse thing.
Look, this comes up all the time.
The thing is, this is new. Like, there's no AI process yet.
There's not an employee rubric for what an AI title means or where it sits in the organization.
There's no definition of the role that everybody's agreed upon. But just as with everything with AI, you have to start somewhere, and absolutely some organizations are gonna

(05:27):
get this wrong at first, and that's totally okay.
And like you said, this role should be practical.
It should not be flashy.
Like, I have business results in my head, like real business results, here at BWZ and also previously at People Inc.
You know, this like "do cool stuff with AI," which I think people think is the job description (I have a real job description), I don't think that's valuable to a CEO or a CFO.

(05:49):
And I'm sure that's like driving a lot of the eye rolling and like, what is this job for?
I think for a real, practical AI role, the goal of a role like that in leadership is that the person's tasked with building capabilities, starting something. That's really the job, and it's focused on real business results.
You know, my responsibilities, I have a lot of responsibilities: education up and down the org

(06:12):
chart, automation strategy, operational rigor where there currently isn't any around AI, workflow consulting throughout all the different departments of a company, change management, legal and risk management.
There's just, you know, it goes on and on.
There's a lot to this, and the leadership role can be really helpful to help bring all that together.
It depends on the size of the organization and the

(06:33):
organization's needs and the skill sets of the hire.
You know, it might be like building a team like I did at People Inc.
Or it might be like dotted-line relationships to existing product teams and legal teams, you know, in a really big company, or maybe more of a solo independent contractor, or a mix of them.
The big secret here is not really a secret: like, AI in executive titles feels a little bit silly right now,

(06:55):
like, I totally admit that.
But if you scope it properly, you know, it can bring clarity and order and some light process (not too much), and real KPIs for something that otherwise feels incredibly ambiguous right now.
I think specifically for digital project leaders, and really anybody delivering anything at all, an AI executive

(07:15):
brings an opportunity to be a really close partner, to translate a bunch of chaos into really practical tools and workflows, which, for someone who's delivering something, means smoother projects.

Galen Low (07:27):
I like the framing of like change agent, and it kind of sits in between.
You're right, it's like it's nobody's role necessarily (arguably maybe a chief operating officer's), but I think there's more to it than just operations, and the way that it sort of transforms ways of working.
But also, the note I took was change leadership.

(07:48):
And what I meant by that is, you know, we were joking about a made-up title.
You actually did make up your title in your last role.
But the core of it was that it was necessary, because someone needed to take on the challenge of this change.
It happens to center around AI, but to your point, it's not just about the technology and like tinkering with it and having

(08:11):
cool ideas and flashy headlines and good PR about cool AI stuff.
It's like actually operational change.
It's change management, it's people leadership, and somewhere tucked in there is what the technology is capable of.
I really liked your inflection about the idea of, like, education, right?
Because part of that change leadership is bringing

(08:33):
people along, and also the AI stuff is moving so fast.
Any kind of emerging tech, but like I think this is pretty unprecedented in terms of what I've seen in my time.
Trying to even just keep up, that is a full-time job.
Having someone educate us about like what is current is just so much better than swimming around Reddit and Udemy and YouTube and all these things, and just hoping

(08:54):
for the best, and then bringing it all together into like ways that we can do things, actual workflows that have business impact, and like business impact that's measurable.

Tim Fisher (09:02):
I think you hit on a lot of really important points there.
I especially like the one about, I don't know exactly what you said, but around people being in different places, or bringing people along.
I think one of the struggles of a role like this is to meet people where they are and hold their hand and bring them into the future that this role maybe lives in, right?
And so people exist all along the spectrum.

(09:25):
In there, there are people who are just getting started, because they've been really scared or haven't had the opportunity or whatever it is.
Then there's people that are living, you know, in the right now, and in three and six and 12 months ahead.
And an organization, especially a larger one, can have large groups of people all along that spectrum.
And being able to meet folks where they are and do that where

(09:47):
there are so many people in so many different places is, I think, a really big challenge.

Galen Low (09:52):
Yeah.
It's like the unification, or, I mean, maybe that's even too big of a word, but even just taking people from where they are and sort of accounting for that.
Because I think a lot of the organizations, I won't get into it right now, but a lot of the organizations I see shooting from the hip are the ones we're kind of making fun of in the memes, right?
But they're like, do the AI, do it now.

(10:12):
We need the AI done yesterday.
Do it.
Please.
That's not AI leadership.
That's just like dictating four words at a time.
So I agree.
I appreciate the sortof disambiguation here.
I wondered if we could like zoom out a bit, because I've kind of been framing this as, I don't know, a change agent.
And I think maybe some folks listening might be like,

(10:34):
well, anyone can do that.
You know, that's just change management then.
Like, you just happen to have AI in your title.
But I, you know, I don't wanna like brush under the rug the sort of technology bit.
I know we've been talking about the fact that it's more than just the technology.
Sure.
But when it comes to the technology, I mean, we're talking about fast change, and some folks, you know, they'll say that like AI isn't something that anyone can even be an expert at yet.

(10:54):
It's too new.
We haven't, like, sort of, you know, figured it out yet.
We haven't run all the studies and got all the results.
And how can anyone be an expert?
So I'll challenge you a bit and, you know, maybe just ask: can you tell me a bit about your background?
What credentials or accolades or experiences make you qualified to represent AI at the leadership level?

Tim Fisher (11:14):
I think it helps that I have the physics and math background.
Super big nerd.
At some point I switched to business because I needed to pay rent, but that's definitely where, you know, sort of a lot of my excitement comes from: lifelong science nerd.
I got into like tech and systems engineering and things like that.
I had an opportunity at a previous job to create documentation around processes and to design workflows.

(11:38):
As incredibly nerdy as this sounds, I absolutely fell in love with that.
So I've spent 30 years communicating about complex ideas, and, you know, back to this like mesh of technology and people things: that's a really important part of the role.
Also, at People Inc. I built a top-10 technology site almost from scratch over two decades, where my role was

(11:59):
demystifying technology and explaining it to people who struggled to understand it.
You know, I scaled that skill across the internet.
So that was, I think, a really big set of experiences that helped me learn how to talk about this stuff in a way that is really important right now.
You know, back to our conversation about all the different places people are right now.
I've also started a few small businesses.

(12:21):
I've run about a dozen media brands, so I've been responsible for the business up, down, left and right, so I understand how it all works really well.
I think the culmination of all of that is a qualification in AI, because I have this like deep technical understanding and I also have this ability to clearly communicate to multiple different types of audiences.
So there's like this nerd layer, right, where it's like

(12:42):
I know how language models are trained and how they operate, and then there's this business layer that sits on top, where I'm able to confidently communicate and understand around like when an LLM is good for X and maybe not for Y, and how we can get from zero to one.
So did I go to some sort of AI leadership school?
No, I don't think that exists.

Galen Low (13:02):
The two-day bootcamp.

Tim Fisher (13:05):
Right.
I mean, I say this, but there's probably an AI leadership school right now that I'm not aware of.
But all of those things are spinning up as we speak.

Galen Low (13:12):
Well, it's really interesting you say that. You know, we're joking about it, but in some ways what you've described to me is sort of this rare, almost, some might say unicorn sort of trifecta.
There's the business and entrepreneurial understanding: like, you know what makes a business go; you've run businesses, you've started businesses.
There's also the sort of like geeky technology layer, right?

(13:33):
Having that deep understanding of it.
And then this like communication layer, right, of being able to explain it, almost be the translator between the two, to say, you know, technology, business, people (between the three, I guess), and like, how can we get this done?
And then you've done it, right?
And with technology, you've like scaled operations and teams and sites.
Not necessarily all AI, but AI was almost the like

(13:55):
natural progression of what you were doing already.
So it makes sense that you'd sort of make up your own title at your last job.
I'm taking this too far, but like, you know, you had mentioned like physics and math, and I immediately, not to draw comparisons between you and these individuals, but I thought of folks like Neil deGrasse Tyson on the astrophysics side, or like Hannah Fry on the mathematics side.

(14:16):
People who just kind of have this mix of qualities where they kind of get it.
They get like the sort of blend of things that they need to do to educate, to enact change, to, you know, keep people pointed in the right direction and to lead, and still understand, you know, all of the inner workings of it.
You know, now that you're saying this out loud, I'm like, gosh, this is a really difficult role to hire for

(14:38):
in some ways, you know.

Tim Fisher (14:40):
It is.
I don't know.
I'm struggling to move past the, you threw my name out there with Neil deGrasse Tyson, so we can stop right now if you'd like.

Galen Low (14:52):
Well, like, well, you know, what an amazing role he plays, and others, right?
Of just like bringing it down to a level that everyone can get on side with, but not like blurring it into lies, and not like simplifying it so much that it, you know, is too basic, but actually distilling the essence of what needs to be done.
Choosing what level, right?
We were talking about bringing people along

(15:13):
from wherever they're at.
Some people, they wanna know how the language model is trained, and you can explain that.
And some people, like, you just wanna know: am I supplying too much context to my ChatGPT? It's taking me like 25 minutes to write a prompt.
Like, is this normal?
Right?
Like, just tell me. Help me, Tim.
Right?
Like, and I think that's the job in some ways, right?
To be like, okay, well, you're here and you wanna know this, and like I can educate you at your level.

(15:35):
Is it Wired that does the videos?
It's like five levels of something, right?
Five levels of harmony, five levels of, you know, juggling or like hacky sack or something like that.
You know, you can explain it on all the different levels and still bring people along.
We joke about the "explain it to me like I'm five," but actually that might be the most helpful thing for some people who just need to get over that hump.

Tim Fisher (15:54):
Sometimes it is.

Galen Low (15:56):
I alluded to it earlier about timing.
Timing for this role.
Hiring for this role.
And I did, I came across this meme, I think it's circulating around.
It's, you know, it's CEOs, and they're like: Who are we? CEOs! What do we want? AI! When do we want it? Now! What do we want it for? And they're like, eh.
So there's this kind of sense that, like, you know, business leaders, CEOs, want AI without really understanding the

(16:18):
outcome that they want from it.
I get it.
And you get it from the sort of perspective of, like, think competitive, right?
If you're not doing it, then you're kind of falling behind.
You kind of have to be on this bandwagon.
But you know, when I look out there, some organizations just don't look ready.
Even if they're saying that they're ready, that they're doing it, they just don't look ready.
So I thought I'd ask, you know, like, when is the right time or wrong time for an organization to hire an AI

(16:42):
specialist like yourself into the leadership team?
What level of maturity is required to achieve AI-related goals?
And maybe, like, what are some examples of what those goals might be?

Tim Fisher (16:52):
I think it depends on a lot of factors.
Obviously, CEO and executive support.
Some organizations actually don't have that, which oddly often comes along with a lot of blind pressure to do something.
And the org's agility.
Most organizations aren't particularly agile.
These are not small changes.
These are large changes that are happening inside of organizations because of AI.

(17:14):
I think when conversations shift from "let's see what we can do," or "let's experiment," or "let's see what's possible," to doing this at scale, like, that's a good sign.
I also think scattered bottoms-up AI use around the organization that maybe needs some coordination and some sort of like order and some light process, I think that's another really good sign.
I think it might be too soon if leadership is still

(17:36):
debating whether AI is a fad; that is something that is still happening.
Or there's no data infrastructure. As anyone who's spent any time working with AI at all knows, the inputs are extraordinarily important, so if you don't have data that can go into an AI system, your company might not be ready.
Or if your culture punishes experimentation, or there's not a lot of tolerance for ambiguity.

(17:59):
Wow.
This is gray.
This is some really gray stuff.
It's getting better, but if you're not prepared to sort of absorb and deal with that, that's not a great sign.
I think most importantly, though, and I think we all take this for granted in the digital world that we all live in, but the digital nature of the organization: there are many organizations out there that aren't particularly digital, even in the year we're in now.

(18:20):
So if it's all about inputs and outputs, and most of your stuff is still in file cabinets, you're better off hiring somebody to come scan your documents before you create some kind of an AI strategy.
It's really important.
It's also the right time if there are teams drowning in repetitive work, if you know there's opportunity for automation, right?
And you even mentioned this a little bit ago: like, competitors winning at automation.

(18:42):
Look, if your competitors are doing something that's clearly good for their business that you're not doing, it might be time to ask yourself if we need to make some changes.
Right?
And then, of course, if your CEO is asking about AI in your strategy.
Look, it gets to a point sometimes where waiting is the bigger risk than jumping in.
But yeah, that's how I would summarize all of that.
You also mentioned goals.

(19:03):
I think goals are really interesting.
I think, unless you're an AI startup or a foundational model creator, so like if you're OpenAI or Google, or Microsoft or Amazon, or, you know, maybe you're building something that literally was not possible before AI, then your goal is to do something with AI.
Like, AI is the actual goal. But I think for most businesses it's

(19:26):
about transforming the business and operating at its best using the latest technology.
So I would say the business goals should not change at all.
You just change how you reach those goals.
Maybe a hot take that probably shouldn't be one: I think the AI revolution for most companies isn't actually about AI itself at all.
I think the value is the forcing function for

(19:49):
automation and transformation.
In fact, I was just talking to somebody recently, and I said I'd estimate maybe 25% of the projects I've deployed so far in my AI roles could have been deployed previous to large language models.
It's just that the business didn't prioritize those things, because there were no dedicated transformation leaders.
There was no like inward pressure to ask ourselves,

(20:11):
how can we do this better?
We just do what we've always done.
So back to the title thing: I think honestly these types of roles will probably evolve away from having AI in their titles and toward more transformation in those titles, because that's really what I think it's about.

Galen Low (20:26):
You know, I think with the titling, like, we've seen fluidity in titling over the past little while in areas like cybersecurity, like DevSecOps.
Even like RPA, like process automation. I'm sure, I haven't looked, but I'm sure there are dozens of VPs and directors of, you know, process automation, RPA, all of these things, that exist to do exactly what you said, which
(20:48):
is like become more efficientat achieving the same goals
and the target may shift.
Right?
So in other words,right, if it was like I
just had on as guests.
Folks who do like AI callcenter transformation.
I guess what all those metricsstayed the same, but like
the target is to reduce callvolume or reduce call time
or increase, you know, theresolution rate of like, did we
solve our customer's problem?

(21:09):
It's not like, I'm sure thereare AI metrics within that, but
the core business goal is notto become an AI organization.
It's to better serve ourcustomers, better serve
our agents, hopefully,and also just, you know,
do business better.
What I liked about whatyou said earlier is that.
I don't know.
I we use this wordlike business maturity.
In my head it's like this,like flat still water

(21:32):
where it's like, oh, we'veachieved business maturity,
nirvana, everything iscalm and now we're ready to
launch into the next thing.
But you didn't say that.
You said either there is supportand like data infrastructure
and like you're ready.
Or things are happening, likeat the lower levels, people are
experimenting and like if youdon't organize that energy and
focus it somewhere, like it'sjust gonna disperse and it's

(21:53):
gonna be mayhem and chaos, and it could actually damage culture before it actually impacts your business in any positive way.
And I thought that was neat, because, like, I do see that a lot.
I see it a lot where, like, there's a lot of energy, and sometimes it's positive and sometimes it's frantic, like, you know, panicked energy.
Like, I need to figure out AI, so I better do all these things and then, you know, hope that I keep my job.

(22:16):
And some people are just really excited about it.
They're like, oh, I might not have to take notes during a meeting.
Fabulous.
Say no more.
Like, let's do it.
Let's build a machine.
And, you know, to your point, not inventing AI, but using AI to do what they're already doing.
But I think the biggest thing that I, you know, picked up on in there was that notion of, like, the safety, right?
This like psychological safety to experiment.
Because I think the other bit is that there's some

(22:37):
organizations where all of that experimentation is happening, but like behind a closed door, secretly; no one's talking about it.
Yes.
No one knows it's happening.
Frankly, like, some of the data that is going into some of those, there's probably no governance around it.
It could be mayhem that business leaders don't even see, 'cause they haven't created that culture of it being okay to raise your hand and say, hey, I'm playing with this.

(22:59):
Like, is that okay?
Like, you know, here's what I'm doing.
And if they think they're gonna get their hand slapped, then, you know, you can be damn sure that they're gonna do that to keep themselves relevant for their next job, not for their current job.

Tim Fisher (23:09):
Amen.
Yes.

Galen Low (23:11):
This is really interesting, by the way.
Just 'cause, like, I don't know, I think I came into this conversation as, like, it's almost like the utopia, right?
Like, this utopian vision of what AI transformation looks like.
But actually it can be messy. It could be solving problems, it could create problems, it needs leadership.
And I guess that's kind of like the crux of all of this, is that it does need someone who can like zoom out and
(23:32):
zoom in, see the force forthe trees and guide people
along so that this energythat gets spent into it is
productive and actually enactschange that can be measured.

Tim Fisher (23:42):
Exactly.
Yes.

Galen Low (23:43):
I wanted to zoom into, like, life in the trenches, as kind of what I call it.
My background is in project management, business development, account management.
We are in the day-to-day sort of operation of a business, and we've been talking a bit about this as we go. But, like, when it comes to adopting AI into the way that we, like, actually

(24:04):
collaborate on our projects and like do day-to-day operations: do you see that as something that is best done, like, top down within an org?
Like, is that the idea of the sort of AI leadership layer?
The AI leader?

Tim Fisher (24:17):
I think you have to do both.
I think top down and bottom up, and I think the right AI leader can make both of those directions successful.
Top down, I think that's a little more clear, right?
It's setting goals and strategies, you know, a really clear direction.
You can tackle really big workflows and systems and create these big projects and, like, you know, completely change how

(24:38):
your business does this enormous thing that they do every day.
Right.
And that's, I think, the one that's a little more clear.
And then the bottoms up, I think, is often thought about a little too late, or it's assumed that it will take care of itself, or that it's not important.
But that's like the education, and, like, you're getting at a lot of this just in what you were just saying.
Right.
The education and the empowering of individuals. And, I think

(24:58):
a really interesting one is, I know at the last job, we had just like an intake form for great ideas.
You know, like: I've got this thing I do every day, and I know, based on your TED Talk and, you know, like my research, that LLMs can solve this for me, and I would love for your team to build that.

(25:20):
But a lot of what individual people do are individual and bespoke things in their roles.
So there's not this enormous ROI to throw a bunch of engineers and product folks at this, like, solution for something that one person does.
But it can be transformative for that one person.
So empowering people to solve problems themselves with, you

(25:42):
know, some of these like low- and no-code tools that I know a lot of us are familiar with.
And using ChatGPT, you know, or Gemini or some other conversational AI, as your, like, coding partner for Apps Script behind Google Sheets. Like, empowering people and giving people permission to do things like that is incredibly impactful.
So I think it is both of those things and a sort of, like,

(26:03):
meeting in the middle. But I think they both have to happen to be really successful.
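(Editor's note: to make Tim's "coding partner for Apps Script behind Google Sheets" concrete, here is a minimal sketch of the kind of helper a non-engineer might build with an AI assistant. The function name, column layout, and sample data are illustrative assumptions, not from the episode; in Apps Script, the rows would come from `SpreadsheetApp.getActiveSheet().getDataRange().getValues()`.)

```javascript
// Count how many times each value appears in one column of a sheet,
// given rows in the shape Apps Script's getValues() returns: a 2D array
// whose first row is the header row.
function countByStatus(rows, statusColumn) {
  const counts = {};
  for (const row of rows.slice(1)) { // skip the header row
    const status = row[statusColumn];
    counts[status] = (counts[status] || 0) + 1;
  }
  return counts;
}

// Example rows, as a spreadsheet's getValues() might return them.
const rows = [
  ["task", "status"],
  ["Edit podcast", "done"],
  ["Publish notes", "in progress"],
  ["Book guest", "done"],
];

console.log(countByStatus(rows, 1)); // { done: 2, 'in progress': 1 }
```

The point of Tim's example is exactly this scale: a few lines a conversational AI can draft and explain, transformative for the one person running that sheet, but never worth a dedicated engineering project.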

Galen Low (26:07):
In my head, I'm almost like, oh, there's almost
like this, like, layer that is similar to, like, a project management office, in terms of, like: is this a project?
If it is, we can, like, provide support for it.
Like, we've got our team, we can put it on rails, we can educate you.
If it's, like, actually not really a project, but, like, go ahead and do it.
We can kind of guide you.
I think that is a really interesting, like, division right now, where I think for me there's this tension

(26:30):
between, like, no code, you could just do it yourself.
You should have done it already.
Just do it yourself, alone, by yourself.
And then there's the layer of, like, actually, it's really complicated.
Like, we need engineers and, like, people who understand the ethics and the governance, and, like, yeah, you know, these projects, like, these need to be, more care needs to be taken if we wanna be able to do this.
Right.
But I think you said it really clearly, where it's like,

(26:51):
there is this sort of line between what needs to sort of scale into impact for a business versus what is sort of more of an individual, like, productivity thing.
And you know, folks on my team, you know, they're in Lovable, they're in n8n, right?
Like, they are just, like, building a thing to help them do that thing that they do.
And I guess, and maybe we can walk through actually this

(27:11):
pipeline. But you know, for us, we're kind of individually making that call, to be like, ah, that's just a me thing, so I'm just gonna do it myself. Or, wow, this could impact every podcast that we record, like, we gotta build this, guys. Could we step through that flow? Even like a bottom-up sort of approach, where people are experimenting, and sort of what they can do with an idea, and how

(27:34):
you would evaluate an idea, of whether this should be like, okay, yeah, we should build an entire operational machine around this and we'll use our resources to develop it, don't worry, you don't have to code it yourself in Lovable. Versus like, actually, I can show you how to do that, it's definitely gonna be useful, but the scale is smaller in terms of what we wanna invest in it.

Tim Fisher (27:53):
Yeah, I think that's an interesting question. This is such a bespoke answer, depending on the organization and how you're put together. And it's funny, as I'm listening to you ask this question, I'm thinking about the variety of situations out there with respect to how much individual contributors even understand what they do, and how that has an impact on the business.

(28:16):
Again, so we're back to this sort of transformation and introspective thing I mentioned a little bit ago, where we don't spend a lot of time, or I would argue we don't spend nearly enough time, in business critiquing what we do and how we do it and its value. And I think leadership, and I'm feeling very lucky at this particular company because I see this happen a lot, which is great, the push from the top to help people understand how

(28:39):
every part of their, whateverit is that they do in their
job, how that impacts.
The goals of thebusiness, right?
So that constant challengeof communicating these big
strategies and having themtrickle down into what people
do and having clear metricsand KPIs and things like that.
So I say all that to say therehas to be a process by which

(29:00):
when you are experimentingon yourself and automating
something that you maybe thinkhas an opportunity outside
of the walls of your job.
To be able to quantify thevalue to the business, right?
So depending on the maturityof the business around, you
know, the connection betweenthe top and the bottom around,
you know, the value of the workthey do and the business goals.

(29:22):
that can be one of two things. It can be obvious, something that you're constantly tracking and reporting on anyway, and those trackings and reportings are visible to other team members. Then it becomes an easy task for an individual contributor to say, this is the value that I think automating this would bring to my organization.

(29:42):
But without that, which I thinkis much more common out there
in the business world, thereneeds to be some sort of a
process of quantifying that.
I know at at People Inc. Oneof the enormous tasks we went
through across the editorialorganization was digging
really deep and trying tofigure out what is it that
people even do, and thenwhat is the cost of those

(30:05):
tasks, and what is the value they bring to the business? And that helped us tackle it, you know, from a top-down perspective. But it was interesting, and frankly very fortunate, that we saw a lot of the individual things that people were working on and excited about matching up in ways that made us excited to scale some of that work. But it's definitely not going to happen, and again,

(30:27):
this is back to this whole conversation we're having about an AI leadership role, it's not going to happen without someone wearing the hat of, this is something we need to create a little bit of process around, or at least have the information about. And again, I think back to the forcing function thing. I think a lot of organizations, five years from now, are going to come out of their initial sort of

(30:49):
experiments and tackling of AI with maybe not a bunch of fancy, PR-worthy AI products to talk about, but a level of maturity just around how we think about things and how we decide to stop or start doing new things. I think that will be worth whatever sort of, like,

(31:11):
anxiety came along with, I don't know what to do with AI now in this organization.

Galen Low (31:15):
One thing that I had turning in my head is, earlier you had said, you know, AI leadership role, there's a sort of consulting aspect to it, right? A hundred percent. And as you were talking about understanding what people do, and that whole process of kind of, like, auditing, right? The word audit showed up in my head, and I was like, that's a really scary thing for, like, you know, your

(31:35):
classic C-suite to be like, hey, we're gonna do an audit of what you all do and decide whether it's valuable or not. Everyone's, you know, they're thinking of, like, DOGE, right? Like, they're thinking, oh my gosh, someone's gonna tell me that I'm not valuable and haven't been for the past 15 years, and I'm gonna be out on the street. Whereas I think there is that leadership layer in between, almost, right? Where to be like, okay, well, like,

(31:55):
we do need to think about the way we work through the lens of AI. It's not AI for AI's sake. We need to sort of use it to enhance our business goals. And what you said is, we're gonna find out that some of the stuff you guys are doing is amazing, things that we aren't giving you credit for that are great, and we wanna amplify that or keep them as they are. And there's gonna also be things that are tedious and

(32:17):
boring, and that you hate to do, and we wanna look at that too, because there's ROI there as well. But I think, circling back to what you said originally, it's like, that person needs to be trusted. You can't just walk in with the chainsaw, the proverbial chainsaw, or a real one, and just kind of go like, hey, we're gonna trim out all the fat and then robots will take your job, or whatever, you know? It's like, there's this trust layer of someone who can

(32:39):
paint the picture clearly at both levels: there's a business impact and there's a people impact, and here's how it all kind of fits together. Whereas without that layer, it can be like, that could be riots in the streets, right? It's like, yes, we're gonna look at everything you do, pull up your process documentation, we are gonna inspect it to make sure that you're adding value. That's what everyone doesn't

(33:02):
want, but everyone does want the tedious stuff to go away. I know. Trust, it's absolutely about trust.
Yeah.
No, I think that's huge. I hope I can have you back on the podcast, 'cause what I'd like to do next time is dig into, you know, it occurs to me that all these things that come up from the suggestion box, which I love, by the way. And I love that idea, that if you've got the right culture and people understand the goals, you're gonna

(33:23):
get really good ideas in that submission box, right? Or if you don't, you're gonna get a lot of, like, silly ideas, and someone needs to parse them. But either way, there's that layer that says, actually, you know what? This is really good. Someone's made a really strong business case, 'cause I do think it's business casing, at the end of the day, to build a thing. And it's green-lit by leadership, and now it's a project. That's what I would love to get into next time, of like, okay,

(33:44):
well, some of these things spawn projects, which obviously my audience is very interested in. I have been talking a bit about, you know, sort of AI-enhanced ways of working, so like, how can we also deliver projects better using AI? So I won't get too deep into that now, but I thought maybe I can, like, roll it up, because it's become clear to me in this conversation that it's a very challenging role, right?

(34:05):
Like, and I know a lot of people, on paper they're like, that's a made-up role. Or, you know, like, what does that person even do? Or, aren't they just the messenger for the CEO, you know, with the proverbial chainsaw? But it is challenging. So I thought maybe I'd just get your perspective, from where you sit right now in this organization, Black & White Zebra: what's the biggest challenge that you see in front of you, and what is your hypothesis about

(34:27):
how you're gonna solve it?

Tim Fisher (34:29):
The big question.
I think the biggest challenge is creating a culture of change, a culture of failing fast, and a culture of comfort with ambiguity. Notice I didn't say a single tech thing. The headlines are all about model hallucinations and data quality and prompting challenges and all these, like, awful, horrible things, and these are all solvable challenges.

(34:52):
The models update constantly and they get better all the time. There are plenty of ways to deal with hallucinations, you know, grounding your inputs in more trustworthy information, RAG, for anybody that's read into this stuff.
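The grounding Tim mentions is usually called retrieval-augmented generation (RAG). A toy sketch of the idea, with a naive keyword-overlap retriever standing in for a real vector store, and where a real system would send the assembled prompt to an LLM (all function names and documents here are illustrative):

```python
# Toy sketch of retrieval-augmented generation (RAG).
# A keyword-overlap retriever stands in for a real vector store;
# in a real system, the grounded prompt would be sent to an LLM.

def retrieve(question, documents, k=2):
    """Return the k documents sharing the most words with the question."""
    q_words = set(question.lower().split())
    return sorted(
        documents,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )[:k]

def build_grounded_prompt(question, documents):
    """Assemble a prompt that restricts the model to retrieved sources."""
    context = "\n".join(f"- {doc}" for doc in retrieve(question, documents))
    return (
        "Answer using ONLY the sources below. "
        "If the answer is not in the sources, say you don't know.\n"
        f"Sources:\n{context}\n"
        f"Question: {question}"
    )

docs = [
    "Our refund window is 30 days from the date of purchase.",
    "Support hours are 9am to 5pm Eastern, Monday through Friday.",
    "Enterprise plans include a dedicated account manager.",
]
prompt = build_grounded_prompt("How long is the refund window?", docs)
print(prompt)
```

Because the model is told to answer only from the retrieved sources, and to say "I don't know" otherwise, the opportunity for hallucination shrinks, which is the point Tim is making.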
But implementing AI in an organization is way more of a human thing than it is a technology thing. In other words, I think an AI executive's job is to

(35:13):
manage the expectations at the top, which is, like, me telling the CEO, no, there's no magic AI button for us to buy or build, that's not how it works. But otherwise, it's managing change all over the org. Like, that's really what it's about. And you mentioned this earlier, there's all kinds of popular change frameworks out there for businesses. Like, you know, there are a number of companies that

(35:33):
make billions of dollars a year helping to, like, manage all of that stuff, right? But I don't think they're gonna be right for the speed at which AI is coming at us and the scale at which it's making changes. So this changes faster, and scarier, than the internet. I mean, AI fundamentally alters, like, the what and the how of work.

(35:54):
We don't know what tomorrow looks like. That's frightening. I had an opportunity a couple of months ago to go spend some time at OpenAI, the company that made ChatGPT. They've hired a chief economist, because so many people, them included of course, want to understand and talk about and sort of help direct how this is all changing everything at a really high level, you know, the economy.

(36:16):
So I'm in this room with all of these C-level executives, and, you know, a few of us have AI titles, but most of them, it's CEOs, CIOs, CFOs, you know, the people who are making decisions about these things. And the conversations in the room were not about their businesses or about where their vertical was headed. It was about what my kid should major in at college.

(36:38):
It causes this. This is such an enormous change that these people, in this, like, once-in-a-lifetime opportunity to spend time at this company, in the big room, talking about these things, they have a very human reaction, because they know just enough about what's happening to know that my daughter, who was gonna be a computer scientist, is now wondering if

(37:00):
that is the right path forward. And the other side of that, like, well, what is the exciting next role in the world? Like, where should we be spending our time thinking? And that really hit me hard, when I heard all these people, like, stepping outside, without a second thought, of why they were there. And they were there from lots of really big companies.

(37:21):
It's just, it wasn't the questions that I expected to hear, but it made all the sense in the world. So I tell that story just to make really clear: these are enormous changes, and it's not just the people down at the bottom doing the work that are, like, you know, everyone is very anxious about what this all means. So, all that said, though, I think the opportunity

(37:42):
every company has is buried in all of that, like, anxiety and fear, right? Like, humans and organizations that can change are going to win, full stop. That is what will happen. The other part of this was, like, what are my plans for, you know, doing this, right? I don't think there's a universal template. I do not. I think there are some themes that are really important.

(38:02):
So I would say, normalize AI. It's here and it's messy. So talk about it a lot. Talk about it honestly, talk about it openly. Get leadership on board if they're not on board, and then get them to talk about it more. Talk about it honestly, talk about it openly, and no secret plans. That is such a destroyer of trust. Like, talk about the plans. Talk about 'em

(38:22):
honestly, talk about 'em everywhere, all the time. There's a theme here you're probably hearing, which is, like, radical transparency, which I'm a huge fan of, and it's the fastest trust builder I've ever come across. And you've said trust a couple of times. I cannot express how important I think that is. I also think you have to normalize rapid change. I hear a lot of, like, frustration with leadership changing

(38:44):
their mind all the time, but changing your mind with new information is strength. Like, that's a reality-based response to a world where new information is being thrown at you all the time. I would be very fearful of an organization that sets an AI strategy that doesn't change about every quarter, because that probably means you're not paying attention to

(39:07):
what's going on in the world.
I also think you have to normalize failure. I think you have to fail fast. I mean, people hear that phrase all the time. You know, you have to build systems to make sure that you don't, you know, beat a dead horse, as they say, or whatever it might be, right? You just continue on down. Like, there's just so much change happening. The harder part of this, and the one people don't talk about often enough, is failing publicly.

(39:27):
I mean that inside of your organization. You know, we all, I think, deeply understand that when we fail, we learn something, and then we change and we move on. If you fail publicly in your organization, the entire organization learns something. That is frightening for most people to consider. But when you fail in public, whether that's in a public Slack room or in a meeting or whatever it might be,

(39:50):
everybody there learns something, and I think that makes the organization move faster. Then, I think the most important thing, and I hope I say this enough: admit that I don't know. Normalize "I don't know." This overconfidence in what the AI future means and its impact on your organization or the world, someone's selling you something, and often themselves, when

(40:12):
they are expressing that.
I know, Sam Altman has a job as a marketer for his company, right? Like, that's a very different story. But for leaders in AI, and traditional leaders in an organization, that's a bad sign. So yeah, I have an easy job, right?

Galen Low (40:27):
Yeah, totally.
So the one thing that really stood out to me there is, we started talking about this notion that, like, yeah, change managers. You know, when I was working at a big consultancy, there was a whole team of people, like, that was an offering, to manage that change. But it was always manage that one change, or manage some change. And what I appreciate about your framing is the notion that this is a generational change, right?

(40:48):
You're in a room full of executives, they're talking about what their kids are gonna major in, because we are thinking about change at the scale that it's actually impacting multiple generations ahead of us. It's not, like, should we renovate the lunchroom? This is, like, this will fundamentally change the way organizations and groups of people are structured to also deal with more change.

(41:09):
You know, we talk about the pandemic, and how that kind of primed us to unlearn some of the myths that we had about the way we work. And this is almost like that next wave of being like, and guess what? We don't have to have, you know, an airborne virus to figure out new ways of working. This technology can push us forward, but it's gonna change everything. Not just the work we're doing today, but, like,

(41:30):
the education system, right? Career pathing, you know, like, all of these things. That's why it's a big deal. That's why it makes sense to have it in the title. Not VP of Change, or VP of Change Management, or Executive Change Agent, but VP of AI, or, you know, I don't know what the acronym is, Chief AI Officer, CAIO.

(41:52):
I mean, it might sound buzzy, but it's actually because it represents this, like, fundamental shift. The change, not just a change. And you know, that's what's really interesting about it. Its impact is pretty big and cannot be overstated. And even still, most of it's not about the technology, it's about the sort of radical transparency, this trust. And if I were to

(42:13):
be honest with you, to show my bias, that's what I'm super interested in, because I've worked in a lot of places where, you know, psychological safety is something that might be written on the wall, but isn't always the thing we practice. That sort of, like, cover up your failure, you know, cover your butt. It's systemic. It's natural, as, you know, social creatures. This is a moment of, like, innovation, where leadership is

(42:35):
not knowing. Leadership is not knowing, but maybe taking a chance on it, and then figuring out how to pivot and change as we fail, as we learn together.

Tim Fisher (42:43):
Agree.

Galen Low (42:44):
Tim, thank you so much for this.
Just for fun, do you have aquestion that you wanna ask me?

Tim Fisher (42:48):
Would it be too rude if I ask you? So, you mentioned at the very beginning about whether VP of AI is a real job, so I'm going to flip this right back to you. Let's imagine a world where AI has reshaped everything, you know, certainly delivery work. Do you think being a project manager is about to become a made-up job title?

(43:09):
Is it about to change so much that it doesn't make sense anymore? Evolve into something bigger, something different? Like, where do you see this taking project management?

Galen Low (43:18):
Touché, Tim Fisher. Touché. But I deserve it. And actually, you know what? So here's an unpopular opinion. I would say probably yes. The function, I think, elevates, but I think the role and the title actually could go away. And for anyone who's seen me speak in person over the past couple years, like, we do this exercise where we make up

(43:38):
jobs that don't exist, right?
Director of human andmachine ways of working
and methodologies, right?
Like stuff that we don't doright now, but gosh, like
if we could, everythingwould run smoother.
It's just that we're sostuck in the weeds and
the job is to like, youknow, everyone describes
it as like herding cats.
You know, you're a projectleader, you don't have
like authority over anyone.
Gosh, everyone's justrunning around running amuck.

(44:00):
Leadership keeps changing their mind, and, like, you know, you've gotta manage the iron triangle of scope, schedule, and budget. Part of me actually hopes it elevates beyond that, and, to your point, that it actually maybe gets a new title, because it's already imbued with a lot of stuff. We already have a branding problem with project managers, but the excellent project managers out there do more

(44:20):
than what most people think of as project management. And they deserve to be titled as such. Their role deserves to be described as such. And there's an education piece there too, just like with AI, to be like, okay, well, delivery of value through, like, human and machine collaboration. It's gonna be really important. You know, it's important today, it's gonna be more important later. As we talked about, more projects are being spun up because of, like,

(44:42):
emerging technology like AI. And in some ways, it's just not good enough to, like, make a plan and hope it stays the same for, like, 18 months. Like, it's just not gonna happen anymore. But yeah, coming back to it, I think it will become a made-up job title. It'll become, like, this anachronism, but in a good way. Because I think we then shift on. Like, I don't want my title to be Computer in 2025, right? In 2045,

(45:06):
you might not want your job title to be Project Manager. Heck, 2035, 2030. You know, change is happening fast. But yeah, I do think so. I actually do. Okay. I think it will change, and I think that could be good.

Tim Fisher (45:19):
Yeah, I agree.

Galen Low (45:20):
Thanks for playing along with that.
And Tim, thank you so much for spending the time with me today. This has been a lot of fun. Before I let you go, though, where can people learn more about you?

Tim Fisher (45:27):
Just head to LinkedIn, search
for Tim Fisher, BWZ.
Fisher without a C.
And you can't miss me.

Galen Low (45:33):
Thanks again, Tim.
I really appreciate this.

Tim Fisher (45:35):
Thank you very much, Galen.

Galen Low (45:37):
That's it for today's episode of The Digital
Project Manager Podcast.
If you enjoyed this conversation, make sure to subscribe wherever you're listening. And if you want even more tactical insights, case studies, and playbooks, head on over to thedigitalprojectmanager.com. Until next time, thanks for listening.