Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Andreas Welsch (00:00):
Today, we'll talk about how you can grow your Chief AI Officer role. And who better to talk about it than someone who's actually in that role: Matt Lewis. Hey Matt, thank you so much for joining.
Matt Lewis (00:10):
Hey Andreas, thanks very much for having me.
Andreas Welsch (00:13):
Awesome.
Hey, why don't you tell our audience a little bit about yourself, who you are, and what you do.
Matt Lewis (00:18):
Sure.
I'm currently Global Chief Artificial and Augmented Intelligence Officer at Inizio Medical. I've been in that role, which is essentially Chief AI Officer (it's a lot easier to say that than the whole long title), for about six months now. Before that, I was Global Chief Data Analytics Officer for the firm, six years in that role, and I've been with Inizio
(00:39):
all told for eight years. I've been in life sciences my whole career, about 25 years. Inizio is the largest end-to-end communications and consulting firm for life sciences. We work to help groups commercialize new interventions for a variety of different diseases and health conditions across the planet. We're now using artificial intelligence to speed
(00:59):
time to market and help ensure that people are able to manage their health.
Andreas Welsch (01:04):
That's awesome.
Thank you so much for the summary of who you are and your bio. I think healthcare especially is such an important industry and topic, and one where there is a lot of discussion about AI. But given it's a regulated industry, I know it's not the easiest of industries to do AI in. And given all the hype and the buzz around generative AI these
(01:27):
days, I'm really glad to have you on and to hear more from you on what really matters for someone in a Chief AI Officer role, and maybe even how you can become one. So for those of you in the audience, if you're just joining the stream, drop a comment in the chat: where are you joining us from today? I'm really curious to see how global our audience is. And I know people have joined from all different parts of the
(01:49):
world previously. So I'm always eager to see where you are joining us from. Matt, should we play a little game to kick things off?
Matt Lewis (01:56):
Sure.
Andreas Welsch (01:57):
This game is called In Your Own Words. When I hit the buzzer, the wheels will start spinning. When they stop, you will see a sentence that I'd like you to complete in your own words. To make it a little more interesting, you only have 60 seconds to do that. Are you ready for What's the Buzz?
Matt Lewis (02:17):
Sure.
Andreas Welsch (02:19):
Awesome.
Then let's get started. If AI were a movie, what would it be? 60 seconds on the clock, starting now.
Matt Lewis (02:30):
That's rough. If AI were a movie. Okay, first I'll say that AI would probably not be any of the actual AI movies that were released before, say, April of 2023. Because all the movies that are out there that people have actually seen, and I've seen them all, are all these kind of Pollyanna-ish or doomsday scenarios of what AI is
(02:51):
that actually don't reflect what generative AI is capable of doing today, or what augmented intelligence with AI can actually do with humans in the actual world. I might say that a movie could be maybe a mix of the movie Evolution, starring David Duchovny from some 20-odd years ago, where this alien life form comes to Earth, people didn't expect it, and then in a very short amount of time it takes over
(03:13):
everything, and eventually humans figure out how to both work with it and manage it. But it is able to grow in proportions that no one expects, and quite quickly. That kind of captures a lot of what's going on right now in generative.
Andreas Welsch (03:25):
That definitely sounds like there are a lot of similarities, with AI evolving so quickly that sometimes, if you skip a day or two of news, you miss some really important things. But at the same time, it's hard to stay on top of everything that's going on as well.
Matt Lewis (03:42):
Yeah, I said, Andreas, back earlier in the year, maybe May or so, I was speaking at a conference, and I said that a week in generative is like a quarter in the real world. And now my team makes fun of it because it's true. You miss a couple of days and everything has turned on a dime. It's almost impossible to keep up because things are changing so quickly.
(04:03):
And if that's true, if a week is like a quarter in the real world, then if you miss a month of what's happening in our environment, that's like a year of real marketplace activity. A lot can really change in a very short amount of time, and it requires constant vigilance.
Andreas Welsch (04:19):
Very true. I'm seeing more and more of those gray hairs as well. Yeah, all over. So I'm taking a look at the chat, and it's awesome to see: from India to Charlotte, North Carolina, Finland, Canada, all over the place. Indonesia, Dubai. Thank you so much for joining us.
(04:41):
Really awesome to have you with us. And if you have any questions for Matt or myself, please feel free to put them in the chat as well. I'll take a look in a few more minutes to see what's on your mind and how we can help answer those. Now, we obviously want to talk a bit about the CAIO role. And I thought maybe it's good to level-set, because maybe
(05:04):
not everybody has come across a Chief AI Officer in their business or in their role before. So I was wondering: what does a Chief AI Officer do? How is it different from, maybe, a head of a center of excellence or other types of AI and data roles that you see? And to whom do you actually report in your company?
Matt Lewis (05:22):
So I'll start there first, and then we'll work back. I report to our divisional president. I think in larger companies it's going to be fairly typical that the Chief AI Officer role will report up to a divisional president, where you're managing a very large business and the divisional president has singular responsibility for a very large 100 million, multiple 100
(05:45):
million, billion-dollar P&L. Whereas in smaller companies, it might be the CEO that's the direct report. Before I was in this role, I was Chief Data Analytics Officer, and I think a lot of folks that have a CDAO responsibility might have a CTO report, might have a Chief Digital Officer report. I do feel very strongly that the Chief AI Officer role needs to report up to the chief executive within the business,
(06:09):
whether that's a divisional president or the CEO, for a number of reasons. The role itself, the Chief AI Officer role, is a strategic role. It's not really a functional expectation, and it has expectations with regards to staffing and support, resources, and financials for standing up lines of business, but also for
(06:30):
progressing the transformation of the organization at large. Under another part of the organization, you'll always be downstream from that consideration, and it's difficult to advance the expectations of the business forward as a member of the leadership team. So it really does need a direct line.
In terms of what the work looks like, I think it does differ
(06:51):
depending upon, to your point, whether it's part of a regulated industry, whether it's another vertical, whether the organization is wholly focused on, say, internal responsibilities or on customer or client responsibilities, essentially what kind of line of business they're in. But I can tell you, now that I've been doing it for
(07:11):
about six months, there are at least probably four main work streams that I'm responsible for. The first of which is definitely upskilling, staying up to date, and remaining on top of what's actually emerging in the space, which almost feels like a full-time job because there's so
(07:32):
much coming out that it's hard sometimes just to stay on top of what is emerging. Not just in terms of literature, which is the lay press and what's coming out from newsletters and blogs, but also what's coming out in the peer-reviewed press and journals and academic settings, and also from other experts that I speak with in the generative AI space and the
(07:53):
AI space broadly, to pressure-test and ensure that our approaches are rigorous and validated before we actually go to production.
So that's a consideration by itself. And then the first part of my work is what I might call enablement, which is really a bit of evangelism: of
(08:13):
helping the space at large recognize what good looks like, what the standards and best practices and expectations should be within this emergent space as things progress forward, as we think about where augmented intelligence and generative AI and other considerations should be for stakeholders within the discipline.
(08:33):
How they should enact policies and protocols and practices to ensure that they're doing what good looks like as things move forward. And then also a lot of education for those teams, for groups that are part of our organization, and for outside organizations. Sometimes delivered directly through professional societies. Sometimes through a partnership that we're
(08:54):
building with one of the big tech firms at present to deliver training to a wide group of people. And really thinking about, from a competency perspective, what are those skills that are going to be necessary in the near future versus the far future to help people remain competent and future-proofed against what's coming. So that's really just the first piece of my work, the enablement
(09:15):
piece.
The second part is really related to what might be called governance, which is thinking strategically about: if the organization that we reside within now is exceptional at delivering this set of solutions and services and software and whatever we're doing as a group today, what will we need to be doing two years from now, five years
(09:38):
from now, to win in the marketplace of ideas? And what will that pivot look like as transformed by artificial intelligence? And as a result, what resources will we need? What staff capabilities will we need? What structures and systems and processes will we need with regards to things like compliance and ethics, which
(10:00):
you and I were talking about before, and governance and provenance and all the rest that are going to be necessary to ensure that our customers can trust us as we progress forward? A lot of that existed in the CDAO environment, but it was a bit nascent perhaps. Now it's being nurtured and cultivated into more of a robust consideration in many groups.
(10:20):
And then the last piece, which is probably what I might call imagination, is really working with customers and clients to think about how they stand up generative AI and other artificial intelligence implementations within their environments so that they can supercharge and really 10x what's possible in today's environment and start
(10:44):
getting different types of outcomes that are incrementally better than what's possible today. And the types of things that they want to do are varied. Everyone doesn't have the same goals, but they're all levered around the same types of considerations, where AI is juxtaposed on top of legacy or existing processes.
Andreas Welsch (11:07):
That's awesome. I really like how you're describing your role and how it's multifaceted: on one hand grounded in the technology and the data, but on the other hand having that multiplication, evangelism-type component to it as well, to help others understand what the opportunity is. What can we do with it? And I think we talked about that backstage a little bit.
(11:29):
It's actually not there to replace you. It's there to help you get things done more quickly, faster, get insights that you haven't been able to get before, and so on. So I think that's a really important component, to combine the two. Because to your point, and also from what I hear, where your role is in the organization matters.
(11:49):
If it's at eye level with your peers in different business functions, in different functional roles, I would imagine it's a much different conversation than being in either a technology or an IT or data role and so on.
Matt Lewis (12:05):
Yeah, that's exactly right. I couldn't have said it better myself. And the way that I've expressed it, when people ask me internally, is that sometimes they'll say: you were doing a data analytics role, so what's different about this new role? And I said, before, there might've been people that had a consideration of data engineering or the cloud or data lakes or storage or the rest.
(12:26):
But those were somewhat tactical considerations. They could still do their work, and then they would work with us to do data analytics where it was appropriate. And that's great. But the work that we're doing now under the AI mandate is not tactical, it's transformational. Everything that exists within our company, and everything that exists globally, in society at large, will be
(12:48):
transformed by AI in the days to come. And as a result, the CAIO role is to really catalyze that transformation, and to have the role live under another group or live downstream somewhere couldn't really serve that end. It really needs to be central.
Andreas Welsch (13:05):
Great point. And I think a great example of how this can work in a business. I'm taking a look at the chat here. Let me see. One of the questions is around evolving your career and transitioning. Maybe even if you have a different background and you're not a data scientist or an AI engineer by trade, how can
(13:28):
you move into more AI-type roles? Do you always need to learn Python and R and these kinds of languages to move into these roles? Or are there other opportunities to learn about the business or to bring in what you know about a certain industry?
Matt Lewis (13:44):
Yeah, I think that's a really great question, and we get asked that question a lot, both internally and externally. It isn't necessary to have a deep tech background to transition either into a leadership role like this or into an AI specialist role, which I think a lot of people are interested in these days, where they're either dabbling in AI, they're talking a lot about AI out in the community, or
(14:06):
they're attending conferences, or interested in really doubling down in the space. I think one of the things that really differentiates folks that stay in a legacy role versus transitioning into more of a dedicated role is a little bit of what might be called intellectual curiosity, and a bit of learning agility, to recognize the value that can be extracted from a
(14:29):
novel role that is different from, but perhaps related to, the current position.
I think that really hasn't changed in the 25 years I've been working. The only difference is that now, if you're in a role, whether you're in an analyst role, say data analytics, or you're in a strategist role in the CSO suite, or in a digital role, you're able to
(14:50):
start working with off-the-shelf applications, like generative applications that are available for license or that you can use on your phone or whatever. And you can see what's possible in or around the edges of your work, both in terms of speeding time to decision, or being more creative, or validating concepts that would have taken much
(15:11):
longer to produce. And you can start imagining what your work would actually look like two, three months down the line. You can actually start doing some of that, standing those up as mini experiments and demonstrating to the business that the work is possible, validated, and can actually be transacted on. And I've seen a number of people, both within our firm as well as other groups, demonstrate that the
(15:32):
next most likely work is actually the job they should be doing. And then they just transition from one to the next by showing that this new role, which is essentially emergent work that didn't exist pre-generative, is actually better for the business and better for them. And then the alignment happens. So it's less "hey, you guys are posting this thing out there that I want" and more "hey, I think I could actually create
(15:54):
more value for the business by doing these things that I'm actually already doing." And then the transition just happened. I've seen a lot of that recently, where people said that they were an analyst or a designer or a strategist or something, and now they're in a role that they're still trying to figure out the title of, but it's a higher-paid position with more value for the business, and a better alignment with what they're looking to do.
(16:16):
I think for the Chief AI Officer role specifically, I've seen people come that are really deep in on the data science side, that are really deep in on the dev side, that are deep in on either strategy or digital, and transition over. But it is challenging, I think, to do a role like this without some subject matter expertise in whatever business you're in. It doesn't really matter what the business is, but if you're
(16:38):
at Coca-Cola, you have to understand that organization and business to some degree, because there's the corporate part of the role, working alongside the C-suite. It has an expectation that is somewhat divorced from the core technological considerations, just in terms of the P&L, in terms of asking for headcount and making investment cases and
(17:00):
working alongside consultancies. You need to have an understanding of the business to be able to speak alongside your peers in that regard. Having a deep understanding of the tech is helpful and important for sure, but there are aspects of the role at that level that are necessary within and across the business.
Andreas Welsch (17:18):
Perfect. So it definitely sounds like there are good opportunities there, if you want to grow into that kind of role, or if you want to do more with AI in a role that you currently have, or are looking to evolve your career and take it to the next level. Since obviously it's a lot easier to get exposed to
(17:38):
different kinds of AI and dabble in it to some extent without having to be a deep expert in different programming languages or more detailed types of things. Now, I'm wondering, and it goes along the same lines: how do you stay current on new topics in your role?
(17:58):
What matters to you, especially because it's on one hand so strategic, but also so broad and so deep at the same time? How do you stay current, and what keeps you up at night?
Matt Lewis (18:10):
It's such a challenging question. That's probably the hardest question of all the things you'll ask me. It is so difficult to do that. There's a quote that I often go back to. Kevin Kelly, who used to be at Wired Magazine, came out with this short book of famous quotes, things he wished
(18:30):
he knew earlier in life. And one of his quotes is: "You can't control how much work you have. If you work on anything that's worth doing, the amount of work you have will never end, because it's a worthy cause and you could work on it forever." The only thing you can control, he says, is your time. Your time is the only thing that is amenable to intervention from
(18:52):
you as a professional in role. There's literally no end to the amount of AI news that will come out. You can literally sit at your desk 24/7, 365, and news will forever come. It took me a while, actually, to realize what that meant within this space, because I was working 70-hour, 75-hour weeks from December through May, June, just to stay up to date and still
(19:15):
deliver against my expectations. And it just wasn't tenable. I couldn't do that level of commitment and still deliver against my actual expectations.
And in the summer, I switched the way that I was learning. I just kept a kind of time clock, a little timer on my desk, where every day I jump into the AI pool and jump back
(19:36):
out: 45 minutes every day. I don't do more than 45 minutes, but I time it. And I start the clock and I stop the clock exactly when I'm doing anything that's direct learning. There are about eight AI newsletters, on Substack and Beehiiv and a couple of LinkedIn ones, that I look at directly. There are some podcasts, like yours and a couple of others, that I look
(19:58):
at. There are some peer-reviewed publications I look at, some conferences, but I don't do more than 45 minutes. It doesn't matter where I am in the section. When my clock goes off, I'm done. And whatever I learn, I save and I annotate, and I map everything in the concept map that I use. So when I need to come back to what I've learned, I've
(20:19):
stored it already for later use, which for me is mostly presentations that I give. I don't have to do that same work twice. I only dip in once. And then when I use it later, it's already analyzed. That works for me, because before, I was spending three, four, five hours a day just trying to learn. And it wasn't efficient, because by hour two, hour three, I was
(20:39):
so exhausted from the 12 prior hours of working that by hour 14, hour 15 of being awake and working, I just wasn't getting much out of it.
Andreas Welsch (20:48):
That's, I think, a really good recommendation for how you can structure it and still get to the essence of what you need to learn and want to stay on top of, without burning out. I think Louise had a question earlier where she said: hey, how do you balance being an AI enthusiast and an expert, and trying to stay on top of these things, with your personal life, if it's interwoven and there's such a flood of
(21:11):
information? I really like how you've described it: timeboxing it and writing it down or saving it. So if you need to come back, you at least know you've read it already and you know where to find it. That's awesome.
Matt Lewis (21:23):
Yeah. I'll just clarify that comment and say I'm not really an AI enthusiast at all, actually. I'll say that I'm an augmented intelligence enthusiast, in that I think there are a lot of annoying things that exist in actual human life, both professionally and personally, that augmented intelligence can make better for all of us as people. And we're right at the cusp, right at the precipice, of
(21:46):
being able to make our work lives significantly less painful, and a lot of our personal lives much more enjoyable, by the use of AI. And I call that augmented intelligence. I think we're just now being able to do that. I can do a lot of it because of my position. But the rest of the world is about to see what's possible because of that augmented approach.
(22:07):
I favor the augmented intelligence side of my title much more, because I really think that if we do this right, it can take away the mundane, rote things that a lot of people are doing all the time, which are quite annoying and boring, and make the work more pleasurable. And then for people that want to do things in their personal time, nights and weekends, whether it's music or art or
(22:28):
sports or whatever it is, there are lots of ways to enjoy our free time much better and much more richly using AI that the world is just starting to wake up to.
Andreas Welsch (22:38):
I really love how you phrase and frame that, especially the part about augmented intelligence and augmenting our intelligence. Now we're getting close to the end of the show, and I was wondering if you can summarize the three key takeaways for our audience today? And maybe talk a bit about how you see the Chief AI Officer influencing the business and influencing these different
(23:00):
stakeholders as well, as part of that transformation. Top three key takeaways.
Matt Lewis (23:05):
The Chief AI Officer role is an executive-level position that is critical in ensuring the successful transformation of existing businesses into the kind of future, modern, digital businesses that will exist in the next three to five years. And it has to report to the senior executive within the business, whether that's a division president or a CEO, to
(23:26):
be successful. Everyone's going to have a different set of work streams. I shared mine. Depending upon vertical and function, and private or public, and what they do, it's going to be a different mix of things, but probably some educating, some standing up models, some implementing, and a lot of learning. Also, it's not a job for people that don't like talking to people and don't like interacting with people.
(23:48):
Because you're with a ton of actual humans all the time, every day, because you're trying to help people learn, and learn differently and work differently than they've worked in their entire professional career. And that takes a lot of diplomacy and a lot of discussion and a lot of understanding of how to work and how to work better. And the last thing I'd say is that augmented intelligence is not something that we invented.
(24:09):
Gartner actually came up with this concept originally about eight, nine years ago. It sees the human as the alpha and the omega of the AI picture: humans begin. They develop the software, they develop the model, they implement the plan with AI on board. They interpret the results that come back. They determine the recommendations that go forward to the user, and then work with the system to figure out
(24:32):
how to implement them so that the results are incrementally better than either the AI or the human can do alone, for the mutual benefit of all involved. That's augmented intelligence. It doesn't matter whether it's in commercializing novel molecules for pharmaceutical companies, like we do, or in helping to find a sample playlist off of remixes from the
(24:52):
1970s in your personal life. It's still finding more value in the world.
Andreas Welsch (24:58):
Thank you so much for that summary and for hitting the key points that we should be aware of. And Matt, thank you so much for joining us and for sharing your expertise with us. And to those in the audience, thank you for being with us today; again, a super global audience. I enjoy seeing how far of a reach we're able to create.
Matt Lewis (25:19):
All right.
Thanks again for the time.
Much appreciated.
Thanks everyone in the audience.