July 16, 2025 41 mins

Trust meets technology in this illuminating conversation with Mark Fitzgerald, who leads global development initiatives at KPMG. Speaking from the Ideagen Future Summit in Washington DC, Fitzgerald challenges our understanding of digital transformation with a startling assertion: "The future is already here."

Through compelling real-world examples, Fitzgerald reveals how our relationship with technology is fundamentally changing. He shares how younger team members view AI not as a tool but as a colleague—opening a separate device each morning to work alongside their digital counterparts. This shift signals profound changes in how we'll approach everything from education to professional services in the coming years.

The conversation weaves through critical questions about trust in digital systems, the massive energy demands of data centers, the evolution of impact investing, and how technology is reshaping international development across 70+ countries. Fitzgerald offers unique insights into how organizations are already planning their workforce needs for 2029-2030, underscoring that AI's transformation of work isn't hypothetical—it's happening now.

Most powerfully, Fitzgerald frames digital transformation around two essential elements: data (the objective component) and trust (the subjective human element). As he explains, "Data about yourself is yours," highlighting the growing importance of personal data ownership in our increasingly connected world.

Whether you're a professional wondering how AI will reshape your career, a leader navigating organizational change, or simply curious about how technology is transforming global development, this conversation offers valuable perspective on navigating a future that has already arrived. As Fitzgerald concludes with his call to action: understand that your role has already changed, and you have the opportunity to shape what comes next rather than merely responding to it.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:12):
Welcome to the IdeaGen Future Summit here in Washington at the NED, live. I'm honored and privileged to be here with my good friend Mark Fitzgerald from KPMG. Mark, welcome. Thank you, George. Mark is changing the world at KPMG as he works with global

(00:33):
institutions and governments across the world. Mark, we're here to talk about what so many people are talking about, which is transformation, specifically digital transformation that's reshaping international development, and I want to ask you: how are things changing? What is this transformation looking like?

Speaker 2 (00:55):
Perhaps, if you can see that far, over the next five years. George, I always do this to you, but I'm going to go off script straight away. Apologies for this. I want to recognize today is the 6th of June, so D-Day, 81 years ago, and a lot of your conversations today are about the future of education, healthcare and so forth.

(01:16):
But if I think back to that day and how it shaped the future that we have all lived through for the last 81 years, it's really quite pivotal. So everything you mentioned, where I focus my time, with the World Bank, the UN and so forth, all those institutions were established after World War II and that world order still

(01:39):
exists today. I know some people have a view that it's under attack and a lot of people have a lot of opinions about it, but that order has been in place, that structure has been in place for eight decades. So as we think forward into the future of digitalization, government, development and so forth, we do have to lean back a

(01:59):
little bit and reflect on where we come from, because our perspective is different, whether we've been around for the eight decades or whether we've been around for, you know, one decade, because how we embrace technology will change based on that perspective. One quick anecdote: I got an Uber in today, and this is an overlay

(02:24):
on digitalization, trust and innovation. This Uber driver was great. His name is Jose, he came from, I think, Mexico about 20 years ago and he was full of chat and he wanted to tell me about another passenger he had previously in his Uber. And this guy, he didn't know who he was, but he was telling him

(02:47):
a story. He said, oh, I ordered something from Amazon and I had five packages arrive at my doorstep. And then when I got home, there were three packages in my home and I wondered, where did the other two go? Because I got a picture from Amazon to say here are my five packages when they were delivered.

(03:08):
So he trusted that they arrived when they said they were going to arrive. So he asked his housemate, where did the other two go? He said no, I didn't take them, whatever. So they went into their Ring, the webcam on the door, to see what happened. And what happened was the Uber driver, sorry, the Amazon driver, brought the packages, put them all out, the five of them, took a

(03:29):
picture and then picked up two and left. So my point here is we have a certain degree of trust through that digital platform that what we order will arrive. We also have a degree of trust that whoever's delivering those packages will deliver them and leave. So your immediate thought is, who from the street took those

(03:51):
packages? But it wasn't. It was the actual driver. So trust is inherent in the use and interaction of the digital platform. What was interesting, though, just to round out the story, the other passenger he was talking about ended up being an Amazon VP, because of course they've got a big base here in Crystal City.

(04:11):
He was appalled because he was relatively high level. He doesn't get to hear these stories. He gets to hear of the theft, but your immediate thought is it's not from the drivers, it comes from elsewhere. So the process of taking the pictures is to help the customer understand Amazon has delivered on its promise, sure, but the trust in

(04:34):
the delivery mode and the driver, that's where he was taken aback. So it was a human-centered story wrapped in a digital experience. Wow, speechless. Yeah, so sorry, what was the question? I completely went off topic, but how it relates to international

(04:59):
development. So I notice you have another segment straight after this one around AI and how it is going to impact the future of any particular sector, particular government. But I notice in this next segment your focus is on "the future is already here," and I really want to stress that point.

(05:20):
We all interact with technology in different ways. I'm pretty much a dinosaur, but I interact with it. The future is already here. AI is not abstract. AI has actually been around for a long time, largely around robotics, analytics and so forth. Generative AI is more recent, but we're all moving.

(05:42):
We are already moving into agentic AI. So the point I think I would leave anyone with is the future is already here. You just have to make the choice around what that means for you. It's either by sector or geography, or what you want to do personally. Everything is going to be a little different based on your perspective, but it isn't something where you should wait

(06:05):
to understand the impact on your life.

Speaker 1 (06:09):
Mark, you've been talking about this for a long time, from the UN, from us convening at the UN, at the NASDAQ, all over the world. And could you, for our global audience, describe what does generative AI mean, and what you just said, agentic, is here? I understand it because we've talked about it, but what, for the global audience, is agentic AI?

Speaker 2 (06:32):
So let me give you some examples to kind of bring it to life. Everything is rooted in data, and how you gather that data, how you analyze that data. That has been in place now for some time. So that's kind of the origins of how we interpret intelligence based on a digital platform. Then we moved into generative AI, where it's a little bit more

(06:53):
interactive, and I'll give you a very quick anecdote on this. Just last week we had a team meeting with my team, and there were two members whose birth year began with 2000-and-whatever. So they were not born in the 20th century, so they were like

(07:14):
22, 23. And the way they described their interaction with generative AI was not as a tool but as a colleague. So it kind of blew my mind a little bit, because I am being wired, I'm being messaged, to think about generative AI as a tool to improve efficiency. The way they describe it is they have their own device, their

(07:34):
laptop, and then they have another device. They open up both, they turn on both when they come to the office every day, and they interact, from the human engagement, through their normal device, and then through this other device where generative AI is deployed, and that

(07:54):
interaction is seen almost like a colleague-to-colleague relationship. So hopefully that gives you some perspective. It's all based on the same data and the interaction and the interrogation of that data, but how it's viewed and how it's deployed has changed dramatically. And then when we get into agentic AI, then we're into what

(08:16):
that device can do in advance of you interacting with it. So it could be as simple as predictive vacations: we see you liked these types of vacations in your past, here are some recommendations for you, would you like us to book that for you? Or it could be, we've talked a lot about health: these are the patterns we've seen in your mental wellness

(08:40):
journey or your prescription journey, whatever it is, would you like to ensure a follow-up and monitor that in due course?

Speaker 1 (08:48):
Or doesn't it even go further than that? Doesn't it even, like... Isn't agentic also, like, see that you have diabetes, you've been treated for it, and perhaps, you know, "I've scheduled a workout session at the gym for tomorrow because there's an opening in your schedule," and that type of thing. Isn't that...?

Speaker 2 (09:08):
Of course. And when you think about the power of that data, it can be a force for tremendous good. But if they get it wrong, if there's any inherent bias in that analysis, it can go sideways pretty quickly. So we're just on the cusp of understanding what that responsible, kind of ethical, use of that data is, and there's a lot of very wise and learned, experienced think tanks looking

(09:31):
at that. But how it shows up in our day-to-day life, it's still early. We know the power of what it can be, but the potential pitfalls are also of concern.

Speaker 1 (09:42):
Have you seen the new Veo 3, which creates... it's incredible, it's language to video and audio and creates any scene. I watched a five-minute clip of an auto show and participants

(10:04):
at an auto show, and you would not be able to distinguish real from fake. The sounds, the people, incredible. And so we're heading in a place where, you're right, you mentioned the future is already here; it's already beyond here. Would you consider Amazon? Is that agentic, like when they suggest packages and other

(10:25):
types of products? I should say they're starting.

Speaker 2 (10:28):
Clearly they have the capacity. And of course, Amazon is not just about deliveries, it's AWS as well. So data centers are a big engine of their own growth. But how they use that information, how they sell that information, how they ultimately analyze and package that information, that's not a new concept for them.

(10:49):
Sure, but yes is the answer. But all the big tech companies have invested heavily in this area. There are many variations of how much that is. It's certainly in the hundreds of billions, probably in the trillions at this stage when you take into account the capital expenditure as well. But it's also the ancillary investments.

(11:10):
So a data center is its own capital expenditure project. But then you've got to power it, so that power has to come from somewhere. So each particular state or government needs to understand, how much extra power do we need? That perhaps ten years ago was underestimated. Now it's accelerated massively.

(11:31):
So when you overlay that onto the climate agenda, how do you then get to the required levels of power just for data centers? That was probably underestimated only a few years ago.

Speaker 1 (11:46):
And I guess a follow-up to that is, does that impact also areas on the planet, and maybe communities specifically, but areas on the planet that may not already have enough energy? Will they all fall behind, theoretically?

Speaker 2 (12:02):
Of course it does. Of course. And later this year there's a very meaningful COP, the climate conference that the UN supports. This year it's in Brazil. They have it every year, but every other year, every even year, they seem to kind of elevate its profile. But this year is quite important because there's this concept of NDCs. These are the disclosure statements of different

(12:23):
governments around what they want, their kind of path to net zero or emissions. What does that look like? Only about 13 countries have submitted those so far. Obviously they're expecting a lot more between now and when that conference happens in November.

(12:44):
But a big part of that, George, is the energy demand within these countries has changed so much in the last couple of years. They've had to adjust, in some cases regress, their ambition around their carbon targets. We've now moved away from "fossil fuels bad, renewables

(13:06):
good." It used to be a very black-and-white world in that climate agenda. Now it's a bit more nuanced. Now it's about, we need both, and probably more. Nuclear has come back online as a component. So it's interesting, when you get into the politics of this, that it's not about one is bad and one is good. It's about how do we embrace all of it to meet the energy

(13:29):
demands in a responsible, sustainable, future-state way that will be ultimately beneficial to the government.

Speaker 1 (13:37):
Incredible, and that's shifted over just the past few years.

Speaker 2 (13:39):
Right, it has, yeah. The small modular nuclear focus: that technology has been there for a while, but nobody's actually thought about deploying it in a serious way. That has changed because of these energy demands, and it's seen as responsible from a climate point of view. Other people have concerns around safety, but that has accelerated in a way that would not have been possible 10 years ago.

Speaker 1 (13:58):
You know, it's just so profound to hear about, and I sound the alarm on other things, and we'll talk about that a little bit later. But digitalization is positioned as a key enabler for achieving the

(14:20):
SDGs, the Sustainable Development Goals, by 2030. What is the most transformative, that you believe holds the most urgency, and how, Mark, can technology get us there?

Speaker 2 (14:36):
Well, I'll step back a little bit, George, because technology is definitely going to be an enabler, for sure, and there are many use cases in any sector, from farming to health provision to infrastructure. Those use cases are limitless.

(14:58):
But we need to understand a lot of it is based on two key elements that will not change. One is data. Technology uses data to produce a result that is perhaps more effective, more efficient, more timely. So that's very kind of objective. The other element is quite subjective, and that's trust.

(15:22):
If somebody on the other end of that output doesn't trust the output, what do you do then? So let me give you a few examples. A lot of that trust was stressed and tested during the pandemic, because you had scientists and the medical field

(15:44):
telling us one thing, you had some others saying something else, and then you had bits and pieces in the middle about the interpretation of those two kind of schools of thought. And, depending how you got your kind of, your media, or what you considered your truth, your facts, largely through a digital

(16:07):
device, influenced how you would then act as a person, or a community or a company.
So for us it's about putting technology and digitalization into the context of what it means to the individual, what it means in the context of what that individual is expected to

(16:29):
do, whether it's on the professional, corporate side or personal side. That is what's going to change the trajectory of how governments or other largely public sector entities will view digitalization in a good way, or potentially otherwise.

Speaker 1 (16:50):
Given your experience with accounting and oversight in complex, very complex systems that you're dealing with, how can these organizations leverage the data and the digital tools to be able to ensure transparency and impact at the same time?

Speaker 2 (17:08):
Yes, again, some of this will be very familiar to everyone. So we all have trust that we put money into a bank and we know generally that it's going to be covered by the government if there's any issue with the bank. But banks do fail, so it's not so much blind trust, but

(17:29):
there is a degree of trust that the money we put into the bank is then digitally coded. We never see that cash again. It's a digital imprint. We trust that. That feels comfortable to everyone. We understand that. But where we're moving is into a degree of uncertainty on a

(17:50):
human level about how digitalization will interact with them in a day-to-day way. So let me give you an example. If you are in a country that has embraced so-called e-government, you are pretty much tracked even before birth, so during kind of maternal health and so

(18:12):
forth, and then delivery and so forth, from that point until death and even beyond, electronically, and your ease of interaction with government is entirely based on the accuracy of that information, but also your trust that that information is going to be responsibly used. So another element I would say to everyone is, understand what

(18:37):
is yours. Data about yourself is yours. It doesn't belong to somebody else. You have advocacy of your ownership over that data, so that is something I think a lot of awareness needs to be built on. Some people inherently get that if they're kind of digitally native, but a lot of the population do not, and that can be a problem.

Speaker 1 (19:00):
There's been a lot of talk, just as an aside on that, about having your own copyrighted likeness, image, et cetera, especially as we move forward. And now with AI, I mean, the lines are getting blurred, because what's real, what's not? Like I mentioned, that Veo 3 creating thing, so it's really complicated. But your practice works across sectors, all the

(19:24):
way from clean water to sustainable cities. What are the barriers, Mark, that you see across these sectors? We love to talk about cross-sector. What are some of those key barriers that you're seeing?

Speaker 2 (19:35):
Yeah, I work a lot in emerging markets, so some of the traditional barriers are just access to capital, the right resources, the right skill set. It could be geopolitics, it could be local politics, it could be notes. A lot of the traditional barriers still exist and I certainly wouldn't dismiss those as irrelevant.

(19:56):
They are highly relevant. But on the technology side, it is going to be an awareness of how connected that country or that sector wishes to be outside of its immediate boundaries. So that felt very comfortable 10 years ago. When you think about globalization and how corporates

(20:19):
or even citizens worked right across borders, it felt very natural. Sure, that is changing or has changed. So people have now kind of retreated into their national boundaries. So that could be a little bit self-limiting in terms of development of new investments, new opportunities, because in

(20:39):
the digital world those national boundaries are largely, I wouldn't say irrelevant. They are relevant, but they're less impactful as a barrier. So you need to understand that if you're going to invest in a digital, either a digital investment itself or a digitally

(20:59):
enabled investment, getting access to international and global resources is key. The other thing I would mention in terms of barriers is going to be, what other actors in an investment do you need to interact with and trust? Let me give you a very quick case study.

(21:20):
We're working with a coal plant in Asia and we're trying to create a financing deal where they have an early decommissioning of burning coal. So instead of 30 years out, let's say, bring it back to 20 years out. So they've got lots of runway to decommission early.

(21:43):
But you've got to trust the regulator is going to hold them to account to 20 years. You've got to trust that the regulator is going to create other opportunities for power generation outside of the coal plant, because the government needs the extra power that the coal plant was providing and won't be providing in the future. You've got to trust that the financers of that delta between

(22:04):
20 years and 30 years are going to pay, based on the interaction with the regulator, the government, etc. So all of that is based on digitalization and trust, on the data, the sharing of data. But you still have to have a human-centric element of trust

(22:26):
within those different actors. It's always complicated, so I think the barrier is, you are never an island. You've got to understand where you interact.

Speaker 1 (22:35):
How do you ensure that trust? It's come up quite a few times during this conversation. How do you monitor and ensure that that trust remains in place?

Speaker 2 (22:45):
We used to be very focused on, in DC, "trust but verify," a term that was thrown around for a few decades, but it actually meant something meaningful. We would never go in with blind trust in a business transaction. But there was a degree of trust, an alignment of values or alignment of expectations or mutual benefit, but then you

(23:08):
would have to make sure that that's been actually implemented. So you would have firms or other monitors come in and clarify and build that credibility over time. Um, that's getting stressed, that's being eroded, and that could be a difficult issue, particularly with investments

(23:28):
that are built over a timeline of 10, 20 years.

Speaker 1 (23:34):
Incredible. Incredible to hear. You've built oversight systems, though, from the ground up, for example at UNOPS. What are some of the lessons you learned from that?

Speaker 2 (23:47):
So two big things I would mention. Human nature is ever-present. It looks and feels a little different in this digital age, but unfortunately we are frail and we are prone to error, and if there's opportunity, you will have bad actors.

(24:09):
That's a common truth and that is going to remain. So what you try and do is mitigate the impact of that. So use your resources, in terms of oversight, intervention, in a way that won't go after the one cent in the dollar.

(24:30):
You're going after the dollar in the dollar. You're going after the big-picture items that have very meaningful impact. So that would be the first element.
The second is going to be, never assume that expectations are aligned, because you have to understand the motivation of

(24:51):
different actors about what they're expecting out of a deal or a transaction, and that changes, particularly when you've got governments involved, because of their cycle and politics and so forth. So you've got to constantly re-evaluate that, and that can be hard.

Speaker 1 (25:05):
That can be hard. International donors, multilateral organizations, they're shifting their approach because of politics, because of digital disruption, because of emerging technologies. How do we decipher that, Mark? What are you seeing there?

Speaker 2 (25:23):
We know that we're in a period of disruption, certainly in the international development field and particularly with respect to how the US government wishes to engage in the use of its foreign assistance, which is very meaningful. The US is a major player in many fields, particularly in health and education, even in governance models and so forth.

(25:45):
You name it. They've always had a very key seat at the table, and that's being re-evaluated right now, including the relationship with the UN system, development banks and so forth. But the way things are looking right now is, I think there's an

(26:06):
alignment of values of the current administration with development banks in particular, because it's very explicit what development banks do. They are banks. They deal in transactions. It's very explicit; there are transactions that can be made clear and explicit about where it would provide mutual benefit back to those who are investing in those transactions, including

(26:28):
the US.
We're also seeing a very clear pivot to, I would say, old-school but still very relevant issues like job creation, economic development, economic opportunity, investment in health, investment in education. So those traditional values of the US government still show

(26:53):
through. We still have a bit of time to figure out what that looks like in practice, but we are beginning to see that realignment happen in real time.

Speaker 1 (27:07):
Fascinating. And so impact investing as well. You're heavily involved in impact investing and it's gained so much traction, of course, across the development space. From your perspective, Mark, what's needed to move from good intentions to actual measurable and sustainable impact and

(27:37):
results?

Speaker 2 (27:42):
Let me say first that for the last five-plus years, there's been a very clear statement by the impact investing field: there is plenty of money, there are plenty of problems. Therefore, how do we make sure the money gets to where the problems, or the opportunities, whichever way you want to phrase it, are? How do we make sure that happens? We're further down that road now. I think what we're hearing here in the US is not fully

(28:04):
reflective of what's happening in the rest of the world.
I think what we need to do is redefine what impact means in the context of where we are. So we're in DC today. Impact could mean what's in the interest of the US or what's in the interest of a particular community. I just had a very quick conversation with one of the

(28:25):
participants here about investment in new areas of power generation in Texas. It's still on the agenda. It's still happening. Those transactions have been in the works now for many years and they will remain so, because our populations continue to grow, our demands continue to grow.

(28:45):
So, state by state, we still have our ambitions about what impact we need to achieve. We may be redefining it or restating it. And on the global stage, a lot of other donors and actors have begun to lean in, because there's so much momentum around access to investments,

(29:10):
making those transactions happen.
The momentum has kind of gone beyond the tipping point now. This isn't something that we need to convince people is the right thing to do. It's also financially the right thing to do. So I've noticed in the last three or four months, on the world stage, it's like, okay, we're getting on with

(29:31):
business as usual, is that right?

Speaker 1 (29:33):
That's great, that's incredible, and so you saying that in terms of the updates here is incredible to hear, because you're working with over 70 countries. In your portfolio, there are 193 member states, last time I checked, in the United Nations; you're working with 70, and probably more at any given time.

(29:54):
It fluctuates, I'm sure. How do you navigate tailoring solutions to these countries and nations, and then juxtaposing upon that local context while maintaining global standards and consistency? How do you do that?

Speaker 2 (30:13):
So this is where digitalization has a big part, because of course the zeros and ones are going to be a zero and a one in any context, in any culture. It's how you apply that that will feel and look a little different. So it could be language, it could be the analytics, it could be the output of the analytics. That's what will change at the local level.

(30:34):
But you have to understand what is the problem or the opportunity you are seeking to address through digitalization. So in some places it's purely about providing survival support in healthcare, whereas in other contexts the survival piece is largely taken care of and you're more invested in how we make people thrive, right?

(30:58):
So the context is everything, but the constant is, tech can enable a more efficient and effective and timely kind of solution, but you've got to kind of make it relevant not just to the country but to the individual in that country that you're willing to interact with. So human-centered design remains highly, highly relevant,

(31:23):
even in the digital world. But what that looks like and feels like by country, by sector, is quite varied.

Speaker 1 (31:30):
Let me digress for a second and ask you a question I think a lot of people around the world are wondering. While you're talking about digital disruption and all of the things that come along with it, one of the things that comes along with it is perhaps the need for upskilling, for retraining, because there will be some jobs that will no longer be conducted by humans and it

(31:57):
will be AI-driven. It just won't be necessary. It's kind of like, you had the cart and the horse and that was great. You had a beautiful cart along with it, and along came the automobile. Suddenly, you didn't need that nice cart. You maybe kept the horse. So what are you seeing there? I mean, are you seeing, if the future is already here, which we

(32:19):
believe is the case, is there something that folks aren't ready for, perhaps, that you're helping to sound the alarm on?

Speaker 2 (32:29):
Yes. I think back to last September, the UN General Assembly. Microsoft had an event, and one of their senior vice presidents made a very eloquent and compelling kind of story of where we are now with respect to AI, and he went all the way back to kind of the printing press and kind of looked at elements

(32:51):
of technology enhancement since then. So he's put it into that context. Think of how the printing press changed pretty much all our lives, the world, in such a dramatic fashion. They see it in the same vein, and probably then some, because of the scale and the volume and acceleration that we can have in

(33:11):
the modern world.
I say that because how people engage with the use of AI will not just determine what they will do in their job, but it will determine how that job will ultimately be shaped for the

(33:32):
next person in that job. Does that job exist or not? So it's something we all have to embrace individually, but it's also going to have a significant role in reshaping how workplaces kind of look and feel in three to five years' time. We're not talking decades here, George. I've said to you before,

(33:52):
even at a firm like ours, we've already adjusted our campus recruiting headcount figures for 2029 and 2030. That's how far we're thinking, in terms of not just numbers but the skill set. So if I could make a plea to anyone, it is: bring this all the way

(34:13):
back, not just to university level, third level; we have to go back minimum to high school and probably beyond.

Speaker 1 (34:20):
Go all the way back to elementary. So I guess, if a high school or middle school right now is not embracing AI for their students, big mistake, right? I would ask why. You would ask why? Yeah, fascinating, because even large universities and colleges are not. Some are. And look, three, four years ago...

Speaker 2 (34:42):
I get it, because I think there's a sense of, people are going to use AI to cheat or to plagiarize or just create something that they should be creating themselves, all the things that we understand college to support. But that was only two or three years ago. The world has changed now and, as I gave you that little anecdote about the people who work on my team, how they see

(35:06):
and interact with AI, that future is existing today. That's not going to change, no matter how resistant more experienced people like you and I may be, but that reality is here now.

Speaker 1 (35:21):
You know, just a prediction, and we don't want to single out a single profession or anything else, but will entire professional services, maybe not necessarily Mark's, I don't want to single that out, just, like, will things change so dramatically? Let's pick on lawyers for a moment. We will single out one. What about lawyers?

(35:41):
Is it possible, based on what you're seeing and the landscape, that maybe you won't need to go to a lawyer? Maybe you'll get better legal advice from an AI platform, an app that can draft your will or your home mortgage loan application or whatever it is?

Speaker 2 (36:01):
The way I'd answer it, George, is the technology could certainly do that, it could create that vision. But are we comfortable with that output, with that outcome? Because I could see myself in a position where I'm comfortable that the lawyer I would work with would be 90% driven by what

(36:21):
technology can produce. But I want to have that validation from a person.

Speaker 1 (36:25):
But that's you, that's me. So maybe your colleagues that have the two computers, and the colleague which is AI, would say, well, I don't need to talk to someone. Just like the ATM or the bank branches, where I still like to go in, say hello, I'm here, I'm depositing, or I still write checks. What's that?

Speaker 2 (36:46):
So we've come back full circle to trust. And trust is relative, depending on your perspective. So my trust would be the validation by a human. Somebody else may have a different view, and that's where we are right now; that is evolving in front of our eyes. Are we ready? We have no choice.

(37:08):
It's already here, so we have to embrace this and acknowledge the opportunities, but also the potential pitfalls. As I mentioned earlier, a lot of very smart people have written a lot of ideas of how to best utilize these emerging technologies. Let's not wait to engage on a personal level.

Speaker 1 (37:34):
Private sector. What role do you see the private sector partners, especially in consulting and assurance, playing in advancing inclusive innovation across the development ecosystem?

Speaker 2 (37:49):
I'll make this very quick. We have to interact with technology in a way that will build trust, either in capital markets or the customers, and we need to understand that context of what that means. So is that going to be responsible or socially ethical? That can mean a lot of different things to a lot of different people.

(38:10):
So know your audience and build technology to deliver accordingly, and that will look and feel very different depending on the country and the sector, but the technology will be very consistent.

Speaker 1 (38:23):
And, Mark, just a few years ago, you and I were doing an interview, and I think we were at the UN or at the NASDAQ, and we were talking about 2030. And here we are, just five years away from 2030. It seems like it was a million years ahead. What gives you, Mark Fitzgerald at KPMG, the most hope about

(38:48):
our ability to align technology, policy and capital for global good, Mark Fitzgerald?

Speaker 2 (38:56):
George, I've said many times to you that the Sustainable Development Goals are like a code of ethics for humanity. I still believe that. Largely they're aspirational, and certainly have become so because we've regressed on many of the key metrics, largely because of the pandemic. And then, you know, a lot of governments don't have the same amount of resources or attention to be able to focus on some of

(39:19):
those key metrics. But those needs in the world still exist, and where there is a need in the world, you have to meet the need. That's just a moral obligation we all have. So the private sector is engaged with that, whether they know it or not. What I mean by that is, the private sector is invested in

(39:40):
their communities. Wherever they want to sell whatever they're selling, or engage in what they're producing, they need to have a vibrant community to work with. That's just inherent in their business model. So, as we think out 20, 30 years, the private sector isn't the panacea. It's not going to be the one and only thing to save the world, I don't go to that lofty level, but it is a big, big player in

(40:06):
making sure that we advance on what we need to.

Speaker 1 (40:10):
Mark Fitzgerald, what is your call to action for our global audience here, in your role at KPMG?

Speaker 2 (40:17):
Two things. One would be, on a personal level, understand that generative AI, agentic AI, it's not the future, it's already here. Understand what it means to you individually, understand what it could mean for you and your role, and embrace it. And the second is, whatever role you are in, understand that

(40:38):
that role has already changed, even if you don't feel like it has, and you get to shape that future through the work you're doing today and through that personal engagement and use of technology. So the avoidance of that engagement means a lot of things may be done to you. You have an opportunity to influence what that future looks

(41:00):
like for you individually and also for your role and your sector, etc.

Speaker 1 (41:05):
Mark Fitzgerald, KPMG, leading the way. Thank you so very much. Thank you.