
December 15, 2024 • 16 mins

Athena delves into the ethical challenges surrounding AI adoption, outlining 5 crucial areas for consideration: accountability, bias, data privacy, deep fakes, and job impacts. She emphasises the importance of critical thinking when using AI tools and discusses the potential effects on employment, citing examples like Klarna's significant job cuts due to AI efficiencies. Peppes also explores the broader economic implications of AI adoption, highlighting the need for balanced decision-making that considers both growth opportunities and societal impacts, while stressing the importance of understanding and adapting to these technologies at individual and organisational levels.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Matt Best (00:00):
Welcome back. We're here to continue our

(00:01):
conversation with Athena Peppes, founder of Athena Peppes Consulting and Beacon Thought Leadership. Ethics is such a huge topic, and you mentioned it earlier: is it going to take away people's jobs? Is it going to do various other things? Right, there are already questions there. You mentioned the new regulation around AI, but there are still

(00:22):
questions about the ethics that sit behind that. You know, putting out videos that have got people's faces when they weren't actually there, representing something they wouldn't necessarily believe in or support. What's your perspective and take on that? Is that an area that you're seeing become a kind of heightened problem? I mean, sure, it's going to slow us down a little bit. But what are you seeing in the market

(00:44):
around that ethics challenge?

Athena Peppes (00:47):
Yeah, I'd say it's a huge issue, as you say, and there's so much to think about. I think it can also feel overwhelming. My thing has always been to simplify stuff, right? Because it can feel like, where do I start? I always have in my mind kind of five things to structure a conversation around this and to help think through the different issues. The first is

(01:10):
around accountability. Who is accountable for the information that your AI-enabled bot gives? Now, this might seem like a straightforward question, but there was an interesting case earlier in the year of Air Canada, which basically argued that the bot, which had given mistaken information to one of

(01:30):
the customers who was trying to get information around bereavement fares, was responsible, and that they had no liability to pay the money back. Now, that did not go down well, no, but that was their kind of argument. And there are all sorts of legal implications

(01:51):
around that.

Jonny Adams (01:52):
Do you know the outcome of the case? Did they
get laughed out of court?

Athena Peppes (01:55):
Yes, they did.

Jonny Adams (01:56):
Yeah, yeah. A bit more human.

Athena Peppes (01:58):
Yeah, exactly, they did. But then you can get into more detailed questions, like, well, is it the executives who approved it? Is there any responsibility with the team that designed it? What about if it was supplied from a third party? So there's so much complexity there for companies to figure out.

(02:20):
The second one is around bias. So I use these tools quite a lot, because I personally think they help my productivity immensely. They save me so much time, and I just love experimenting with different things. But this is where the importance of critical thinking always comes in. I can see that they're biased, and I'm sure they're getting better,

(02:41):
or at least I hope so. But I was writing a piece around the economics of AI, and I wanted an image of an economist pondering the future of productivity.

Jonny Adams (02:51):
So where did you go to find the image? That's the
question.

Athena Peppes (02:54):
DALL-E. I used DALL-E, but straight away it gave me a man with white hair, obviously, you know, a middle-aged white man. I was like, okay. And then, because I've had this experience before, I just thought, I wonder what would happen if I swapped the profession. You can prompt it to try and get different things and say, oh, give me something more diverse and so on. But I was like, what

(03:15):
if I just swapped the word economist for nurse? And straight away, it gave me a woman. So of course they're biased, because they're trained on information that we've created, and we all come with our own biases. But do we have a responsibility, as an organization, to not perpetuate those biases as well, if we're using those

(03:38):
tools? Then there are the data privacy issues that we've touched on as well, and how you get around those. Deep fakes would be the fourth one, and I think that's something that's particularly concerning, because there's so much synthetic media at the moment, and the quality of it is amazing. One that might interest you: have you seen the Google NotebookLM tool that now

(04:02):
creates podcasts from articles?

Jonny Adams (04:04):
No, but there we go.

Athena Peppes (04:10):
But that's not to say that it all will be. You know, that also makes you think about your own job. How can you use it to do your job differently, perhaps? Or perhaps, as humans, we value the fact that we all got together here and had this conversation in person. We put greater value on this than on something that was

(04:31):
just artificially generated. And then the fifth one is around jobs. And I think that's a huge one, because it sometimes doesn't get mentioned very much in the light of the productivity benefits, of which there are many. But I think it's better not to hide away from the conversation and to think about what that would mean for the impact on your people. Will

(04:53):
that mean job losses? And if it does, how do you handle that? What's your responsibility to upskill your people and help them understand the technology better?

Jonny Adams (05:03):
On that point, I was curious about that last point, and I suppose it's a really hot topic for anyone involved in a growth role, whether that's, you know, marketing, sales, consultancy, customer success, whatever that may be. When you think about that job piece, not naming names, if we think about CEOs, the last 10 years in

(05:24):
software as a service has been growth at all costs. So they would pretty much do anything to get to where they need to get to, especially if they're VC backed. It's quite aggressive. Do you think that CEOs and those types of C-suite will actually continue to think about growth at all costs, at the expense of job losses at maybe, sort of, may I say, the middle to lower tier? Or do you think that actually there's going to be

(05:46):
some of that ethical sort of input, where actually, you know, come on guys, you've got to not worry too much about growth, and actually you've got to think about society and the people within your function? Do you have a theme or a trend that people are talking about, or maybe it's a bit too early, I'm not too sure.

Athena Peppes (05:59):
Yeah, it's quite diverse, I would say. I think you're right that if we looked at the past, say, for instance, when smartphones and social media kind of became commonplace, there were a lot of issues around, maybe not necessarily jobs, but kind of parallel issues that we could learn from. And what we saw is that we were very slow. Our

(06:21):
institutions were very slow to adapt to those kinds of issues and help people. Automation would be one example of that, right? Loads of jobs got automated. I think now the kinds of jobs that will be affected are not just manual, routine jobs, but also knowledge workers. And perhaps that's why we are seeing much more of a

(06:45):
discussion, because there's a feeling that, oh, this is becoming a lot bigger in terms of the impact on jobs. I don't know if you can generalize about how CEOs are seeing this. I think the CEO of Klarna, the payments company, said that they got rid of 1,200 jobs because

(07:06):
generative AI was helping their marketing and sales teams do things so much faster. And he said, actually, that they'll only be able to function with 2,000 people, I think, as opposed to perhaps nearly double that now, or something like that. But there's always the economic incentive of the thing. There's an IPO coming up and,

(07:26):
yes, good. So there are issues like that.

Jonny Adams (07:29):
Is it a language pattern that creates fear in the current sort of job market, where, you know, there are job cuts? Or are people being redeployed well enough? You know, on one hand I get the efficiency model, because I'm trying to sit there, if I was a CEO, and think: I've got the pressure of the board, I've got the pressure of investors. Do you know what? I'm five years away from exiting, going to retire. Would I go for it? You

(07:52):
know? Would I cut headcount and use some type of AI, really? Because that's what everyone's telling me, that AI is going to solve my efficiency problems. I don't know. I've been noodling on that one, and I guess I don't know which way people would turn.

Matt Best (08:05):
I sort of gear towards this view, because there's a point at which AI, and I'm picking on AI there, obviously there are others, is already helping in finding efficiencies that make people more productive, which can result in more growth. But I think to your point as well, that growth at all costs: where does that

(08:25):
sort of kick in and start to become a problem? Or are we just going to find this natural equilibrium, and it just becomes, hey, look, there's going to be a sort of reassignment? I think my biggest concern is the pace. So with other transformational changes we think about in other industries in the past, sort of automation in manufacturing,

(08:47):
for example, that was probably slower, maybe more expensive, than, say, AI could be, where everything's in the palm of your hand much, much faster. That means an organization could tomorrow say, we're going to, you know, the example that you shared there, Athena, well, we can cut 2,000 jobs almost overnight. And I think it's the

(09:07):
pace of that that might be the thing that hurts us. Because behind all of that, you've got the knock-on effect on education and on the journey that, you know, the next generations are going on, and the enabler of, okay, we're making sure that we're training in the right things. You know, how prevalent is AI in education, in schools, at the moment? Or is it just being left to kids, just learning it in the way

(09:30):
that they do? And I think that's probably the bit that concerns me, sorry: the pace is probably different to, say, previous transformations.

Athena Peppes (09:38):
Yeah, and I think, you know, the things you touched on as well are about the broader perspective, right? So it helps to think of this from a macroeconomic perspective. CEOs, I think, would generally think about it in the context of the organization, but the issues that come up, like you mentioned, around education, the future, and how we plan for

(09:59):
our economy and our companies, those are much bigger issues, and it's not as commonplace to find CEOs that might have that vision. Arguably, some of that might come from us, right? What expectations do we have of these organizations about their responsibility to actually create jobs? So in the economics

(10:22):
field, there's a huge debate around the impact of this on jobs. I'm not sure there's a conclusion yet. There's a bigger argument, because most of the ways they would estimate what the impact would be are based on previous waves of change, and perhaps that data is not a good enough

(10:42):
predictor of what's coming in the future. So does that need an AI model to just work it out? Yeah, but you see the kind of challenge with doing that. There might be new jobs being created, right? There definitely will be. There are loads of philosophers now being hired by big companies to help them think

(11:04):
through these kinds of questions that might come up. So yeah, it's a huge topic. I just feel that, as individuals, the more we understand the technology, the more we use it, the more we learn about it, the more prepared we are to influence that change as well, whether as a consumer, you know, as an employee, as a citizen as

(11:29):
well.

Matt Best (11:29):
It's connecting those things, isn't it? And I think that's going to be the challenge. And we're in danger of diving into something really kind of political here, but if I look at corporations' response to, you know, climate action planning and that kind of thing, it's not been all that proactive, right? It's very much, okay, I'm forced to now do this. And I guess the concern I maybe have, and this is my personal opinion,

(11:50):
is that we'll have that same approach to some of this kind of technology, and it won't be supported. That's why I asked the question around policy makers and their role in this; I think there's a really important part to play there.

Athena Peppes (12:03):
Absolutely. And again, they have the same actual trade-off that the C-suite executives have, right? Because arguably there's a huge case to be made for how you use AI in the public sector, which definitely needs improvements in terms of efficiency and the services that need to be provided to people. So the opportunity there is huge. But then how do you

(12:27):
kind of balance that with what it means for jobs? And arguably they have a double role to play, in the sense of both balancing that as the public sector, but also, in terms of the policy makers within that, thinking about that for the whole economy.

Jonny Adams (12:42):
I'm really curious. We've talked about AI, I'd say, in general terms, not just today but in every space that I can even think of. So what's beyond AI? And I know you think about that and talk about that a lot, but, you know, even past all of this, what's going on now? What's the next thing? Do you have an indication or a hypothesis around that? Is that a fair question to ask?

Athena Peppes (13:03):
Well, for one thing, I think the whole topic of artificial intelligence definitely has more room. There's a lot of discussion around this being hype and so on. And of course there's always a little bit of hype when something new comes to the forefront, right? So there is, but the opportunity is definitely there. It's huge. You see some companies appointing chief AI officers into

(13:29):
the C-suite. So in the same way that, in the past, if we look at the topic of sustainability, it used to be one person doing what was called CSR back then, which no one used to take seriously, to now, you know, whole teams focusing on how organizations can deliver on those goals. Something similar is happening with AI. I think what people, what individuals, worry

(13:52):
about a lot is this idea of general artificial intelligence, so this technology being able to completely replicate what we are as humans. But there's just so much that needs to happen for us to get there that I think we possibly won't be around ourselves to discuss that question.

Jonny Adams (14:13):
So I don't need to fear anything. I can feel quite
confident that we're going to beokay.

Athena Peppes (14:16):
Yeah, but I think it's about finding that balance. As an individual, as, you know, a leader, as a team member, whoever you might be, whatever role or hat you might be wearing, what's next is finding that balance between making the most of the growth opportunity that is definitely there, the stuff that these technologies can do that can just create new

(14:38):
content in seconds. We know this. But how can you make sure that you do that in a way that's forward-thinking enough that you don't get caught out by some of the risks or challenges or other issues around it?

Matt Best (14:52):
I think that's a fantastic place to end, a fantastic kind of final thought. Athena, thank you so much for sharing that. We've had a couple of other conversations recently on the podcast about the importance of patience, and it feels like we actually need to be a bit patient. We might even need to slow down that decision-making process and be a bit more considered, perhaps, but with that North Star goal of, right, well, what could this do for us

(15:14):
to help maximize growth for us personally and also for our businesses?

Jonny Adams (15:17):
Yeah. I mean, I think I take your last point.
I'm going to capture theopportunity, you know, and thank
you.

Matt Best (15:22):
Seize the day?

Jonny Adams (15:23):
Yeah, I feel a bit more calm in the circumstances. So thank you so much for sharing some great insights today.

Athena Peppes (15:28):
Yeah, it's been a pleasure to be here. Thank you.