Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Jean Gomes (00:14):
As technology becomes ever more compelling, powerful and integrated into our lives, more of what was real experience is being mediated by screens and speakers. For what we gain, we must also question what we are losing and what ultimately that costs us for our future. This is the work of
(00:34):
Christine Rosen, most recently explored in her new book, The Extinction of Experience. Tune into an important conversation on The Evolving Leader.
Scott Allender (00:44):
Hi folks. Welcome to The Evolving Leader, the show born from the belief that we need deeper, more accountable and more human leadership to confront the world's biggest challenges. I'm Scott Allender.
Jean Gomes (00:54):
and I'm Jean Gomes.
Scott Allender (00:55):
How are you feeling today, Mr. Gomes?
Jean Gomes (00:57):
Well, I'm feeling on the cusp of something. Because even though everybody for the last month has probably been, you know, on holiday and so on, I think I've never worked so hard getting ready for the set of challenges facing our business and our clients and our research. So I'm feeling very excited about all of that, but I'm also feeling, in a strange
(01:21):
way, more tired as we start September than I've probably ever felt before. So I need to listen to that and do something about it, so I'm good to go for the rest of the year. But other than that, I'm feeling very, very motivated and excited. How are you feeling, Scott?
Scott Allender (01:38):
Well, I feel like I need you to be rested, because I have a lot that depends on you, so I need you to take care of yourself. I'm feeling really energised today, excited about this conversation we're about to have, and so just feeling a lot of gratitude. Yeah, a little bit fatigued, but all on the positive side
(01:59):
overall. Today we're joined by Dr Christine Rosen. Dr Christine Rosen is a senior fellow at the American Enterprise Institute, where she focuses on American history, society and culture, technology and feminism. She's a prolific author, and we're delighted to be talking to her today about her new book, The
(02:20):
Extinction of Experience. Christine, welcome to The Evolving Leader. Thank you so much for having me. Christine, welcome to the show. How are you feeling today?
Christine Rosen (02:30):
You know, I like that you ask each other that question, and then I thought, oh, I hope they don't ask me. But I will say, at this time of year, because kids are going back to school, things like that, I've always gotten a little burst of energy, even though it's been a long time since I've gone back to school in the fall. I like the change of seasons. I'm in Washington, DC. It's just
(02:50):
starting to get cool again, and I'm excited. I love the fall, and I like the sense of enthusiasm when you see the kids walking to school in anticipation, because by mid-November they're all trudging along like they've got the weight of the world on their backs. But no, right now, I'm feeling very, very happy.
Scott Allender (03:12):
Excellent. Well, The Extinction of Experience is a really striking phrase and title, and it covers a big shift in society. Can we start with: what do you mean by the extinction of experience? What are we losing?
Christine Rosen (03:32):
So I borrowed the phrase "the extinction of experience" from a naturalist named Robert Michael Pyle, who, years ago, was worried about young children not having an experience in nature, not being out in the world, spending too much time indoors or staring at a television screen, and not coming to appreciate and understand the importance of the natural world. And it struck me when I read that, that it's
(03:53):
really not only children who are having that experience; it's all of us. When we think about how often we put something between ourselves and other people and between ourselves and the rest of the world, usually that thing is some form of a screen, some form of technology, some sort of communication platform. And I began to worry that it was changing our habits of mind, changing our expectations of
(04:15):
each other and having some ill effects above and beyond the ones that we have seen in a sort of tech backlash over the past five to ten years. I worried that it was changing our expectations of what it means to be human and to have human experiences. So that prompted me to start thinking about: what are human experiences? Why do we need to distinguish
(04:36):
between the human and the non-human, and do we need to defend some of those traditional human experiences? Because it's not just that they're good for us individually; they help us form stronger families and stronger communities. So the book isn't a binary kind of attack on technology at all. It's much more practical than that in terms of helping us
(04:57):
think differently about, you know, what it is to be human. I mean, we're using screens right now to mediate our conversation, which we couldn't otherwise do. So there's tremendous value that you acknowledge in that. But I'm interested, when you think back,
Jean Gomes (05:13):
in your theorising on this, was there a moment where you kind of went, I need to write this book? Was it something specific, or is it a cumulative set of things? What was going on for you there?
Christine Rosen (05:28):
I would say it's cumulative, but a few things stand out to me. I'm a Gen Xer, so, you know, I grew up without all this stuff and then came to it as a cynical adult. And of course, we are the best generation; everyone overlooks our important contributions to everything. So, you know, the cynicism remains. But I was struck one day while in a public
(05:48):
space in Washington, DC, looking around and just noticing that I was the only person noticing things. Everyone else was either rushing around, headphones in, you know, staring at a phone. And this was about ten years ago, so it's only gotten worse since then, in terms of how people mediate their experience. And it struck me as kind of tragic and very lonely that I was in this
(06:12):
crowded public space with lots of other people. I love to people-watch, so I was really kind of intensely looking at what other people were doing, and everybody was going through life as if they weren't physically there. And it was that lack of a sense of physical embodiment. I mean, they're bumping into each other because they're not paying attention, and all the frustrations of, you know, bumping into someone who's
(06:32):
staring at their phone instead of what's in front of them. But to me, it also was this huge missed opportunity to just understand the world we live in and the world we share. Everyone was trying to escape from that shared reality into their own personal reality, and that really struck me as something that we shouldn't let happen without comment. And so I guess that really prompted me to start looking at other areas of
(06:54):
life where we have cast aside face-to-face human interaction, thinking we're improving speed, convenience, efficiency and connection with the technology, but where we might actually not have improved what it means to be in a profitable, productive, flourishing human relationship with other people.
Scott Allender (07:13):
I'm supremely interested in this. I'm thinking philosophers have been sort of talking about humans' propensity to be in a kind of waking-sleep state, you know, for hundreds of years, right? So it seems like technology is now only exponentially advancing that sort of predicament. In your research, I'm sure it also includes, you know, the dopamine
(07:35):
hit we get from quick interactions with technology and being on our phones. Is that the primary reason why we're so willing to trade off, or is there other stuff going on? Why are people willing to give up that sort of interpersonal interaction and just go to the device?
Christine Rosen (07:52):
So certainly the dopamine hit is real, and there have been a lot of interesting studies of what that does to us, how these are like little mini slot machines in our pockets that keep us coming back for that intermittent reward system. But I think what fascinates me is how it transforms our expectations and habits of mind. So what I mean is that if we spend a lot of time with this
(08:13):
little dopamine machine, getting those hits and rewards, then when we don't have it and we're interacting with another, difficult person, how do we handle that? Because we haven't practised that skill as much, because most of our interactions are under our own control. When they're through a screen, we can escape them: tap, you know, we can claim to have an internet problem and get cut off, all these other things. We have a sense of control. So two things
(08:36):
happen. I think we try to exercise that sense of control over other people, and that's always a disaster. And we also don't spend time doing what you began this podcast asking, which is thinking about how we actually feel in a given moment. People don't sit with their feelings and try to understand them as much. It's particularly true of the young, because they can always distract themselves. But part of being a fully formed human being is to be able to sit
(08:59):
and go, wait, am I angry or am I sad? Am I actually anxious, or am I worried about something I should be worried about? Processing one's own emotions is part of being a grown-up, and it's something that takes practice, and that practice begins at a very young age. If you can constantly distract yourself from those often uncomfortable moments, how do you ever practise them? How do
(09:21):
you grow up to be someone who can identify their own feelings in a moment that's perhaps tense or beautiful? I think the escape from reality is the part of these devices that worries me the most.
Jean Gomes (09:34):
Yeah, and as we escape from reality, we start to kind of weaken what makes us human in many ways: the embodied cognition that gives us advantage, our ability for empathy, for understanding, for insight, for creativity. The list of things that allow us to actually have advantage in an AI world is
(09:58):
endless, and at the very moment when we most need those things, we are weakening them in ourselves. So what I was fascinated by, reading your work, was that what you're doing here is really deepening our understanding of those qualities, and what we're losing and how to reclaim them. So we haven't got time to cover everything you've
(10:20):
researched in that, but if you could help us kind of navigate this a bit: say I'm somebody who is at the start of my life in business or in education, and AI is going to automate a whole bunch of things that I'm going to do. How do I prevent myself from becoming less? How do I become
(10:43):
more?
Christine Rosen (10:44):
That's a great question, because everyone worries about the kind of mass de-skilling we're seeing with AI and the outsourcing to AI. I worry about how we've already de-skilled ourselves in those things we know without knowing why we know them, which is the embodied cognition, the embodied way that we read each other's facial expressions or hand gestures, or why you can walk into a
(11:06):
tiny metal cube with a bunch of strangers, and because everybody gives a little nod in that elevator before the doors close, you know that you're safe, or you maybe don't get on that elevator because someone looks a little dodgy. These are all things that we're hardwired through thousands of years of evolutionary development to know. And I think one of the conceits of our current technology is to try to train us
(11:28):
to mistrust those things and to think that those skills aren't actually useful or meaningful anymore, because AI can do that for you. AI can gather all this information about how often you spoke in that last meeting and what that means for your state of mind, and all that is information. But I'm really interested in the stuff that is human. We produce this information, but to understand how we can best use it,
(11:50):
particularly in a business setting, you have to understand what it means to be a person and how to interact with other people, and that is face-to-face interaction. Now, you can't always be face to face, but the kind of de-skilling that I think we should be concerned about with AI is the idea that it's no longer just a tool that can help us do certain things better, but that it should replace some of our own
(12:11):
judgments about what it means to assess another person's character. You see this already with how resumés are actually vetted. You know, a human being doesn't even set eyes on a resumé in some of these corporations until it's gone through several different cullings. And that worries me, because people are weird and quirky, and some of the things that an AI might have been designed to weed out might be
(12:33):
precisely what your organisation needs, because it is that quirky, creative bit that's going to bring to the table something new, a new idea that's being surfaced by a human's quirkiness, not by an algorithm's design.
Scott Allender (12:46):
So in this sort of less embodied state that you're talking about, I'm curious what you're finding about the impacts on our wellbeing. We're talking about the skill outsourcing and potentially over-outsourcing certain things, like the recruitment example, but in the human piece of it that you're referring to, what
(13:08):
are some of the detrimental impacts on wellbeing?
Christine Rosen (13:12):
This is going to be a broad brush stroke, so forgive me, because there are always exceptions to these sorts of statements, but I think we're all becoming habituated to settling for much lower-quality interactions in our lives, and I think we see that reflected in the concern about the loneliness epidemic, which is not quite
(13:34):
right. We do not suffer from a loneliness epidemic; we suffer from self-isolation, which is different. We are choosing to be alone, because we don't feel alone when we're engaged on a social media platform or, you know, watching YouTube videos or endless Instagram reels. But that's ersatz. It's a substitute for real interaction, but it gives us
(13:57):
just enough of a sense of connection for a little while, and it's so easy and it's so convenient, and it's so available to us on demand, and we don't have to brush our hair and put on decent clothes and leave the house to do it. And so we start to think, well, it's fine. It's just fine. And the more we choose that, and the less we slightly inconvenience ourselves and go out and meet a friend at a restaurant and sit
(14:18):
across the table from each other, look at each other's faces and interact, the more that becomes a bigger challenge, emotionally, for a lot of people, a lot of young people who feel it's a real thing if you walk up to them unannounced without a text message announcing your arrival, or, good Lord, if you call them on a telephone. And I know I'm kind of saying it mockingly, but
(14:39):
I have kids in this generation, and I actually say it with a lot of sympathy: these are things that they haven't practised. They need to practise these human skills. And I worry that we're settling for massive quantities of lower-quality experiences. And that does change us. It changes our expectations for ourselves, and it changes our interactions with each other.
Scott Allender (14:59):
I feel that. I've got kids as well in this generation, but, you know, my phone rings and I sometimes pause, like, why are you calling me? It's too much, too aggressive, too assertive. Text me like a normal person.
Jean Gomes (15:13):
A different issue, Scott; we'll come to that. I'll tell you after the show. How is this reshaping our minds, potentially? What is actually going on for us?
Christine Rosen (15:27):
So I think one of the areas that really concerns me is the change in how we understand time, like chronological time, and the formation of memory. So, as you know, because you read the book, I became a little obsessed with handwriting. I have a chapter which delves deeply into handwriting. I thought I was
(15:47):
doing this because I'm an old fogey who likes to do things the old-fashioned way, and I made my kids learn to write cursive and all this. But in fact, once you start looking at some of the fascinating neuroscience research, in particular on writing, the mind-body connection with regard to handwriting is implicated in all kinds of things, from the formation of short-term and long-term memories to literacy. It's
(16:09):
just that our brains are these fascinating puzzles that we've only figured out a small piece of. And with handwriting, something that many schools said, oh, we don't need this anymore, it's not a skill that the modern child should have, we'll just teach them block letters and move straight onto keyboards, it turns out that learning how to write in cursive brings this whole other layer of memory formation. So again, for me,
(16:33):
that became this example of something where I thought, oh, you know, everyone should learn to write by hand. Because I'm left-handed, I had to suffer through, like, dragging my hand across the page, and once I was able to write fairly decently it was a really proud accomplishment. But it turns out that it actually does change our minds. It changes how we understand the past, how we understand and process what's
(16:55):
right in front of us. And when you think long term about, say, a nation's cultural memory, what's that going to look like in an age where images can be totally manipulated or created from whole cloth, where you can't really trust a lot of the information, and we haven't raised generations of younger people to understand what it means to form their own memories and to understand collective
(17:17):
memory? So these are all the sorts of ways in which, I think, on a broad scale, we haven't asked the right questions. We just kind of push forward, assuming everything will turn out fine, and mostly it has. But there are some places where I think we can say, maybe we stop here with this, or maybe we reintroduce cursive handwriting, because we now know it has
(17:38):
these other valuable things toteach us.
Scott Allender (17:42):
So historically, you know, with every new technology there's this sort of fear that comes with it, right? So the phone would make us more isolated, television too. I know I was told that playing video games was going to rot my brain and make me antisocial, you know, all the things. What is different about this moment in
Christine Rosen (18:02):
time? So the difference now is the scope, the scale and the speed of what we can do with these things. And the speed is particularly important, because with each new technology that was culturally disruptive, the telephone, the television, there was then a lag of time where not everybody had one for a while,
(18:24):
and then once there was sort of mass adoption, there had been some time for norms to develop. So people learned how to answer a phone, how to take a call; there was a little bit of delay. And we humans, our evolutionary history teaches us, you know, we still have a lizard brain sometimes. That's why we reach for the phone, thinking, oh, is someone calling me? We need that time to
(18:47):
develop new norms and new habits and to try to cope with that environment. These technologies that we all now carry around in our pockets, the first thing we reach for in the morning and the last thing we touch at night in many people's cases, have not been with us that long, and yet they are in ubiquitous use throughout the day, and not just at work but in our private lives, literally on our bodies. If you wear an Apple Watch or an Oura ring or any of these
(19:08):
tracking devices, we have brought them right onto our human bodies and wear them all the time and use them all the time, and the norms have not had time to develop and catch up. And I think that's why you do see a lot of backlash; you see a lot of overreaction. You saw that with video games earlier too, but we should be cognizant of that fact and not treat every negative reaction as just a
(19:33):
Luddite rant. I'm not a Luddite. I get called one a lot. I use this stuff every day, but I want to always ask the questions: how is this changing my life for the better, and what is it taking away? What experience do I no longer have because I do it this way? And in that sense, it's like the Old Order Amish. You know, they do use some tools, but they choose them extremely carefully, because they don't want to
(19:57):
undermine their community's broader purpose and their values. And we don't do that enough. We just sort of say, oh my word, look at this cool new phone, and the next thing you know, you've spent three hours playing Candy Crush. So I think we actually could be a little more Amish in how we approach these new things, not to reject them entirely, but to make sure they align with our values.
Jean Gomes (20:18):
And AI takes this to a whole new level. Can you talk to us about how you're thinking about AI, both in your own use and in your work?
Christine Rosen (20:31):
So I've had a couple of off-the-record seminars with AI developers: people who develop the tools, people who use the tools, and then some of us who are a little more sceptical of some of the tools. And what's been productive there is to see when humans and AI are used together to solve a problem that humans alone could solve, but not as well, recognising that you do need the partnership
(20:54):
there. So radiologists, for example, can now use AI to scan things and find potentially cancerous cells that the human eye and the well-trained radiologist alone couldn't. Now, I think there are plenty of people in Silicon Valley who would say, great, we don't need these expensive radiologists anymore, we'll just let the AI do it. That is the
(21:16):
creativity and an art to themedical science of figuring out
you know, each person's personscan so AI used as a very
effective, limited tool bypeople who know what they're
doing already. That's fantastic.
That is, that is bringing justorders of magnitude of power to
decision making. Where itworries me is when it starts to
outsource the development ofskills. And so the perfect
(21:39):
example here is ChatGPT with kids who are learning how to write. I had a long argument with someone recently, a very successful businessman, who was like, you writers, I don't know what you're talking about; my kids write these amazing papers. And he explained to me how ChatGPT does a draft and then they edit it. And I said, well, ChatGPT is turning your children into beautiful editors, but your children can't write. He got
(22:01):
very upset, and I said, give them a blank screen or a blank piece of paper and tell them to write a story; see if they can do it. Then you know that they can write. Because, as someone who writes for a living, I can tell you there's no scarier moment than that blank screen and the blinking cursor, or the blank sheet of paper. So when we think that these AI
(22:22):
tools should substitute for learning that tough skill, that difficult skill of analysing and thinking, because writing is thinking; if you're a human being, that's how you think. If you allow ChatGPT to do the first draft, you're not thinking in the same way anymore. And then, finally, I worry about the substitution of, you know, the chatbot-style AIs that want to
(22:43):
substitute for human relationships a sycophantic AI relationship that tells you what you want to hear, is never exhausted, is never tired, never has problems of its own that you might need to help it solve. And that, I think, can condition people, particularly younger people who are learning about emotional development, to have different expectations for human relationships if they spend a
(23:04):
lot of time in that kind ofinteraction.
Scott Allender (23:13):
What additional sorts of considerations should leaders of organisations be thinking about, from an ethics perspective, when adopting technologies, adopting AI, bringing that
Christine Rosen (23:24):
in? So whether it's AI or even just some of the older surveillance technologies. I teach college students every summer at a seminar we do at AEI, and a lot of them are about to graduate and start a new job; we have a lot who are, you know, recently graduated from college. And I always say, when you get your badge, if you're
(23:45):
going to work for an organisation and they give you that badge to get in and out of the building, just ask the HR rep what else it monitors. Every year I do this, and I've had several students, one who went to work for a very large bank in New York, who emailed me and said, you could not believe what they are tracking. They're monitoring how often I speak in meetings, and where I go in the building, and who my badge pings
(24:06):
off of, and other people in the building, so they know who I'm interacting with. He was astonished. And so I said, well, this is one of the things where, whether you're an employee but especially if you're a leader, you need some transparency about how much surveillance is actually happening. Because without that, he immediately started to mistrust his new employers: why are they tracking me? Don't they
(24:26):
trust me? So the trust, transparency and surveillance aspects right now in many organisations are not healthy. So I would say that's one thing. The other thing is to be straightforward with people whose jobs might actually end up being eliminated by a technology, particularly with AI. We're seeing this with a lot of entry-level white-collar work that will probably
(24:49):
disappear in the next five to ten years. So if you're the head of a business and you have a lot of those people on staff, you should already be having those conversations about skills retraining: okay, do you know how to use the AI? Is there a way that we can integrate it to help you do better, so that you're not eliminated? I mean, these sorts of conversations should be happening. And I think there's a lot of fear in leadership of panicking the masses of
(25:13):
employees. And that's too bad, because the best integration of new technologies, and I do think we see this a lot in the medical field, is when the tool is seen as a bright opportunity to improve the broader mission: in the case of radiologists, finding cancer, or in the case of doctors who have a new robotic technique, healing more people. I
(25:34):
think every leader of every business needs to ask: what is our mission, and where are humans absolutely irreplaceable? And where they are replaceable by AI, what are the other human things that can help them fill that void? Because that's your responsibility, I think, if you're leading an organisation.
Jean Gomes (25:51):
What comes out across the course of the book, for me, is a kind of call to arms to strengthen all of these different abilities that we have as human beings. Because in an AI world, as you say, there is a reality: organisations are not going to just choose a human-centred approach. They're going
(26:12):
to operationally reduce the costs of the business as much as they can using AI. And if you think about what many people do today, it's not very demanding cognitively or even socially. They pass around other people's pieces of information, copying and pasting it, you know, they're attending to emails, not really adding an awful lot of value, and so on. When all that work
(26:34):
goes away and you're left with, well, what do I do now? The organisation is going to ask people to do more difficult things. It's going to ask them to solve problems that are harder. It's going to ask them to have difficult conversations. Work is going to get cognitively and socially more demanding, potentially, for a lot of people. So from your perspective, what advice would you give people to think about
(26:57):
how they actually succeed in that world? How do they improve, you know, the qualities they have so they're ready for this?
Christine Rosen (27:06):
Well, it's a difficult question, because we're several generations in now to training people in habits of mind that do the opposite of that, that actually make them more machine-like. And, you know, if you talk to anyone in coding, which is another area that's been completely upended: everyone was a computer science major, they were guaranteed a job out of college, and they would make pretty good money and do all the stuff. Well,
(27:28):
those jobs are just gone. So what do you do if you haven't trained in the human skills to find something else, whether it's creative problem-solving or, far more important, managing humans, dealing with other people? Those are the sorts of jobs that AI is not going to be able to do. And as long as there's an identifiable uncanny valley, where humans look at even very sophisticated AI and some sort of, you know, humanised robot, and we can still tell it's
(27:52):
not human, and it freaks us out, that's a good thing. Until that moment, we still need those human skills. So if you have the kind of job where half of what you do is paper-pushing, that's going to be outsourced. Think about the things you do that cannot be replaced. And this is where I sound cheesy, but I'm like, where can you really defend the human in the work that you do? What is that? And not just the human, but the uniquely human
(28:15):
thing you do. So, whether that's, you know, maybe you're very good at getting people together and getting them to agree on something. There are these AI programmes that claim they can do that with people who disagree, but to get a bunch of people who are arguing about something to agree, you need that mediator. It's like, if you've ever done jury service, not necessarily the person who ends up getting chosen as the foreman, but there's always that person on a jury who kind of brings
(28:37):
everybody to the centre of the table so a verdict can be reached. Every organisation has those people. They're not always using their skills wisely, though, because they don't get to practise them. Those sorts of things have to be identified anew. And again, that shouldn't be seen as a panicked way to keep people employed, but as a real opportunity, as you say, to give people more challenging and more creative work than they used to
(28:58):
do before.
Scott Allender (29:01):
What surprised you most in your research? I'm curious, as you were writing this book, what you came across that was particularly surprising.
Christine Rosen (29:10):
So I think the thing that was most surprising, because it was most concerning, was the lack of patience, just how impatient our culture has become. And a lot of my friends in the Econ department at AEI were like, no, you want people who are impatient and want things on demand; this fuels innovation, and it's good for the economy.
(29:30):
And they were giving me that whole spiel, and I'm like, okay, there's something to that. But when it comes to long-term flourishing, long-term success in life, long-term strategy and planning, and seeing something complicated and large all the way through to the end, when you think about our politics and sort of long-term planning for social projects and what we need to do as nations, you have to have people who are
(29:54):
willing to be patient, and thatis something I started by
looking at rates of road rage.
Because I'm like, boy, peoplereally. I live in an area that
has really terrible drivers, andeveryone complains about the
drivers. And I'm like, but Iwonder if that's just me getting
older. No, it turns outactually, road rage is on the
increase. Air rage is on theincrease. People are more
hostile and impatient in public.
There. All these ways we canmeasure the change in the
(30:15):
behaviour and the technologyisn't entirely to blame. All
kinds of stresses in life drivepeople into that sort of
behaviour, but there's a kind ofweird acceptance of it now that
you should never be bored. Youshould always have some way to
entertain yourself. And that'sthe part of it that I didn't see
talked about enough. And whatthat led me, personally to
realise is that every time I hada free interstitial moment, I
(30:38):
was picking up my phone tocheck, text, email, whatever,
and I don't even use socialmedia, so I didn't even have
social media as a draw, and Iwas still always picking up that
phone. And so to realise thatwe're filling all these little
minutes of the day that overtime, add up to your life with
that kind of interaction, ratherthan, say, interacting with
another person, noticing yoursurroundings, letting your mind
(30:59):
wander to daydream, to do allthe kinds of things that we used
to do because we had no optionnow, you have to actively choose
to put the phone down and try todo something else. I do think
that that was, to me, the mostsurprising thing in my own life,
but also in seeing howimpatience and on demand
thinking has changed our sharedsocial space, because I think we
(31:20):
are a lot less kind andempathetic and thoughtful of
each other in public space thesedays. I
Jean Gomes (31:26):
just wanted to go
back to something you mentioned
earlier on, which is the Amishthat use a value driven filter
for new technology, whilst amodern company can't necessarily
go back to that kind of world,what lessons can they draw,
practically from that approach?
Do you think,
Christine Rosen (31:45):
Well, I think, yes, we're all going to continue using zippers and driving places, so we're not going to go Amish. But I think what you can do is this: every organisation and its leader can sit down and, first of all, make sure they have clearly defined values, you know, mission statements. Most businesses have those. But then there's a second step, because so much technology is integrated into how everybody does business
(32:05):
these days. And that's to say, when a new thing comes up, what is the decision making process we have? What are the questions we ask before we adopt or reject a new thing? And that seems simple, but many businesses don't do this. Many educational systems don't do this. This is how bad ed tech ends up in everybody's kids' schools. But there are some models for how to do that. Jacques Ellul, a technology theorist of
(32:29):
the 20th century, had a whole list of questions to ask about technology, and he broke them out into, like, moral questions, philosophical questions, environmental questions. Each organisation should sit down and say, if our value is X and we introduce a new thing, how will this new thing change how we understand our values? So if you are a human-centred company that really invests heavily in your employees, because you care
(32:51):
about, you know, their well-being and what they can do for the company, then if you introduce an AI chatbot to interact, say, between divisions, or to summarise meetings, what's the human skill that disappears there, if you care about humans? Now, many companies will say, well, it was something that wasn't that valuable, and, you know, we outsource that already, and they'll go with the chatbot.
(33:12):
But I think some places, if they stop and think about it, will go, well, you know, Susie from HR loves to convene these meetings, because then she can see all the people from the different divisions. And after the meeting, they come up to her and say, you know, we have this problem in our division with this guy or this gal. And things happen on a human scale if you bring people together. So each leader has to go through the same set of questions, whether it's a new chatbot,
(33:35):
whether it's a new, I don't know, outreach programme, whatever it is. However your organisation uses technology, you should have a list of questions you ask of every new thing, because every new thing is not always going to be an improvement if it doesn't redound to the values that you started with. And we just don't ask, because we love technology. It's so seductive and so amazing. And I mean, I'm guilty of this too, but you've got to ask those
(33:58):
questions. They won't be the Amish questions, but they should have some of the Amish principles about community: what you value, what might undermine those values. That's what to ask.
Jean Gomes (34:10):
How are you seeing this in your world, in academia? What are you getting right and wrong there, do you think?
Christine Rosen (34:17):
So it's been fascinating to me to see a lot of my friends return to... in the United States, we used to do all our exams in a blue book. You know, you sit down and do your final exam, writing it all out. And I have a lot of friends, particularly in the humanities, I'm a historian by training, who are using blue books again, because it avoids having to deal with kids using ChatGPT to create answers on take-home tests. So that's been
(34:40):
fascinating to me, because they all then have to struggle through the very poor handwriting of all of their undergraduates, who have not practised handwriting in a while. So you see a kind of weird retro reversion to things that will maintain the integrity of the classroom. Scholarly research has been transformed in some negative ways. These tools are extremely effective at
(35:01):
mimicry. So you can have an AI or ChatGPT generate a perfectly plausible article with completely made-up footnotes. And I've seen this, because a friend tested me with this: an area of American history that I know a great deal about. I've read all the scholarly literature; it's kind of an obscure area. And he's like, look, read this paper and
(35:22):
tell me what's wrong with it. And I found three footnotes. I was like, I have not read these books, and I would have, because of when they were published. And he's like, okay, so those were made-up footnotes. The programme kind of generated a plausible footnote. I knew that because I've studied it; I went to graduate school and got a PhD to know that. A normal person looking at that, there's no way they would know. So that kind of
(35:44):
inadvertent deception, because of the way these tools work, is very dangerous, certainly in scientific research. We've seen it in the legal field, where people have submitted briefs to courts that turn out to have fake footnotes; some of those lawyers have been disbarred. So I think that's where human expertise and human detection of error is going to become ever
(36:06):
more important, because the ability of these programmes to mimic real things is incredibly sophisticated.
Scott Allender (36:15):
What about in corporate life, kind of coming back to an organisational conversation? You know, I sit in meetings and people are tethered to their phones and their laptops and their iPads and all sorts of things while they're in person, and so they're not really present. And it feels like a real need; there's a sense of
(36:37):
urgency in the room that says, my job is so important I couldn't possibly be fully present here. And sometimes that's probably legitimate. There might be something that they're waiting on that's super important. Other times, I think it's just habitual, right? You just sort of don't want to leave your devices. So as a leader of teams, who, you know, might have a bunch of people walking in the room with their devices, some necessarily, some maybe not so
(36:58):
necessarily, what are some tips that you could give to a leader to sort of help people pull away from those sorts of habits that aren't as necessary as they may feel?
Christine Rosen (37:12):
Well, I would say, I mean, absent presence as an acceptable way to behave in any sort of meeting, both in one's personal life and in one's professional life, shocks me. This should never have been as acceptable as it is. Again, I'm showing my age here, but the only people who were immediately accessible when I was young had a beeper, and they were either a doctor or a drug dealer. That was it; nobody
(37:34):
else needed to be on call like that. So, of course, if you understand human nature and human behaviour, there's a real sense of status signalling too, right? If you go into the meeting, but, you know, somebody's buzzing me, I'd better text back. I think if you're the leader, you have to be pretty draconian about it. You either have meetings where you're like, you know what, everybody's phones in the middle of the table for the next 10 minutes while we brainstorm this particular
(37:56):
problem. And if you're waiting for a call, don't even come into the meeting. That way, it's like, this is a space where, if you're doing something else, do something else, but right here, right now, for the next 20 minutes, 30 minutes, we're just having this conversation. And this actually works. The examples I'll give come from my real life. One is a comedian I
(38:18):
know who talks about how, when he's working on new material and he's touring, he makes people put their phones before the concert in a little Yondr bag, so no cell phones. And he said he did that not because he's worried about people stealing his jokes. It's because when he's trying to get a read on the audience, he can't do that when every other person's got either the phone up in his face or their face in
(38:39):
their phone. And it's transformative for his experience as a performer to be able to really get that interaction, understand the mood of the crowd, and if it's shifting, if a joke lands. It's art. But that's true of someone who can lead a good meeting too, like really reading the room and seeing, okay, this guy's nodding off. And so that makes his job easier, but also makes
(39:00):
everyone's experience better. The other thing is, my kids, who, you know, they all have smartphones, and they go off to college, and, you know, none of them have any money. So when they all go out to eat, they do this game where, if they want to have real conversation, everybody puts their phone in the middle of the table during the dinner, and the first person to pick up their phone during that dinner has to
(39:20):
pay for everyone's meal. So nobody does; you know, they're all broke. They're like, I'm not picking up my phone. But again, they're making a decision that says, you all are important to me for the next hour, and nothing else is as important to me. And I think if you're a leader in a business, you're telling that to your employees too. You're like, I value your time, and you should value my insight as your leader, so let's all try to do this. But it
(39:44):
really does require making seemingly more draconian rules in order to have more free-flowing conversation.
Sara Deschamps (39:53):
Welcome back to the evolving leader podcast. As always, if you enjoy what you hear, then please share the podcast across your network and also leave us a rating and a review. Now let's get back to the conversation.
Jean Gomes (40:08):
Can we talk about some of the things that you do, and that you've learned others do, to build this presence in the world, to build our experience so it's richer? Some of these things may be blindingly obvious, but we're not doing them, so they're clearly not as obvious as we might think. But what are you learning about how you build that?
Christine Rosen (40:30):
So I think choosing the face to face is the most important thing, if you can do it. Now, we can't all do that every day. Obviously, it's wonderful that we can see each other while we record this conversation, but it would be a lot more fun if we were all in the same studio. And I know this; I do a podcast with some of my colleagues, and some of them are in New York, and some of us are here. And when we're
(40:53):
all in the same place together, it's so much more fun, and it's also a more productive conversation. There are fewer interruptions, there are fewer, like, glitchy moments. And that's just because that's how humans are meant to be; we're all meant to be in each other's presence. That's the embodied part of it. But I think we also have to embrace the idea that we should have a little friction in our lives. So for the last 15
(41:16):
years, Silicon Valley has given us these incredible devices and a lot of really slick marketing that says your life should just always be made easier, more efficient, seamless, frictionless. But in fact, for most people, that starts to feel very weird after a while. Friction is what teaches us things. Now, does this mean I want people going around deliberately making their lives harder? No, but I think what it
(41:37):
means is choosing the human, and maybe the more potentially difficult relationship sometimes, because it'll teach you something about yourself and also connect you to another human being, which is a good in and of itself. It's not a commodity, but it is a good thing. And I do think a lot of the animosity we see in public space, a lot of the hostility and polarisation of our current politics, is the
(41:59):
result of the fact that a lot of it happens with this online disinhibition effect in place, meaning I'll say stuff to someone who I don't see, or don't have to ever bump into on my street, that I would never say in person. So I think we now have to actively seek out the human, because the default now is mediation. So throughout your daily life, think about
(42:22):
when could I actually go down the street and say hi to my neighbour in person, versus, you know, texting them, or, here in America, putting them on Nextdoor on alert because they forgot to pick up after their dog, you know, surveilling them. So again, it seems very mundane, but enough people have to have the awareness that it's a choice. It is a choice to pick up your phone and text happy birthday to your
(42:45):
grandmother, rather than to, say, buy a birthday card and write her a note, or, even better, call her on the phone or visit her. I mean, these are all choices we have now, and I think people act as if they don't have a choice, and I think that is delusional, because we all have a choice, and we need to
sometimes choose that more difficult, more inconvenient, but more human thing.
Scott Allender (43:06):
Is that the best way to combat the sort of dopamine addiction and that sort of mindless activity we do? Are there other ways, other things we should be thinking about to combat the habit? Because I fully admit, sometimes I just reach for my phone, as you described earlier, for no reason. It hasn't even done
(43:28):
anything. I just, I'm checking it. Are you still on? Is it still... yeah. What else should we be thinking about in terms of habit breaking, so that we can feel like we can lean into these choices with more success?
Christine Rosen (43:44):
Well, I'm,
look, I'm a big fan of having
just non mediated hobbies andpractices in daily life. I'm I
teach and train martial arts,where I the Japanese martial art
of Aikido. It's very formalised.
It's very you wear uniformeverybody you know you you come
into the dojo. Nobody cares whoyou are, what you do, if you're
important or not. And it's all abunch of adults completely
(44:07):
making fools of themselves,learning how to do things like
roll and throw and like it's sofun. And it's fun because you
have to use your mind and yourbody, and you have to give it
your complete attention, and youcannot have any, you know,
devices anywhere near there areno screens anywhere. And if you
don't give it your fullattention, you might hurt
yourself or hurt someone else.
(44:27):
And so I think everyone can havesome sort of non non money
making, non productive, justcompletely weird thing that's
theirs, but it should use yourmind and your body and not be
mediated. I have friends who dothat with cooking or with
gardening or, you know, craftmaking. I mean, there are a
million things, whatever yourthing is, find it, but then
practice it. Do it often enoughthat it is a practice. Because I
(44:50):
think what it I know for me,especially when I teach, it just
forces me to slow down and getback to basics. Because I'm
like, I'm teaching someone who'snever tried to do this thing
with their body, and I have tobreak it down into like, oh, you
put your hand here, and then youdo this, and there's a and it's
slows everything down. And so Ithink a lot of us exist at this
level of efficiency,productivity and speed that we
(45:12):
when we. Down. It's almostdiscombobulating because it's we
don't do it often enough. Soanything that you can use your
mind and your body unmediatedthat slows you down, is what I
would recommend.
Scott Allender (45:25):
I love that.
Jean Gomes (45:26):
What else should we
be talking about? What haven't
we covered in your thinking?
Christine Rosen (45:31):
You know, one of the things that worries me a lot, and I think it's a group that's been extremely good so far about raising some alarm bells with AI, is creativity, and artists' work, what artists do. Because if you think about one of the ways that humans have always communicated with each other, it's through symbol, through image, through
(45:51):
music, through literature, communicating across generations, but connecting in a way that reminds people: we might not be from the same place or even the same time, but there's something that connects us that's human. And I think one of the challenges now is, again, this idea that we don't have the patience any longer to
(46:11):
sit and be humble before the vision someone else might present to us of what they think being human means. So we'll take AI slop art or AI music and, you know, think, oh, but it's free, it's cheap, and it sounds just like Taylor Swift's latest song. Well, no, you actually want the song that was written by the human, because she had another terrible breakup, or, you know, whatever it is. And that's true of painting. So
(46:33):
one of the stories I tell in the book that really resonated with me was an art history professor, and she was finding her students, even (this is at an Ivy League school; they were very accomplished kids) were so impatient, and she just couldn't get through to them, until she started assigning them something at the beginning of the semester, which was to pick one work of art. And they had to spend a certain
(46:53):
number of hours, I think originally it was like six hours, four to six hours, not all together at the same time, but before they were allowed to comment on it, write about it, discuss it, they had to just look. They had to sit and they had to look. And what that taught was patience, awareness and humility. And I think humility is something that's really lacking in our daily lives, because so much can be
(47:14):
outsourced and done for us. But humility is an important human thing, right? It reminds us that we are one small person in a much larger universe, and that there are amazing people who can do things we can't even comprehend, and we can learn from them. And technology sends us the opposite message every moment we use it. You put this phone in your hand, and you're in charge of everything. You're the centre of
(47:36):
the map. You can summon a car. You can get tacos delivered by drone. Whatever it is, you are God. And I think artists are constantly in the battle to remind us that we're human, which means we are limited. We have a time limit, we have a lifespan, we have cycles of life where things are good and bad. And that recognition requires us to slow down and really look and be aware. So I
(47:58):
think that's a part of the story that continues to fascinate me. I follow a lot of artists who are really questioning that in their work. And if you go on social media, a lot of artists will now say "human made art". They're distinguishing what they do as a human activity, and I think that's really important.
Jean Gomes (48:18):
What's next for you in your research?
Christine Rosen (48:21):
So I'm going to start working on another book, but this is how I do things: I just start keeping notebooks, and when I get to, like, five notebooks full of ideas, I start going back through and trying to find the thing that's going to be the main idea. I've been really fascinated by how we understand risk culturally over time. So how the modern person
(48:43):
understands risk, versus how someone 100 or 200 years ago might have understood risk, and how that is also linked to sort of the development of, I don't know, a sense of morality, values, norms, those sorts of things. So it's all very hazy right now, but risk is something that I think we've
(49:04):
set aside in some ways because of convenience, but we're also really unable to assess risk in a way that's rational sometimes. So it just fascinates me how humans understand risk and how they assign value to certain things as risky or safe, and how that's informed by culture and history.
Jean Gomes (49:23):
We had a great conversation with the psychologist Ellen Langer about how risk is subjective, not quantifiable in that kind of financial sense. You might want to listen to that show.
Scott Allender (49:40):
Excellent. Thank you for doing this work, writing this book. It's so important. How optimistic are you about our ability to start re-embracing our humanity more fully?
Christine Rosen (49:52):
So even though I might sound a lot like a doomsayer, I'm optimistic by nature as a person, but I am actually optimistic after doing all of this research, because you can't keep humans down. I mean, we have this wonderful, quirky, rebellious, stubborn streak. That means that whatever scheme, you know, a Silicon Valley billionaire wants to impose on
(50:12):
the rest of humanity, it just takes a couple of us going, I'm not going to do that; you're not the boss of me. And actually, that's the thing: we are the bosses of ourselves. We have a conscience, we have free will, we have choice, if we're lucky, and so we need to exercise those more responsibly and thoughtfully. So I don't think that we're all going to end up in a human zoo with an AI overlord or, you know, Skynet. I
(50:34):
actually think that we will fight back in creative ways. But we do have to see it not as a hostile war, but as a thoughtful response to these very powerful things we've created: how do we use those to help us be better humans, not to make humans conform more to the values of the machine? That's really, for me, why I can remain optimistic, because we
(50:57):
are crazy creatures. I love humans.
Jean Gomes (51:00):
What a wonderful
thought to finish this on,
because I think that idea ofhuman values versus machine
values is such a powerful kindof way of framing this whole
challenge. Thank you.
Scott Allender (51:14):
Thanks so much and folks, until next time, remember: the world is evolving, and so are you.