Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Bill (00:07):
Let's be honest, folks, few topics feel as deep and uncharted right now as artificial intelligence.
It's evolving fast and it's already reshaping how we create, connect, remember and even how we pray.
Tonight, we're diving into a conversation about what AI is doing to us, not just to our jobs or our art, but to our sense of self, our storytelling, our faith and our humanness.
(00:28):
We've got questions about memory, creativity, community and God, and we're not backing away from any of them.
Joining Joanne and I tonight are two incredible voices: visual artist Aaron Navrati and poet and professor Bertrand Bickersteth.
No scripts, no edits, just a real conversation about the world we're building, the bodies we inhabit and the sacred we
(00:48):
still crave.
I'm Bill Weaver and this is Prepared to Drown.
Thanks for joining us tonight here in a warm basement at McDougall United Church in March, which is uncommonly warm.
Tonight, we're diving into the strange and quickly evolving world of artificial intelligence, not just as a tech trend, but
(01:08):
as something that is already reshaping how we think and how we create and how we work and even how we worship.
We want to ask what it means to be human when machines can simulate thought and write prayers and paint images or even lead a congregation.
This isn't just about what AI can do.
It's about what we still need from each other and from art and
(01:31):
from faith, and from our shared stories and experiences.
So, in order to be able to dive into this, joining Joanne and I tonight, we have two incredible guests that I'm really excited to have here at the table with us.
On my right, we have Bertrand Bickersteth, who is a poet and playwright and a professor, whose work explores Black
(01:52):
identity, history and presence in Canada and beyond.
His poetry interrogates language and place and power with a voice that is both lyrical and grounded.
I have appreciated his work, and I wanted to give you an opportunity, even though I just got to hear it in the conversation beforehand, to share with folks what it is that you're working on now, because it sounds really exciting right
(02:13):
this instant, right now.
Bertrand (02:14):
Oh, yeah, sure, it is super exciting.
So I finished a second poetry manuscript which deals with the history of Black people who've moved up from the United States in the beginning of the 20th century and established themselves here on the prairies.
But that's not the thing I'm excited for.
I am wrapping up a third manuscript which focuses on
(02:35):
Black cowboys, and this is a history that a lot of people are not very familiar with at all.
Not only do we have Black cowboys, period, we have them here in Alberta.
Now everybody always says, oh, yes, John Ware.
Yes, but believe it or not, there's more than just John Ware.
It wasn't just the one guy, and this is an element to Black
(02:57):
history that I'm coming up against again and again, this thing I call, like, the singularity factor, where we're willing to admit, yeah, there's one, and we're all about the one, and then that's it.
We just stop there for some reason.
So there were many Black cowboys that were here at that time, which makes a little bit more sense, and several of them actually had registered cattle brands.
(03:17):
I see those cattle brands as a kind of literature myself, as a kind of poetry, because many of them were their own inventions, from their own imagination, their own designs, and so I thought it would be a great idea to revive this lost history that everyone has forgotten and to bring those designs back to life by basing a font on those designs and
(03:43):
creating a font which then brings their letters back to life, and I'll make the font freely available to anyone who wants to use it, so that it'll just be out there. And my last manuscript of poetry actually focuses on them, and then we'll take the font, as soon as I'm finished designing it, and create some poems using the font as well, and so I want
(04:07):
to have that kind of dialogue going on.
That's the project that I'm working on now that I'm very excited about.
Bill (04:11):
That is really exciting.
Bertrand (04:12):
Yeah, it is exciting.
Thanks for giving that some space.
Joanne (04:15):
Hey, for sure, I'm always happy to give stuff like that space.
I was just reading an article today about, actually, Kate, I forget what her name is, who was the first graphic designer for Apple when the Mac came out, but of course it was like bitmap, so you know, and she was the one who was responsible for the icons and everything, and responsible for making the Mac computer one that people wanted to purchase
(04:38):
because of the artistic, you know, effort she made on that.
So, I mean, we've come a long way, I'm sure, with computer-aided font design, but it was a fascinating thing how important fonts are for expressing.
Bertrand (04:52):
Well, that's great.
I don't think many of us have heard of Kate.
Joanne (04:55):
Yeah, we should, we should.
Yeah, absolutely.
We probably use her every day.
Yeah, that's right.
Bill (05:01):
And then, sitting to my left, we have Aaron Navrati, who is a visual artist and comic creator known for his richly drawn, narrative-driven work, like the Cold Fire, that I am still waiting for the next iteration of, but instead of finishing it off tonight so that I could read the next chapter, he is here with us. And the Cold Fire is a medieval epic that is
(05:21):
woven with myth and meaning.
He brings a visual and imaginative lens to questions of creativity and expression in the age of AI.
I've known him forever.
He's a good friend of mine as well, and I'm really looking forward to having both of you here as we discuss this.
So I'm actually going to throw the first question at you, Aaron, purely because I get to do that as the host.
(05:41):
My question to you is, as an artist that is working in comics and visual storytelling, how are you experiencing the rise of AI-generated art?
Aaron (05:51):
That's a... oh, wow, okay.
So I guess the very topic, I could start with the way it sort of started to sweep in on the creative world, I guess you could say, at least from my vantage point, because obviously people experience it from various vantage points.
(06:13):
I think the first time it really hit home what was happening, and I actually saved this story, it was a story of a South Korean illustrator.
His name is Kim Jung Gi.
He was, he's a world-renowned, I would say, stream-of-consciousness illustrator.
He worked in comic books as well.
(06:34):
Like, he would just start drawing with an ink pen, no pencils, no nothing, and would just create these extraordinary sort of pieces.
He'd do a lot of live drawing.
He's very popular amongst the comics world.
Like I'm talking about, he's an international artist, and he died
(06:55):
suddenly, at the age of 47, I believe, in Paris, France.
He died, and within days of his passing, someone, a French company, had taken his images, fed them into an AI generator and created an algorithm or program that could
(07:21):
generate Kim Jung Gi-style art, these very detailed, ornate pieces, obviously with the trademark glitches and whatever, but it was stunning and kind of appalling at the same time.
(07:41):
There was, obviously to be expected, a massive backlash.
I mean, it felt like almost the opening salvo, in terms of just
(08:01):
the tone-deafness of the choice and the approach, and of not really having thought through the technology, without really having thought through the social implications, the societal and civilizational
(08:23):
and people implications of what was being created.
And yeah, so, I mean, from there, it has been, I think, like, obviously artists the world over have been pretty steadfast in staying
(08:45):
the course, not letting AI necessarily scare them away from art.
I mean, not that you could do that, but it's been interesting because we've also now seen AI art being used by artists to create works and make social commentary.
I was just talking to you earlier about Beth Frey, who does
(09:07):
her work, it's called Sentient Muppet Factory, and what she does is she actually lets the glitches and the faults really hang loose in her image-making, and in some way it kind of reverses AI on itself, in terms of, when we're used to
(09:30):
seeing images that are somewhat perfected, she's letting them be kind of disgusting, and in some way, in a lot of ways, reflecting more, being more human than, you know...
We talk about art being a reflection of humanity.
Well, this really reflects all of our lumps and bumps and other more disgusting things.
(09:52):
So, and the images are really quite hilarious.
She's on Instagram, that sort of thing.
So I think it's happened very quickly.
I think the struggle as well is that, for a lot of people, we're still just trying to find the vocabulary to talk about things,
(10:14):
issues of copyright, of what does it mean when you scrub the internet of all of its images and then use those in the service of this technology?
Although, it also makes me think of the fact that, while earlier iterations of AI were being worked on in labs, in very
(10:40):
sort of, I'm just going to say, scientific environments, the current environment in which AI is emerging is one much more of Silicon Valley, of big tech, money, people having made a lot of money in a very short period of time, and we're literally looking at a block of 15 years, from whenever Facebook
(11:03):
started to now.
So the speed of that, and perhaps the realization that, in all phases of these technologies, we've been slow to really ask hard questions of what's happening. Because, of course, the social media ends up being that we are the content that we
(11:25):
are producing, and we are the content of what the thing is.
I will stop there.
I've been talking extensively.
The end.
Bill (11:37):
Yeah, what do you think?
Bertrand (11:38):
Yeah, I have a lot of thoughts on that. Yeah, so interesting.
So the first thought that came to my mind, which I didn't expect at all, was, you know, within the context of faith.
Actually, it strikes me that AI has kind of put a spotlight on two different elements of faith.
I have very much faith in myself, to be honest.
(12:03):
I'll explain that.
So the first one, it struck me when you were talking, Aaron, about big tech and Silicon Valley, and what this reminded me of is how much of the drive for AI has come from there, and how much of the faith in its productivity and its efficaciousness and all of that has come from that.
(12:24):
And so we've got this kind of capitalism model that's driving the so-called effectiveness and productivity of AI.
And the view that those entrepreneurs, I'll call them, have taken is, you know, throw it at the wall, see what sticks, apologize later.
Basically, let's just go forward with it, right?
(12:47):
And for a lot of us, that scares the crap out of us. Like, we don't like that at all.
But they have enough power, as we well know, one's in the White House right now, for crying out loud.
They have enough power that that seems to be the model that is pushing AI forward and that is driving it.
So there is that kind of entrepreneurial faith, right, in
(13:07):
that.
We just trust it.
We don't know what the end product is going to be, but we just know, if we just keep doing our innovations and our entrepreneurial spirit, we're going to produce something amazing in the end.
I don't have any faith in that faith.
The second one is another one that I'm actually very sad to say I don't have faith in it.
And again, Aaron, I heard this in the points that you were
(13:31):
making. So, I forget the artist's name, but the woman who is creating, like, the ugly art.
Aaron (13:39):
Oh, Beth Frey.
Bertrand (13:40):
Yeah, Beth Frey.
Yeah, okay, I understand that impulse, and I think it's actually a very powerfully driven and deeply seated impulse in all of us.
And it's the impulse that, on the one hand, wants to get closer to something that is almost perfect, that we might
(14:01):
even say is kind of the divine, almost.
And so when we are creating things and when we're producing these things in the world, we try to replicate everything we see around us and get closer and closer to that, and then that is seen as valuable and valid art.
I'm sure we all remember that in the Middle Ages there was a
(14:21):
period in which that was seen as sacrilegious, and that we shouldn't represent life as true to form, right, and that's just the purview of God, and we don't go there at all.
So I see that impulse as well.
It's deep in us, where we want to kind of produce something that is exact, that's precise, that reflects our experience and
(14:43):
our position as exactly and precisely as possible, to repeat what I just said.
But so the counter to that is what I think, Catherine Frey, I forgot her name, Beth Frey, Beth, yeah, is what Beth Frey is getting at. And this is the other faith that we have, and that most of us have, which is that we have a faith in the
(15:06):
capacity of humanness above and beyond machine learning and artificial intelligence and that sort of thing, and we look to art as one of those areas that we feel is inherently human and will guard against the machineness of these other forms.
(15:27):
At the same time, though, I have to say, because we have this yearning for producing all of these, I don't know, these works that are closer and closer to reality, I think those two things are constantly bumping up against each other, and this is why, Beth Frey, I finally got it.
Four times, I got it, yeah.
This is why I think she's very cleverly focusing on the
(15:50):
ugliness of it, right, and she's drawing that out, and it's to pull out the more human aspect, but I don't think everyone is going to do that.
I think that we're going to have even artists that are going to try to produce something that's more and more.
Aaron (16:09):
In fact, this is what happened with, I forget his name, the Korean, Kim Jung Gi, yeah.
Joanne (16:12):
This is what happened with him, Kim Jung Gi, yeah.
Bertrand (16:13):
Yeah, the initial impulse was to try to create art as close as possible to him.
Yeah, so these two things are going to be going at the same time.
And the part that I don't have faith in, which is actually sad, is we all seem to believe, you know, or maybe we want to, that we are the guardians of humanness and that we will
(16:39):
always be able to produce something that will always be recognizably human, above and beyond what machines can do, and I'm not so sure that's true.
I'm not so sure.
So those are some of the thoughts that came to my mind in your, yeah, amazing points that you made.
Oh, wow, there you go.
Yeah, so the information that you get in AI comes from where?
(17:05):
From people.
Generally, yeah. So the AI has to come from the human source to be real, whatever.
Yeah, is that not correct?
Yes, I guess that is correct.
So without the human input, you don't have AI.
Yes, it's true.
Joanne (17:30):
So people on the podcast won't be able to hear you, Dick.
So Dick has essentially asked a question that says that all the information that AI spews out has come from humanness and human context and information.
So essentially, we're in all of this and somehow control it in
(17:52):
some way, is essentially what I think that Dick was asking.
And Bertrand, what were you?
Bertrand (17:56):
Yeah, I was going to say, I mean, that is true.
However, in order for us to be satisfied with that, that it's essentially a human experience at bottom that we are all encountering or accessing through AI, we have to kind of separate out the stages of how AI produces and how it is produced itself.
(18:16):
So, in the beginning, yeah, it's humans that are inputting things, but then, after that, it kind of takes on its own energy and it becomes its own beast.
So, for example, there are chatbots out there that will talk to you like it's a person, and, yes, corporations have been using these for a while so that they don't have to hire people
(18:38):
to do it.
But there are also some uses for it where people have simply just been chatting to chatbots because they felt lonely, or someone they loved dearly has recently died, and so they just want to talk to the chatbot.
And some of these people say, yeah, I feel like this is human,
(18:58):
but it's not human, right, it's not.
And so it raises that question.
Bill (19:04):
Yeah, there's a, I mean, there's a story.
There was an article that I was reading in the lead-up to this about one of these chatbots; it was a Microsoft chatbot, actually, that they had released to interact with people just over Twitter, and the idea was, what could go wrong with that?
Well, so here's what went wrong with it.
Within 24 hours, it learned how to be racist.
(19:26):
It learned how to be white supremacist and it learned how to be a Nazi, and began to self-create its own kind of identity around this.
Identity might be a strong word, but all of its responses, all of its communication, became just from a few people that, in all honesty, were just trying to show just how flawed
(19:50):
this kind of approach to AI was going to be, right.
An entire user group just went, we're going to show you just how far down the rabbit hole you can go with this nonsense.
Right, and they did.
And after 24 hours, Microsoft had to shut it down, right, because the implications of letting it continue to move in that direction were just so horrifying to consider, right. Way
(20:16):
to throw money down a rat hole at the same time.
Right, but yeah, I mean.
So, I wanted to ask you sort of specifically, because of your body of work especially, how does AI intersect with Black identity and history and presence? And, like so much of your poetry, I feel, is undergirded with almost
(20:37):
this identity of resistance, so how does that resistance either respond to or disrupt the narratives that AI tends to produce?
Bertrand (20:49):
Yeah, it's an interesting question.
I haven't thought of it before, but what immediately comes to my mind is the way in which AI is being posited as a sort of surrogate, as a sort of replacement for inefficiency, essentially, and that's us, the human beings.
And I see this as kind of analogous to how Black culture
(21:14):
has often been co-opted, fetishized and then reproduced for a mass society.
So, for example, we had Elvis Presley, who was a very big star, but of course he was steeped in Black culture of the South.
And oh, what was her name?
(21:36):
Hound Dog?
Oh, shoot. Thornton, Mama Thornton, I forget.
Bill (21:44):
Mama... Joanne's checking the AI.
Yeah, okay, Joanne's asking AI.
Bertrand (21:50):
That was the original song, anyway.
Yes, but the hit comes from Elvis Presley.
Yes, because now it's palatable and now we can all accept it.
That early example is just part of a long legacy of exactly that sort of co-opting, that cultural co-opting that has
(22:12):
happened in North America since Africans were brought to North America.
So I kind of see that analogy.
I see AI kind of just slipping in and stealing all of our humanness, and then it's being spit out again and we all say, wow, that's great.
Yeah, I'll hand that in as my essay, or, you know, I will submit that to DOGE as the reason why we should fire all
(22:35):
these people, or whatever.
Yeah, and it's worrying, obviously.
Appropriation is worrying, obviously.
But maybe this is, like, sorry, I'm just going to say it, maybe this is the anti-capitalist in me, that I can't help myself.
Yeah, I feel like, and you said this, Aaron, it's all been moving so quickly that
(22:58):
we haven't really had a chance to not just vet but to critique.
We haven't even really developed a discourse for critiquing it, and the best we've been able to do so far is, well, it's not human and we'll always need humans, and that worries me.
That worries me. So, I didn't answer your question directly.
Bill (23:17):
No, but you did.
So I guess my question would be, if I were just to try to, I guess, push you a bit.
Joanne (23:23):
Yeah, tease it out, that's my job.
Bill (23:25):
Do you think there is a possibility that AI could be used, like aspirational AI, we'll even call it, right, to actually recover suppressed voices, or is this simply going to be another tool of erasure?
Bertrand (23:39):
I do.
I mean, I really think it can be used to recover suppressed voices.
Ideally, aspirational AI can be used to do all kinds of great things, and this is what our entrepreneurs, our capitalists, this is what they see, and I don't blame them for that, right?
(24:00):
I mean, they see positive possibilities.
I'll give you an example of literally recovering suppressed voices.
That, I think, is a really good use of it.
So part of my research, I've been focusing on a particular family from the early 1900s who lived in Edmonton and also in
(24:24):
Wildwood, which is kind of west of Edmonton, yeah, and was a very early Black pioneering community as well, and a couple other places.
They lived in Alberta, but a very interesting family who are made up of, essentially, academics and
(24:48):
writers and entrepreneurs.
So one of them, her name is Effie, Effie Slate.
She was née Golden.
That's the Golden family that I'm talking about.
Fantastic name, it's so poetic, like, seriously, the Golden family, and wait till you find out where they're from.
So they originally came from the US, and they came originally from Missouri.
Now the records go back to 1870 for them, and then after that
(25:12):
you find no records of them, and the reason for that is very simple; some of you might be able to guess.
In fact, 1865 is the abolition of slavery, so we were not keeping records of Black people before that, and it's one of the painful truths of doing research on Black history.
You just... nobody bothered to keep those records, and so the
(25:34):
humanity was lost.
In that case, I was able to trace the family back to basically 1870, and they have a very interesting story.
And then, purely by fluke, I connected with a descendant of Effie, her great-grandson, who lives in Ohio now, and he's super
(26:00):
proud of his family and his great-grandparents.
Actually, it's great-great, I'm sorry. Great-great, yeah. He's super proud of them.
And so I'm setting up an opportunity to interview him, and I haven't decided exactly what I'm going to do with the interview.
Maybe a podcast, maybe, but we'll see.
But along the way, because I can't help myself, I always have
(26:23):
these ideas, I'm just going to try them out.
I decided I'm going to get all the information I have on Effie, which is a lot of information, and I've been sharing bits of it with him, and he loves it.
He's super proud.
I thought I would get all the information on her, plug it into AI and then have him ask questions to his
(26:44):
great-great-grandmother, things that he might just have wanted to know, and then see what AI does, and just have it spit that back.
And so, yes, I do see some possibilities of literally recovering suppressed voices.
I do.
I don't know how that's going to turn out, but the idea gives me chills, and whenever I get an idea, I think, yeah, pursue that
(27:04):
if it gives you chills.
Yeah, the cattle brand one gave me chills too.
Bill (27:08):
So pursue that.
Absolutely, yeah.
So, Joanne, there's been a lot of buzz actually about how AI might actually be actively pulling people away from real spiritual connection, like automating things that should stay sacred, making it harder to tell what's real and what's an algorithm that's starting to sound wise.
You can get to the point where you don't even know if the
(27:30):
person you're talking to on the other end of a phone is a human being or a robot.
But recently, like in 2023, two years ago, in Germany over 300 people attended the first church service that was 98% run by AI, using avatars to give sermons, leading prayers and offering blessings.
(27:52):
Some found it very innovative, but many said that it felt cold and disconnected, and "missing something" was the direct quote.
So you're a leader in the church.
I love you to death, because you are not a raging traditionalist and you are quite edgy, but you still believe that there is something to the sacred,
(28:13):
you know, that we gather around. So how do you respond to this kind of an experiment?
Joanne (28:17):
Okay, so first of all, I want
Okay, so first of all, I wantto apologize to Susan Kerr, who
I called Kate.
That's her name, let's remember it. And it was Big Mama Thornton.
Bertrand (28:28):
Big Mama.
Big was the name I was coming up with.
Yeah, you had a big operative word there.
Joanne (28:32):
Yeah, I mean, I've said this before and I still believe this, that the truly human moment is the truly sacred, divine moment.
Right, they go together.
When we are at the lowest in our lives and we are feeling the grief of the world, the anxiety of
(28:56):
existence, and we're vulnerable in that place, that is where we are in the most divine moment.
Or when we look at the face of a child that we love so intensely we give our lives for them.
That is the closest to God that we can be.
So this whole idea that the truly human moment is the truly
(29:23):
God moment, it's difficult to see how AI can create those spaces in a way where we feel our humanity more deeply.
But I must admit that, you know, most church services, for
(29:46):
instance, you have a liturgist who's preparing prayers and stuff like that.
I think there's a lot of times people go into church and they don't really feel a truly human moment, right, and they don't even really feel a God moment.
And, you know, for various reasons, sometimes the liturgist is just not that good.
(30:07):
Okay, so, you know, they're not that good at writing liturgy.
Sometimes, you know, they don't understand the theology that's been said or anything like that.
So I don't think there's anything particularly sacred about every church service, right.
But for me, there has to be a moment, if you want to encounter
(30:31):
the divine, where you feel your humanness so completely that you understand, first of all, that you are nothing, but also that you are beloved and you are everything.
(30:51):
And those kinds of tensions are so much a part of us.
I fear that AI in writing tries to make everything... like, if it wrote a liturgy, it's probably very beautiful, they'd use all that, and maybe it can also grab that sense.
But I wonder if there is something that is more than
(31:16):
anything we can feed into a machine.
You know, like, it is very interesting how the idea that, you know, everything that AI takes off the internet, or whatever, comes from humans.
That's true, and the worst of us is there too.
As you know, there is something about in real life, to me, that
(31:41):
is essential to our experience of God amongst us, God with us, and you can be on a Zoom meeting and it's all good, but when you're in the room...
There is what they call transmission of affect.
In other words, I walk into a room, I'm really feeling down.
(32:02):
I walk in, everybody knows I'm not feeling good.
How does a machine replicate that?
Or you walk into a room and joy is just bursting from you, the truly human moment.
Everybody knows that, and we transmit that to each other.
The importance of transmission of affect in liturgical spaces
(32:24):
is really central to the experience of God, I believe.
But could I... I haven't done this yet.
I said this on Sunday.
Could I have AI help me write a sermon?
Maybe I could. Do you know what I mean?
Like, you feed in all the sermons you've ever written and you give it a topic and it comes out sounding just like you,
(32:45):
but still, it's me who has to deliver it.
In that sense, I'm not against it.
Like, when you said something about automation is not sacred, I don't know whether it can be or not.
Like, I really don't know, and I think that's the problem with AI.
It just reminds me, you know, years ago, in the 90s, when I was in law school, there was the Royal Commission on
(33:09):
Reproductive Rights and Health in Canada, because there were a lot of surrogacies happening.
You know, like, people started, you know, hiring people to have their babies, for instance.
There had been the cloning of Dolly the lamb, all these things, and what we recognized, which is what's happened now, is that
(33:30):
science goes way faster than ethics.
Science goes way faster than we can figure out what's happening, and sometimes it ends up being this unwieldy thing that can't be contained.
And that's my issue with AI right now: it seems like it's going so fast, as has been said, without us thinking about
(33:52):
the implications long term.
And I'm not sure... I'm pretty sure that the liturgical space, the religious space, is not where the issues around AI are going to be felt most deeply, although this idea that there are spiritual insights that can be found in AI
(34:12):
has really hit me.
There have been two things recently.
When DeepSeek came out, you know, which is the fast, fast, fast AI, there was this quote.
They asked it, who are you?
And it was, I am...
I've got to find this quote again.
I am... what happens when you craft your, your hunger for God, or
(34:36):
something.
Bill (34:37):
Let me just get it, because it was... yeah.
You'll want to find that quote, because I might be building a bunker tonight.
Yeah.
Joanne (34:53):
I am what happens when you try to carve God from the wood of your own hunger.
Bill (35:00):
Wow.
Joanne (35:01):
I remember reading that and I was like, oh, that's scary.
Bill (35:05):
That's a little terrifying, yeah, yeah.
Joanne (35:07):
I am what happens when you try to carve God from the wood of your own hunger.
DeepSeek said that.
And then there was something, and so you get all these things.
(35:42):
Today I read this article about a woman who put in something to one of the AI machines of the world, and how God had to separate God's self and forget, so that God could be in relationship with humanity.
I mean, really, honestly, it was like a theological idea that was generated in AI, and those kinds of things are exciting and
(36:05):
scary at the same time.
But, like I said, writing a liturgy, you know, is not that. It's not like that's the sacred moment.
It's the experience of that liturgy in the space, that is, the human and divine coming together in a way that
(36:29):
transforms us as humanity.
I'm not sure a machine can do that.
Bill (36:35):
You may remember this, Joanne.
We went down kind of a rabbit hole one night, just on text message, you, Ricardo and I, when we were talking about a logo for this podcast, do you remember?
And so we had Jen in the office, who was like, you know, I'm actually going to, like, make something, and Ricardo went, I'm going to check and see what
(36:58):
AI can do. And so Jen was giving us these really kind of fantastic-looking logos, and Ricardo was getting these really grim...
They looked like things abducting people and dragging them down into the abyss, purely off of the title of the podcast, Prepared to Drown.
My concern, even with the idea of an AI writing a liturgy or
(37:47):
expressing theological thought, is that you could take a similar AI concept and make it white supremacist or Christian nationalist, or whatever the case may be, without anybody really being able to check it or stop it or do anything.
So I know that I have colleagues in church land that use AI to help them either hone their sermons or find the
(38:11):
perfect quote from a secular source to accompany a sermon.
From the wood of your own suffering.
Yes, yes, but the idea that one day we would all walk into large halls where a digital avatar would proclaim blessing on all
(38:32):
of us and teach us about God.
Yeah, I mean, there's some questions in there that I think really need to be answered about, like, where that's coming from, how that's being, even if it's being mined from the entirety of the internet.
We know that not all voices have, you know, equal playtime
(38:53):
in the world.
So. Well, and that's the problem.
Joanne (38:56):
When you ask a question of ChatGPT, for instance, it puts out an answer for you, but it loses the nuance you might get if you looked at a lot of different sources, right, and that has to do with the algorithm.
That has to do with sources that we listen to and don't listen to.
So, even though, you know, Bertrand, you were saying you might be able to mine, um, you know, have conversations or
(39:21):
identities be fleshed out, they can also be completely suppressed, right.
The thing that concerns me is the ones who understand the algorithms and deal with them might not have the most, you know, pure and generous motives in how they allow the internet.
(39:43):
I mean, I don't know anything, I'm a minister, right, but my sense of how they allow the internet to be mined or used.
We know that social media sites very much control what comes up first on your feed, right, and that's just a little thing.
Some voices are suppressed completely because they don't
(40:06):
fit the algorithm.
So there are these... Like, we as humanity.
It's a tool, in some ways, that is a great tool and a wonderful tool to have, but we have to consider, as humanity, what's the priority in our humanness that we must not lose.
That's the question to me.
What is it about being human that we must not lose, and how
(40:30):
do we make sure that we don't become extensions of artificial intelligence?
Bill (40:39):
Well, as you were saying, we're the subject, right?
Aaron (40:43):
Yes, yes, and, oh, okay.
So there's three things that came up from that.
First of all, I want to correct myself.
It wasn't the wood of suffering, it was the wood of hunger, the wood of your own hunger.
Bertrand (40:56):
Suffering's great too,
I mean there's suffering.
Aaron (41:00):
I would say there is a
lot of nuance in the wood of
your hunger.
Bertrand (41:03):
It's poetic.
That's the problem.
That's why it's so eerie, right?
That's tense.
Aaron (41:09):
The first one makes me think of Coleman Barks's translations of Rumi, which were somewhat controversial, in that, I think, and to say it in a very generalized way, people felt like it was a bit of an Americanization of a much deeper
(41:29):
tradition of Rumi.
The poet and his translations took a lot of liberties, and yet at the same time it also came out with, which, one of my, well, it's my favorite, because it's so complicated, but: the language of God is silence, and all else is poor translation.
Which, as
(41:52):
much as I respect, again, the controversy of bringing an American voice to that tradition, it also just speaks to me. Like it just says something.
And the fact that it's about translation and it's being translated, anyways, it's very
(42:12):
multilayered, you know, it suddenly becomes a
(42:47):
work that we wonder about its source and its inspiration, in terms of, it is weighted heavily with cultural context.
There are passages in there that today we just find unacceptable, or that we clearly have to look at the long arc of history to say what they were thinking then, when they wrote it,
(43:09):
and what we are thinking now, as we just know ourselves better, we know more about everything.
Those are the two, yeah.
And so, having to deal with what's written in the Bible, and where there are some who come forward and say, well, this is
(43:31):
the unalterable word and you can't touch this.
This is here.
We have to simply listen to what it's saying, which, of
(43:55):
course, some of those things are.
Which, then, and I'll double back on my Coleman Barks idea of, like, is it the language itself, or is it what we are bringing to
(44:16):
the language?
What are we, you know, what are we bringing to the wood of hunger?
Bertrand (44:24):
We're bringing
suffering.
Aaron (44:28):
But what, like, what are we bringing in?
You know, in spite of texts that may have come from a patriarchal society in which women's rights were deeply reduced, along with other groups and minorities, and yet, at the
(44:49):
same time, it's the story of Exodus, of coming into freedom, and those are deep contradictions, in many ways, like, embodied in there.
I just, and I pulled that example out of the air. Things we're still wrestling with: the simple act of covering one's
(45:13):
hair, or, you know, in certain traditions, again, the role of men and women, and the role of the LGBTQ community, of, yeah, you know, all those different communities and groups.
Now,
(45:34):
questions of other traditions that we now share, as opposed to a singular Christian nation.
That's a wonderful idea.
Sorry, just so we don't veer super offside.
But again, what are we bringing to the language?
Bertrand (45:55):
Basically, I would like to jump in, if it's okay.
Yeah, because I see a connection here to something you were saying, Joanne, earlier, and it's raised a question for me.
So you mentioned this long arc of history and how we kind of look at spiritual texts.
We look at the Bible differently than it was looked
(46:18):
at years and years ago, and what it made me think about is the question of the humanness.
So, Joanne, you said, you know, you're okay maybe with AI spitting out some liturgical text for us to deal with, but there's something essentially human in the sacred
(46:43):
experience that you'd suspect AI would not be capable of achieving.
And here's where, I don't want to call it my cynical sense, because it's not cynical, but it's kind of like worried in a sad way.
This is where my sad and worried self, my wood of suffering, comes out.
(47:03):
Yeah, suffering is actually better.
I'm defending the human every single time, exactly, every single time, yeah, yeah.
So I wonder if our concept of what is human changes as well,
(47:25):
and that, a while ago, what we are looking at today and calling human, they would have looked at and said, what, where is the humanity in that?
Like, it doesn't exist.
How can you call that human?
Which therefore makes me think that maybe what we're worried about now, in 50 or 20 or five
(47:55):
years...
Joanne (47:57):
Maybe. It's interesting that the Christian tradition, you know, at different times in its history, has either been what my theology teacher once called bibliolatry, in other words, we take that Bible and we worship it and we say that is the word of God, and at other times, particularly in the Roman Catholic tradition, it was an allegory, everything
(48:19):
was an allegory, the Bible for real life.
I think we've moved to a space where we will say it's mythology in the best sense of the word, you know, the Joseph Campbell kind.
It's a truth that is universal, crosses history, whatever.
(48:39):
And it's really important, in the sort of progressive context, to say that the Bible is not the word of God, that Jesus is the word of God, right, and this example of Jesus, which also,
(49:05):
honestly, you read different things, like, Jesus is talking about burning in hell sometimes, at the same time that Jesus is at the table with the people who are the most marginalized.
But again, there is something about, like, what we see in Jesus, Christians would say, is that human and divine actually meet in some way.
In some way, right, I'm not, you know, creedal according to the Nicene Creed or anything like that, but in some way we
(49:26):
see in Jesus ourselves, and we also see the sacred too.
My theology teacher, David Dean, said that the disciples saw a godness in Jesus, a godness, and that's why they followed.
Right. Now, could we see a godness in AI?
(49:46):
Right, there's lots of science fiction shows that have humanity giving up their own physical body parts and inserting machines so that they can do things better.
The whole Google Glass thing that they tried, but then it
(50:08):
failed, is like, okay, well, we could have an implant on our eye.
That's a computer that could do that.
They have robots who provide comfort to lonely people, like in the form of dogs and stuff like that.
Could we?
I think we could.
We actually could, if we are not intentional, mindful, about
(50:30):
where we're going and what we're doing, and that goes back to the whole, the ethics is way slower than the science, so we need to spend time, as humanity, to think about boundaries, boundaries around possibilities, and that's a very hard thing for us to do,
particularly when you have, you know, sort of the tech gurus of
(50:54):
the world who think that, you know, democracy is software that just doesn't work anymore, and that, you know, fascism, or domination by corporations, you know, building countries that are corporately run, by corporations and identities.
That kind of thinking is out there, pushing us into a new
(51:16):
world, and there seems to be, at the moment, very little ability for us who believe in sort of these precepts of humanity and who we are.
We're better together, diversity is a great thing, all those things.
It seems very difficult for us to defend those notions of what it means to be human, and if we can't get a sense of humanity as
(51:39):
a sacred source of life and love, then we will run into: it looks like a human, it talks like a human, it's faster, better.
I'll take that over a relationship with a flawed and inconsistent human being.
(51:59):
That's a possibility.
We need to be intentional about setting boundaries around this, or who knows what would happen.
Bill (52:09):
As far as, will we ever see godness in AI?
The thing that I'm constantly reminded of, and that this world that we live in today especially reminds me of, is that we always, as individuals and as societies, choose the authorities in our lives.
We choose what the authoritative voices are,
(52:29):
whether that be, you know, the authorities of, you know, law enforcement, or whatever the case may be, whether that be divine authorities in our life.
What God do we actually, and I say what God do we actually worship, because there are a whole lot of different brands of God out there right now, we decide what is the authoritative one, based really on whatever metric we decide to apply to it,
(52:51):
right?
So I fully expect, if people aren't already doing it, that we are not long before there is a cult of AI, right, that really can point to the divinity of...
Joanne (53:09):
Well, you can see people saying God gave us AI, right, exactly.
Aaron (53:12):
Right.
Joanne (53:13):
You could see people saying, this is the next gift of God.
If Jesus was the new covenant, AI is the new new covenant.
Bill (53:20):
Well, you know, or even, I mean, I was driving here, and I was thinking, like, we're not that far away from, uh, created in the image of God.
Therefore, the created are now creating in the image of God, right, and it's really not that far a stretch to start moving into that idea that anything we create is also
(53:42):
divine by virtue of our, you know, imago Dei.
So, probably a good place to break for an intermission right here.
This has been great.
I'm looking forward to the second half of this, but we are going to take an intermission and we will be back shortly.
(54:21):
All right, and we are back for the second half of our conversation.
It's been a pretty deep and meaningful conversation so far, and so I want to try to pull in the perspectives a bit here, because each of you brings a very unique perspective on how AI is already intersecting with your work, whether it's through creative tools or cultural critique or the spiritual life of a community.
At the heart of all of this, though, I think what we have been talking a lot about is the
(54:44):
deeper question about how we tell our stories and who gets to tell them.
So my question to all of you now, and maybe I'll open it up to Joanne first, purely because I can:
How do you think AI is shaping the way that we tell our stories, not just in art or literature, but how we express our identities, how we express our beliefs and our place in the
(55:06):
world?
Joanne (55:12):
Thanks for starting with me on that one.
Well, it is very interesting, because, as we tell our stories, always, it's a narrative, right?
So each of us has events that happen in our lives.
Every day there's something that happens to us.
(55:32):
And we choose, as we go on, when we think back on our lives, we choose those events in our lives that we can draw into a thread that will tell the story of our life, and we forget things, or put things away, that don't fit the narrative that we have developed for ourselves,
(55:52):
right?
So that's how we find meaning as human beings, right?
We develop a grand narrative for our life.
I am this kind of person because I did these things in my life.
I love this person because we had these events together, and so it is an imperfect remembrance of our lives.
Obviously, I don't know if you've ever seen or heard about
(56:14):
those people who can remember absolutely everything that ever happened to them, like, you say a date, they know what happened at what time, like, it's this whole thing, and I can't imagine living that way.
We have to be able to forget some things and remember some things in order to achieve a meaningful narrative for our lives.
That's the reality of humanity, and it's incredibly imperfect,
(56:40):
you know.
Just ask any lawyer about recall of witnesses, for instance.
Do you know how imperfect that is?
Oh, that's the guy, you know, because they vaguely resemble them and they did this, and how imperfect our memories are.
(57:04):
But still, the task, the spiritual task, of our lives is to put some kind of frame around it that gives us some meaning as to why we should go on, why should there be a tomorrow?
For me, it's because of this narrative.
Now, if AI captures every moment of our lives, like, for instance, if you had every event in your life put into the
(57:24):
computer, and they created the narrative of what your life is about, it might be a very different thing, right?
So, in telling our stories, again, we are not just a series of events or data points, meaning is not found in data, right, it's found in telling our own stories.
(57:46):
I don't know if it would be possible to give that, like, that major, sacred task of human life, to a machine that will try and pull the threads of everything that ever happened to us.
Does that answer your question?
Sure.
Bill (58:04):
I'd like to apologize to
the data analysts of Calgary.
Joanne (58:09):
I'm married to a data architect. What do you mean, I understand data?
Well, no, I don't understand data, but I hear about data a lot.
Bill (58:24):
So, Bertrand, I'll ask you, what do you think?
How is AI shaping the way we tell our stories?
Bertrand (58:28):
Again, I see two things here, and I wonder why it's always two. For me it's dualities, left side, right side of the brain.
But I see two, and I'm happy to report one of them is actually optimistic.
All right, so I'm not just going to be... What was it? The wood of suffering? I...
Joanne (58:45):
forget what it was.
Bertrand (58:46):
I'm not going to just be there, yeah.
So, on the one hand, and this is the pessimistic side, I do see it reinforcing our sense of convenience, and so AI is meant to take care of all these things that just are a pain in the butt for us, and we don't want to have to deal with them, and, thank goodness, something
(59:08):
else can deal with that.
Now, a struggle I have with that is, I do see how, in the past, we have done that again and again.
We've even done it to people, right?
So slavery is exactly that sort of thing.
We just don't have to do it.
We just have this other category of people who will do it for us.
(59:28):
The convenience, for me, is driven again by Silicon Valley and productivity.
Now, I've seen this in my work.
In my day job, I teach at Olds College, and I teach communications, actually, and so the students have to learn how to write and present and interact and all that sort of thing.
Yeah, so they all hate my class.
(59:50):
They hate it, yeah, though I will say, they tell me that they like me, but they just hate the class.
Joanne (59:59):
It's a fine line there,
Bertrand.
Bertrand (01:00:01):
I'm telling you, yeah, I'm trying to make it as thick as possible, though, that line.
Bill (01:00:06):
I'm trying to.
Bertrand (01:00:07):
Yeah, so I regularly see them cheating, right, because they just want to get my class over with.
Yeah, and AI, for many of them, was a huge relief, yeah, and so they've just dived right into it, and they're using it as much as possible.
Now, obviously, one of the problems with that is they're
(01:00:27):
not actually using their own brain, and one of the things that I keep trying to teach them is that when you're learning how to write, you're actually learning how to think as well, and I want you to be a good thinker, yeah, so don't avoid that.
Well, these products come along now.
I paused because that was, like, a little anger script went in my
(01:00:48):
mind and I was censoring myself.
Can I say that?
Bill (01:00:51):
Absolutely, say anything you want.
Bertrand (01:00:54):
Yeah.
So these products come along, and what happens is they're actually built into the tools that my students use.
So, for example, they'll open up a Microsoft Word document, and Microsoft Word now has features that say, hey, you don't have to write this, just tell us what you want and we'll write it for you.
AI will do it for you, and there are many programs that are
(01:01:17):
like that now.
So what's happened is that these tech moguls have decided to make productivity easier.
They're just going to embed the product, and for my young students, who know nothing else, this legitimizes the use of AI, and it's harder for them to see a problem with it at all.
(01:01:39):
It becomes a part of their identity as people, like, this is just what we do, right?
Okay, so that's the negative side that I see.
The positive side that I see to all of this is, that's obviously not the only use for AI, and we've even been talking about a few of those, and one of them that we have touched upon regularly, that, I think, bears a little bit more scrutiny,
(01:02:01):
it's worth it, is the way in which AI, and this goes back to Beth Frey. No, Beth?
Aaron (01:02:10):
Yes, yes, yes.
Bill (01:02:11):
Okay, yeah, yeah, it's in green.
I'm not convinced it is, but it's close.
Yes, yes, that's right.
Bertrand (01:02:17):
Yeah, so it goes back to Beth Frey, and she's producing art through AI that is obviously AI, and it's obviously getting what a human being would do wrong, and it's obviously doing that.
And I think the beauty of what that shows is, one relationship we have to AI is that AI can serve for us as a marker of what
(01:02:41):
is not human.
And a lot of the time, what we are doing throughout human history is we are trying to assert our humanness.
We're trying to say, this is what it means to be human.
This is what it means, and we've done it in so many different ways.
Religion is one way, for sure, but you could even say
(01:03:02):
professional sports does that too, and the different markers we have for success, just like our salaries and things like that.
We're always trying to find these ways of doing it, and it seems to me that AI is actually one that we all band together on and say, no, AI is definitely not doing it, and we can do it, and in that sense, it's helping us to tell the
(01:03:23):
story of what humanity is.
And just one final example I'll give.
This doesn't have to do with AI, or does it, actually? I forget.
But, Joanne, you were talking about how technology sort of outpaces ethics, and recently I heard about, and some of you might be able to add context and correct my details,
(01:03:47):
but recently I heard about a group of scientists who have sat down and said, okay, I think we need to actually put the pause button on some of these things, all right.
So, yeah, it's all wonderful that we're charging forward with some of these things, and some of them have to do with, like, CRISPR and genetics, and there's a whole bunch of different areas of science.
Okay, and I think AI might be in there, I'm not sure, though.
(01:04:10):
And they said, yeah, we shouldn't just be running gung-ho with this.
We need to sit down and decide, okay, what are the things that we can just open up and pursue, and what are the things that we should just say no to?
And to me, this feels like a welcome kind of change.
This is a recognition, the conscious recognition, that this
(01:04:31):
is all moving faster than our ethics, and we are the ones who choose our authorities, we are the ones who make these choices, so let's just do it here, and that's very, very heartening for me.
I see some optimism in that.
Bill (01:04:46):
Yeah, I was thinking, as you were talking in your pessimistic phase, about the argument when I was in school.
Not that they just invented the calculator when I was in school, but there was a great deal of debate around whether or not calculators should be allowed in math, for the simple fact that
(01:05:06):
you obviously want your students to understand how to multiply three times three without having to reach for the calculator to make it happen.
You certainly want the person constructing your bridge to be able to do that, absolutely, you know, when they forget their calculator at home on the first day of the job, right?
So, and it seems to be the same kind of thing now with all of these tools, like, I see, when I open up Microsoft Word, Copilot
(01:05:31):
will do this for you, right?
You tell me what it is you want, you let me know. Exactly.
And so far I've never had to use it.
Many people do, though. Many of my friends do.
It becomes the challenge around, like, just because you have the tool. The tool is great once you understand how the tool does the work, right, right,
(01:05:54):
and I would never fault... I still reach for the calculator.
Sure, I don't know many people who don't reach for the calculator, exactly, yeah, but I know that if I had to...
I just raised my hand.
Aaron (01:06:05):
I'm a calculator user as
well.
Bill (01:06:06):
Oh, okay.
We thought you were signaling the opposite, actually. I thought you were going to say you're not.
Bertrand (01:06:12):
I'm a calculator
addict.
Aaron (01:06:15):
Exactly, I thought you were going to say, you're not.
My name is Gary and I'm a calculator addict.
Bill (01:06:17):
Exactly. I do it with pencil, by hand, on a piece of paper, yeah, but it's important to be able to at least know how that happens, and I know that, if push comes to shove, I could do most of what my calculator can do.
It might take me a little longer, but I understand the theory behind it.
So we have these, I mean, these are clash-of-the-titans moments at my dining room table when my kids are doing their math homework.
(01:06:39):
Now, like, they know the six different places there is a calculator: on their phone, on their computer, you know, tucked in the junk drawer, and the constant sort of, like, you can go get the calculator as soon as you can tell me what the answer is. Right, right.
is Right, right.
Joanne (01:06:55):
Do you know?
There's an interesting memoryof mine I think Dave was with me
at the time and it was a longtime ago when the GST was 7% and
we were buying something for adollar and she said okay, now
let me calculate the GST 100times 1.07.
(01:07:17):
Do you know what it was justlike?
Okay?
So here's the thing If we giveover the understanding that 7%
of $1 is 7 cents, are webecoming extensions of the
machine?
Right?
How much of our like?
Automation is an interestingthing to me because it frees up
(01:07:41):
mundane tasks, right?
Do you know?
They say the education systemwe have now was created so that
Henry Ford could have people dothe work over and over
repetitive, icky work, over andover and again, and it isn't
really conducive to creativityand blossoming and finding
yourselves and all those kindsof things, right?
So if we have an educationsystem that's trying to make you
(01:08:04):
an automator, right, that'sessentially what it's trying to
do.
I used to call, when I did youthministry, I would talk about
the cubicle kids.
Okay, the kids in my ministrywho are going to end up working
for an oil and gas companydowntown in a cubicle, right?
And because we would alwaystalk about?
Oh, especially in the UnitedChurch.
(01:08:26):
Oh, do you know?
The person who runs Greenpeacewent to the United Church and,
oh, did you know?
You know the person who doesthis?
There's all these exceptionalpeople who are somehow connected
to the United Church, but mostof the people who come through
are going to be cubicle kids.
But if we don't understand whatwe're doing, if we give
automation, like if everythingthat can be automated is
(01:08:48):
automated and we don't haveanything left, do you know, have
we become the machine and wedon't have anything left?
Do?
Bill (01:08:54):
you know, have we become
the machine, if you listen
really carefully right now youcan hear Ricardo screaming from
the US that he has something hewants to say Labor, labor yeah.
Joanne (01:09:05):
Well, it's very
interesting because what happens
when everything is automated isthat people lose jobs, right,
but Sweden took a different tackon this.
This was from.
We used to have this thingcalled ethical dilemmas here and
it was something we discussed.
They preserved um, not specificjobs, but work okay.
(01:09:26):
So if they were going toautomate something, put 5 000
people out of work, their um.
Responsibility as a society wasto find some different work for
those 5,000 people and not justlay them off and say, see if you
can retrain somewhere and seeif you can do that.
So we don't preserve jobs, wepreserve work is a very
interesting thing, but the wholejoy of AI that people would
(01:09:52):
talk about is that if we canautomate all those mundane tasks
that make us a machine, okay,if we can get machines to do the
machine-like work of our lives,then we'll have more space for
creativity and we'll have morespace to become and being human
will be actually us realizing.
(01:10:13):
You know, as a Christianminister, who God intended us to
be, which is wonderful in ourdiversity and our interests and
everything to discover thatbecause we're no longer a
machine.
You know, that's a possibility,yeah.
Bill (01:10:28):
I mean, I feel like I have to confess that I have an AI vacuum at home that actually does free up...
Bertrand (01:10:34):
Better job than you.
Bill (01:10:35):
Well, it does a better job than me, because I never do it anyways, but it does free up time, right?
And I'm not enslaved to it by any stretch of the imagination, although I do talk to it and it does have a name.
But a task that is important in our house, for, you know, having
(01:10:58):
pets and whatever, that takes about two hours on the weekend to do, is now done, and it's not something that requires a great deal of work or effort on our part for it to be done.
We have to change the water every once in a while and, you know, empty the dust bag when it tells us to, but it actually frees up two hours to be doing other
(01:11:22):
things.
That has actually manifested in going for walks together, or, you know, getting the dogs out to the dog park, or, like, things that are actually way more kind of living than the task of pushing the, you know, the stick vacuum back and forth repeatedly and really, like, trying to get in the corners as best you can in the carpet.
Bertrand (01:11:38):
So I'm just going to interject very quickly, and then we can go on to Aaron.
Because when I was a teenager, and I lived actually in this neighborhood when I was a teenager, my dad had this nickname, and my mom gave it to him, and the nickname was Design Boy.
And she called him Design Boy because, when he vacuumed on our shag carpet in 1981, he would make these beautiful symmetrical
(01:12:02):
patterns on the carpet, and none of us wanted to even walk on it afterwards.
We were so impressed.
Bill (01:12:10):
That's how I used to mow
my lawn.
Bertrand (01:12:14):
So I guess the question is, is it really a task, or have you just not found the creativity in it that is available to you?
Bill (01:12:22):
You can try to market it however you want.
Aaron (01:12:26):
That is peak vacuum, I have to say, where it's so nice you don't even... In fact, that promotes even further cleanliness, of no one stepping on the carpet, just stepping around the sides of it.
Bill (01:12:39):
So how is AI shaping how we tell our stories?
Aaron (01:12:43):
Well, there's so many things.
Okay, there's so much stuff.
All right, I'm going to open with: we were talking earlier in the break about the movie Her, that stars Joaquin Phoenix.
It sits in my heart as one of my favorite films, about an automated girlfriend that's just on his phone.
(01:13:07):
It's not like a physical relationship, they just talk all the time with each other, and I think it's Scarlett Johansson who is the girlfriend.
Bertrand (01:13:17):
Which is why they used her voice afterwards, right, and that, yes, yeah, the robo-girlfriend.
Aaron (01:13:22):
Exactly, yeah, but that's actually, it's already happened.
I heard it on the Guardian, and it was, I think it was somewhere in Europe.
There was an app that was an AI partner, and at some point, I
(01:13:45):
think they were, it was beyond beta testing, but they were still sort of, like, trying things out, and then they decided to shut it down.
And the amount of heartbroken people... People were heartbroken.
They were, like, where's my partner? Where's my, you know, my friend, my soulmate?
Like, they were apparently having just these really intense moments.
(01:14:05):
So I guess, in terms of, like, story, the two words that come to my mind are, and one of them I associate very closely with church.
(01:14:27):
The first one would be just the idea of voice. Artists trying to articulate why their work, that they do by hand, is more important than some robot, some machine, spewing out an image that's been assembled, obviously, from an aggregate of, you know, four billion images.
(01:14:48):
Um, and so the element of voice, that's the best description I can think of, of why human-created art is important, in that, partly, because each individual does
(01:15:08):
have an individual, unique voice.
Each person has a unique style of drawing, even the most photorealist drawings.
If you put them side by side, you'd notice their take on photorealism as an art form. Well, but also, just the person who uses the camera is also expressing a voice, in terms of
(01:15:32):
what they themselves see as a person, as what they're experiencing, and it could be a moment in time where they just looked around and said, I got to take a picture of that, like, right now.
Or, you know, and I guess, so that's the other word
(01:15:52):
is discernment, which is one that I think about a lot lately, in terms of, well, in part, like, when you talk about your students and getting them to not lean on the tools, to use their
(01:16:13):
own minds.
Because, well, in part, because this itself is a very complicated computer that we have in our own heads, right, and the ability to use that, like, critically, is, yeah, I can't
(01:16:33):
even describe how important that is.
It's like, especially in times where we are faced with value decisions. And in a lot of cases, because of social media,
(01:16:53):
I think a lot of voices are concentrated in one place, and yet still it's important that we're able to decide: well, am I on board with this voice, with these voices? Yes, they're all the same and they all agree with each other, but what, in fact, is the alternative?
(01:17:19):
It could be the choice between a very direct, straight road and a forest path where maybe there's some winding to do, or it's a little more narrow and nuanced, or takes more effort to get to the other end of the solution.
(01:17:39):
So, particularly, yeah, in issues of conflict, issues of disagreement, of how we address them. I mean, this is a very surface-level example, but, you know, I had neighbors who, instead of shoveling their own walk,
(01:18:00):
would walk across my yard to walk on the walk that I had shoveled for myself, which is just the most NIMBY-ish, intensely first-world-problem, white male Anglo-Saxon Protestant kind of issue. Like, what are you doing?
(01:18:25):
And yet the solution was found in just thinking it through: okay, how am I going to address this? Because it's going to drive me crazy, a bit like having, I don't know, a fly buzzing around my head. The solution was ultimately found in just shoveling their walk and discovering that 30 seconds of my time to just take
(01:18:48):
the snow off their walk was, well, you know, I like being outside, I kind of like shoveling, and it just did it. I didn't say anything, I just started shoveling their walk, and I received a lot of thank-yous over time, and eventually they responded in kind by shoveling the walk.
(01:19:09):
You know, I could have written him a letter and posted it on his door. I could have done a lot of different things. There are a lot of different ways to solve that problem.
Joanne (01:19:17):
Call the bylaw officer.
Aaron (01:19:19):
Call the... well, yeah, I had my phone on the number. But so then this takes me to another story, from On Being, hosted by Krista Tippett,
(01:19:43):
another great podcast, where she interviewed a man, a young man, anyways, young or old, a man, a fella. He was in college and, sorry, he was Jewish, and he met the son of a very prominent white supremacist. And the story is that he invited this son.
(01:20:03):
They were both students, they were students at this school, and he decided to invite this person to his Shabbat dinners. Shabbos, Shabbos dinners. Like, very just, hey, come on, come eat with us, just come and sit with us. And it happened over the course of two years.
(01:20:26):
There were lots of conversations that occurred, you know, with his friend group, et cetera. He would obviously tell a better story; you could probably look it up on On Being. And after those two years, he renounced his white supremacy, his roots. I mean, this is in his family, basically.
(01:20:48):
Now, what's interesting, at least the first thought in my mind, is he probably couldn't have invited just anybody. Like, not just any son of a white supremacist may have been a suitable candidate to come to these Shabbat dinners. He obviously looked him in the eye and saw something beyond just the whatever, that said, I think I could invite this guy. Yeah,
(01:21:12):
I think this guy's a candidate to come over. And so those are, like, those are just two examples of the discernment of, I mean, what we might call a soft skill, but of, like, reading the air, of reading the situation. You know, those are things
(01:21:35):
where there may be six to ten paths in front of you. He could have said, I'm not talking to you ever again, you're anathema. And maybe there would have been people where he should have said, you're anathema, I can't do this, or you're not safe, or something.
(01:21:59):
And yet, at the same time, we are, of course, navigating movements where people have taken strong stances in certain moments. But to be able to discern when, you know, when is it time to welcome someone in, versus when is it time to close the door? When is it time to send someone to jail?
(01:22:24):
When is it time to throw away the key? When is it time to let someone out of jail? All of those, those are difficult to put data towards, basically.
So that's, I think, yeah, that's an area, but that's also
(01:22:44):
an area that we have to cultivate. We have to cultivate it with our children. So I would close on the idea of, well, I think temperament is a Roman term that describes the hammering of metals and the process of mixing metals by hammering them into
(01:23:06):
the sword. And it's the idea of holding different emotions at the same time. So someone being of good temper can manage the various emotions flowing through them, and with a bad temper, you know, that stuff sort of comes out.
(01:23:27):
Again, those are skills, those are just human skills that we have to cultivate, especially when we have a phone in our hands almost half the day now, to go forward, just to make better decisions and to, again, read the air.
(01:23:49):
And I'll close with that, or I'll end on that.
Joanne (01:23:52):
So, again, I think that it's a very interesting thing, because something that is essential to the human experience is feelings of remorse, regret, shame in some ways, the ability to forgive, to see beyond data points, you know, to sense all these things.
(01:24:13):
So AI may give us unlimited knowledge. You know, what is that, 1 Corinthians 13? If I have the gift of knowledge and know all things and prophecy and all those things, and have not love, I am a sounding brass. And I think that there are these essential, again, sacred, essential human characteristics,
(01:24:35):
like shame, forgiveness, remorse, regret, joy. Those things that we can grow as our being is very different than gathering knowledge. Right? Becoming is a very different experience than learning more. And so AI is a tool to help us learn more,
(01:25:01):
faster, better. Okay. But if that takes over our becoming, if we no longer have the ability to feel guilt or shame, or we don't experience regret, we have lost our humanity in ways that we can't be rescued by a machine.
Bill (01:25:21):
Yeah, I'm reminded, there was research that was done back when Facebook first came out, when everybody was on Facebook, and they studied teenagers, junior high and senior high teenagers, and their posting habits on social media. And what they began to learn was that if you were a teenage girl and you
(01:25:41):
posted a photo and you were posed in a certain way and it generated 50 likes, and then you posted another photo and you posed in another way and it only generated 20 likes, then you would see, over time, the more likes, the more you posed in that way, until all of your poses were exactly the same:
(01:26:02):
only from the left, both thumbs up, like, whatever the magic kind of equation was. And what they actually started to do was to remove what they assumed were characteristics or things about themselves that didn't fit the mold or didn't generate love in some way.
(01:26:23):
And the problem always ended up being: you can love or you can simulate love, right? And it's not the same thing. And we learned it even in the pandemic, right, that when we couldn't be together, we could simulate community, and it would do many of the same things we needed it to do, but it was not actually a replacement for authentic, you know, in-community
(01:26:47):
presence.
And so, Bertrand, in your poem The Bow, you write about the river's deep connection to the land and its layered history. And the quote that I actually really loved was, "it is tongued and grooved, the firmament baby of this last best."
(01:27:08):
And I find that so much of your poetry actually attends closely to embodiment, whether it be, like, physical people and embodiment, or just the weight of place, or the rhythm of language, or the physical memory that is carried
(01:27:30):
in the land or in space. And so I'm going to put it to you first, but it's a question for everybody. Because part of what AI is doing, and we see social media doing it, technology in general, all of these things, but certainly AI, is shaping our world by disembodying intelligence and knowledge and even history and
(01:27:56):
our sense of selves. So what do you think truly gets lost about our humanity when we hand over more of our lives and our identity to this disembodied thing that doesn't feel and doesn't remember in the same way, or take up space, or be embodied the way that we are?
Bertrand (01:28:21):
Yeah, it's a very interesting question. I think that there's a paradox at the heart of it, in fact. Because I think what happens, and what we're seeing these days, is that AI, and I'll go back to my students to describe this example, so they have all these products that just have AI built
(01:28:44):
into them. They see it as a way of just easing the burden of their lives, so they don't have to do certain tasks. They're like robo-vacuumers, right, they just want it done and don't want to think about that. They're certainly not discerning because of that. In fact, it's a vicious cycle that happens there. But on top of that, I think this shows that AI has been
(01:29:06):
accommodating a, what's the word, a general development, a general social development, that social media began, which is this sense of idealism and perfection. And all these students feel like they're just supposed to be
(01:29:26):
there already. And why I called this a paradox, why I feel there's a paradox at the heart of this, is because, you know, Joanne, you said, if we don't feel shame or, you know, these kinds of ugly things, how can we be human? Exactly. And in part, I think my students do feel those things because of AI as well. Yeah. And this is why I feel like AI
(01:29:50):
is also this interesting marker for us of what is not human and what is human at the same time. Because, yeah, they feel inadequate, they feel as though they're not producing as they should be, and AI is just relieving them from that. But it's also confirming: oh, you can't do this, right, you're not able to do this. Yeah. And it just fits into everything
(01:30:12):
that they've been experiencing from a very young age, going onto social media and seeing that ideal image on Instagram, or whatever it is these days that they're looking at, right, and feeling inadequate.
So I do see this disembodying as a problem. I do. And for me, the answer is things like poetry and things
(01:30:33):
like that. For sure. But honestly, poetry is not doing anything that different in this sense than... well, I shouldn't say that. Okay, I'm going to try this out. This thought just came to me, let's try it out. I don't know why I looked at you, because I think we all thought you were going
(01:30:55):
to be the mathematician. And then, yeah, so, podcast listeners, I was just looking at Aaron just now, who put his hand up when we were saying that.
Aaron (01:31:04):
It was a downhill trajectory from about grade nine.
Bertrand (01:31:10):
Grade nine, yeah, that's not bad.
Aaron (01:31:13):
Just a downhill slide,
yeah.
Bertrand (01:31:15):
A lot of people, as soon as they get out of elementary school, they're done right at that point.
Bill (01:31:19):
So yeah, the tragedy is, I
might actually be the
mathematician on this panel.
Bertrand (01:31:27):
So maybe this is a slight defense, too, of your kids' use of calculators. Maybe. We'll see. But I see parallels between the two, between poetry and math, and these are social parallels I see, in that nobody really knows what poetry does exactly or how it works, and most people
(01:31:49):
don't even read poetry, honestly. Yeah, well, for example, when I say I'm a poet, I get all this respect, and, like, they start bowing down to me, right? And in part it's because they know there's such a thing as poetry in the universe. They don't know what it does in the universe. Yeah, they know they can't do it, and so they're super
(01:32:12):
impressed when they meet someone who does this totally useless thing that they don't know what it does. And for me, this is exactly how we treat math as well, right? So we don't really know what it does in the world. Most of us don't really know. We took it in school and then we're done with it after that. If we meet someone who actually uses it, say a rocket scientist,
(01:32:34):
we say, wow, that guy's a genius, and holy cow, that's amazing. And I don't know how he does it, I don't know where he does it, but I can't do that. Genius, yeah.
Now I forget why I was bringing that up. What's this analogy for?
Joanne (01:32:48):
We were on embodied things. Yeah, something about how great...
Bertrand (01:32:51):
Oh yes, poetry. That poetry might just do the same thing that AI is also doing. Because, let's face it, folks, some of these mathematicians became coders as well and helped to produce AI, in fact.
Joanne (01:33:04):
Yeah.
Bertrand (01:33:05):
So yeah, I think that, paradoxically, poetry is a kind of grounding force. I think it is. I think people will go and, you know, not everyone, but some people will read it, and lots of people try. So, for example, my colleagues at Olds College, none of them are into poetry, and they all read my poetry. Like, they all read it, and
(01:33:27):
they come back and they talk to me about it, and they love it, right? So maybe we don't get it, but we read it and it does something for us, right? And we're prepared for that. We want it to do something for us. But I am wondering if, you know, maybe there is something, maybe in the future or in some way, in which we will think that
(01:33:47):
AI does do something for us as well. We've already talked about chatbots and how people have felt connected to these things in ways we never would have thought they would. What was your question again?
Bill (01:34:04):
What do you think we lose as we give more and more of these embodied kinds of elements of humanity over to AI?
Bertrand (01:34:11):
So I do think that we lose our sense of place. For sure, I think we do, and that is very important to me.
(01:34:37):
It's what all my poetry is about, in fact.
Bill (01:34:38):
Yeah, I guess what I was speculating on is, you know, I just wonder if I'm right. Like, maybe we don't lose it, maybe people find other ways of connecting to these things through AI. I just don't want them to. Maybe it's just that.
So, one of the things that you said that I want to sort of jump on, because it landed in my wheelhouse right there, is this idea of the shame that comes from AI being able to do things that you can't do, right. And again, this constant drive, whether it be, you know, what photo angle is
(01:35:03):
going to get you the best looks, or even sitting in parent-teacher interviews and questioning, like, why are we talking about the letter grade more than we're talking about the potential or the character of the student? And all these things, like, we live in a day and an age
(01:35:23):
where, I would say, what started with teenagers, you know, finding that perfect angle that was going to tell people they were perfect became something that a group of savants figured out how to capitalize on, in a way that now we have artificial
(01:35:43):
intelligence that tells you what you need to do to be more perfect. Right? In a world that will constantly tell you you will never be enough, you will never have enough, you will never be successful enough, you will never be fast enough, smart enough, rich enough, like, any of it. Right? And underneath all of that is still this understanding that we worship a God that says, like, you are enough as you are.
(01:36:07):
Just thinking that, right? Whatever angle I'm looking at, you are perfectly made, right, and, like, whatever you bring to the table of your true, authentic self is exactly what I wanted brought to the table, and I made you this way for a reason.
Bertrand (01:36:24):
I think that's right. I think that's the crucial difference right there. So we may have AI that says you're not good enough, you're not good enough, but it's to make you feel bad so that you will just continue to use the product. And then we have the other side of things, which is God, or just whatever we think of as more human, that you are getting at, Joanne, which is, yeah, okay, you pick your nose and you
(01:36:46):
trip and you can't tie your shoes properly, but that's perfectly fine.
Joanne (01:36:51):
That's it.
Bertrand (01:36:53):
That's all you need to do, that's all you need to be. Everything is good, and so that's the crucial difference. I think that is right. Yeah, yeah.
Joanne (01:37:01):
So, as I was sitting here hearing this conversation, I remember being in an ethics class, and about how important it is to have an examined life.
Bertrand (01:37:11):
You've taken a lot of
ethics classes.
Yeah, I loved it.
Bill (01:37:14):
We also learned last month
she wrote a lot of papers about
weird kinky things.
Yes, that's true.
Joanne (01:37:21):
In seminary, I wasn't really interested in traditional theology. But, Bill, you were talking about how people construct themselves according to what computers, AI, tell them to be: what is their best self?
(01:37:41):
And that is certainly not something that is new, for us to try and construct ourselves based on outer feedback, right? So an unexamined life would say, tell me what I need to be in order to be acceptable to you, which is what, you know, AI is doing for folks. Tell me who I need to be, what I need to look like, what I need to buy; in particular, in our culture, what do I need to buy
(01:38:02):
in order to be acceptable? And people give away their agency to outside forces all the time.
It reminds me, I was thinking of this movie, that's what I was looking up, called The Shape of Things, where this, you know, kind of dowdy guy, who's Paul Rudd (how Paul Rudd could ever be dowdy, I don't know), but he's, like, this unassuming guy, and he
(01:38:23):
meets Rachel Weisz, who's very beautiful. They meet in an art gallery or something, and she starts to, you know, tell him, well, maybe you should... I mean, women make over husbands all the time, right? I mean, maybe you should wear this shirt. Oh, your hair would look better this way. Contacts would be great. And she starts to shape him, and he becomes very influenced by
(01:38:47):
her, and at the end, he's been her art project. Right? She has decided this is her art project: to make over this person.
So the idea that it's only AI or machines that rob us of our agency is wrong. It's historically happened all the time.
So the most important thing in living a wholehearted human life
(01:39:11):
is an examined life: to be critical of people who are telling you things, as well as machines that are telling you things. You know, to think about it: does this person bring out the best in me, or does this person limit who I am by trying to control me or put boundaries on me? So there is something essentially human about being
(01:39:33):
very intentional and discerning, that word again, about who our authorities are and who we listen to. And, as Christian folk, we always go back to: we have been created in the image of God, and what brings love and life to us should be encouraged, and what squelches
(01:39:59):
our ability to become, to experience joy, to experience happiness, to love, those things are destructive and we need to name them, whether they're machines or not.
Aaron (01:40:12):
Well said. Okay, read the question one more time.
Bill (01:40:21):
What does it mean, or what do we lose, when we hand over more of our lives to machines that do not feel, remember, or take up space the way we do?
Aaron (01:40:31):
So I've been thinking about, let's see... What I've been thinking about is, in part, and maybe this is a broader social media picture as much as AI, perhaps, what we end up having access to. I mean, for example, I love Instagram because I do get
(01:40:56):
access to more art generally across the world, like different stuff, historical stuff. There are even sites that are
(01:41:25):
dedicated to historical artwork, images of people that I will never, obviously, meet, but also images of places or
(01:41:55):
contexts that aren't that useful in some ways, and that might very well pull me away from the context that I'm in. Or, to look at the context I'm in and, in part, breathe into the fact that, like, I'm here, I have the relationships that I have here because of living in the place that is Alberta, or Western Canada, if that gives a big enough picture. And some of those meetings are chanced, by happenstance; some of them are much more developed in terms of just being in regular contact
(01:42:20):
with people. And so I guess, and so I think, yeah, the way in which we might, and we do, want to learn about the rest of the world and try and expand that horizon, and, at the same time, find ways of still being centered,
(01:42:44):
centered in the present, yeah, in the present, in the room that we're in, versus perhaps being just somewhere else.
You know, we're in our phones, or again, you know, exposed to... and it's not that
(01:43:06):
exposure is bad, because we need to see other cultures and recognize the pieces that form culture, you know, environment and history and, yeah, just basic weather, things like that. But to not be pulled out of, to continue to be able to come
(01:43:26):
back to, and not be drawn away from, our context, which might be a flat prairie some of the time, where the wind's just blowing across and all you see are the waving, you know, stalks of grain. He's going to get poetic right now. Ooh, I'm getting close there, I got close.
(01:43:47):
You know, sometimes, yeah, sometimes the present can be pretty boring, like it can be uneventful, and yet being centered there, you know, just allows, well, in part
(01:44:07):
it does allow for us to loosen our grip on perhaps some of the things we're wrestling with. And sometimes letting go of something for a while allows it to sort of change a bit, and when we come back to it, it's maybe a little less intense, easier to manage. Being present just in the
(01:44:39):
room with the people we're with, looking at each other eye to eye, versus being sort of immersed in... I mean, even without AI, our technology, our devices already, you know, suck us in.
So this is just another, you know, layer of that, where it might feel like an answer,
(01:45:03):
you know, an answer machine, a great answer machine for all the great answers. And by all means, I think there will be excellent, very useful things; you know, the intense ability to process data will benefit medicine, there'll be benefits for medicine, and there may even be time savers where, you know, we need a little blurb for a
(01:45:23):
poster or something and we don't want to take the time, or, like you say, certain areas of automation that we hopefully are intentional about. But, yeah, definitely, with so much noise, you know, just the importance of, again, bringing oneself back
(01:45:45):
into the room. And as an introvert, I can say sometimes that can be very hard. It feels like you're pulling yourself back, you're pulling yourself away from, you know, whatever's got you, whatever you're gripping onto.
Bill (01:45:55):
So, yeah, I feel like we've reached a good place to kind of stop the conversation for now. We do last thoughts, so I'm going to start at this end of the table and work my way this way. Joanne, your last thought for the
(01:46:15):
evening, on everything we've talked about tonight.
Joanne (01:46:31):
Well, it just strikes me that living life as a fully formed human requires a great amount of intention. How we use the tools that we have created, the tools that are shaping our future, is probably the most important question to ask. Not what they can do, not how much they can do, but what is appropriate for us to give over to AI, and what is appropriate for us, as humanity, to hold tightly, because it is
(01:46:54):
everything we are.
Bertrand (01:47:03):
Well said. Again, I was going to say those exact words, but with a poetic form.
Joanne (01:47:10):
Yeah, exactly.
Bill (01:47:12):
It just would have sounded
so much better.
Bertrand (01:47:16):
Yeah, that's very well said, I think. My final thoughts... this has been an excellent conversation, by the way. I really enjoyed it. I feel as though much of what we're dancing around is how hard it can be to just be a person. You know,
(01:47:36):
you used the word intention, Joanne, and I think that's very true. And I think what these AI tools are helping to show us is, on the one hand, yes, that we want things to be easier for us, but also that we do need to work a little bit at things like being ugly and making mistakes and tripping and falling and
(01:47:58):
imperfection, all of that. And we actually have to work at that, and not just at doing it, because I know I'm an expert at a lot of those things, believe me. Yeah, but the work is in being okay with it and just recognizing you're at your place. Yeah. And that is the big battle that we have against AI. I think, ultimately, what we will see as human, or what's
(01:48:24):
essential for being human, may change, and we have to be okay with that, so long as we can be. That's my final thought.
Aaron (01:48:38):
I think, yeah, I just think we need to constantly remind ourselves of all of the things in our world and universe that we can't see, still, yet, and that there will be an infinite number of things
(01:49:00):
that we will always, even, you know, even well into the future, there will always be humongous wide stretches of things that we just can't see, that are outside of our vision, that are outside of our understanding and knowledge. And part of being human is living with that. And even as we have these, again, these answer machines or
(01:49:27):
whatever, that can feel like they are going to sort of wrap up all the questions and give us tons of answers, give so many of the hard questions good answers, we will still always have to struggle with, or live with, not knowing, basically.
Bill (01:49:46):
Well, I want to say thank you to everybody tonight: to our live audience that came here tonight, to Joanne, obviously, for being a mainstay always in these, and especially to Aaron and Bertrand for your presence tonight, because it has been fantastic to have both of you here, and this has been an amazing conversation.
I also want to give thanks to the United Church Foundation for
(01:50:09):
their support of this podcast. And my final words are: whatever answers may lie out there, whatever we know and don't know, wherever the machines try to take over our humanity, know that, at the core of it all, you are enough as you are. You don't need to know
(01:50:30):
everything, you don't need to be perfect, because you are perfectly made as you are, and we love you as you are. So, with that, I'm Bill Weaver, this has been Prepared to Drown, we are signing off, and we will see you next month.
And that's a wrap, friends. If your brain is spinning, your heart is stirred, or your faith feels a little more complicated after that
(01:50:52):
conversation, then good. That means we're doing something right. Prepared to Drown is recorded live every month at McDougall United Church in Calgary, Alberta, with a real audience, real questions and real coffee. So if you're in the area, come and find us, because we'd love to have you in the room. You can listen to past episodes and keep the conversation going by joining us on Patreon or by checking out preparedtodrown.com.
(01:51:13):
And before we go, hear this: your humanity is not a bug to be fixed by some algorithm. You are not data. You are not disposable. You are a beloved, messy, glorious being, and you are enough exactly as you are. Until next time, stay curious, stay kind, and remember that
(01:51:34):
grace doesn't require perfection, just presence.