Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Scott Allen (00:04):
We're in an age right now where technology is a different kind of thing. It's shaping us. It's shaping who we are, how we understand ourselves. It's being used as a tool of control, mass control. All of that, I think, has to be seen in the backdrop of this discussion on AI.
Luke Allen (00:28):
Hi friends, welcome to Ideas have Consequences, the podcast of the Disciple Nations Alliance. Here on this show we examine how our mission as Christians is to not only spread the gospel around the world, to all the nations, but our mission also includes being the hands and feet of God, to transform the nations to increasingly reflect
(00:50):
the truth, goodness and beauty of God's kingdom. Tragically, the church has largely neglected this second part of her mission, and today most Christians have little influence on their surrounding cultures. Join us on this podcast as we rediscover what it means for each of us to disciple the nations and to create Christ-honoring cultures that reflect the character of the living God.
Scott Allen (01:07):
Welcome again, everyone, to a new episode of Ideas have Consequences. This is the podcast of the Disciple Nations Alliance. I'm Scott Allen, I'm the president of the DNA. Joining me once again are my co-workers Luke Allen and Dwight Vogt. Hi team.
Dwight Vogt (01:22):
Hi Scott.
Scott Allen (01:23):
Good to have you guys.
We're going to do something that we've done before. We don't do it very often. We're just going to kind of process together, kind of like we were sitting around the water cooler talking about something. We're going to do it live here on the podcast.
(01:44):
The subject is going to be artificial intelligence, artificial general intelligence, AI, obviously, something that is huge now and growing just on steroids. It seems to me like the next jump. You have these technological jumps we had in my adult
(02:07):
lifetime: the Internet, the jump from the dumb phone, the flip phone, to the cell phone that we all have in our pockets, the smartphone, and then this jump to artificial intelligence. We keep having these kind of quantum jumps, big jumps in technology, and they're very powerful technologies, and, as we
(02:31):
learned with cell phone technology, they're not just like old technologies, like a shovel or a backhoe. These technologies, these tools, if you will, have the ability to kind of shape us. In a sense, they shape how we think. They kind of mess around with our psychology a little bit.
(02:53):
Now, as we saw with the cell phone, with social media, it changed the way, in some ways, people's brains were wired a little bit, so we're in kind of a new realm. AI is that way, a little bit, or a lot. I don't pretend to understand it. I'm just like a lot of you, I'm just beginning to try to get my head around it.
(03:14):
Obviously, it's now around us. Anytime you use your search engine, it immediately is tapping into it, and so we've moved beyond just the pure search to AI. It's going to increasingly be around us and be something that is going to be pervasive in our lives.
(03:34):
So then the question becomes, for Christians who are trying to live faithfully and honor God and live according to a biblical worldview: how do we think about it? Or maybe better yet, how do we begin to think about it? What questions would we ask? How would we begin to sort out our thinking around this? And that's what we're going to try to do today,
(03:54):
a little bit live on the podcast. Just putting my cards on the table, I have a really shallow understanding of it. I am beginning to use it, like a lot of people are, but I do have some thoughts on how you can begin to approach thinking about it. What questions should we be asking?
(04:17):
How do we think about it within a framework of biblical truth, the reality of God, the way that God has created human beings, what it means to be a human being, who we are, what our purpose is? Those kinds of questions. So, Dwight and Luke, what would you add to my little setup
(04:39):
spiel here for what we're going to get into today?
Dwight Vogt (04:43):
I would add that I know less than you, which sounds dangerous for a podcast. I feel like I do. What's my first reaction? I'll just give one. You know, I use ChatGPT, or AI Google Search in terms of search, and I think, is this going to dumb me down?
(05:04):
Am I going to start cheating and just using, you know, ChatGPT for everything I do now, and I'll have no brain left? So you know, it's a crazy reaction, but that's one.
Scott Allen (05:17):
Yeah, and what do you... You know, you probably have a sense of the answer to that one too, Dwight. I'm afraid it will. Yeah, exactly, I think it probably will, based on the last technology. You know, people, myself included, right, we don't know how to read maps anymore, because we've got a cell phone that's got that map already in there for us, and literally we can't get around without it.
(05:40):
Or whatever it is, these technologies we've become utterly dependent upon, and so that's a concern for sure. This is probably going to move that forward, is my guess. Right? It's going to dumb us down even more, or has the potential to. Or we become dependent on it, which is a little worrisome,
(06:02):
yeah, overly dependent on it. So, yeah, Luke, what would you add to the setup here? And Dwight's already diving into the processing here.
Luke Allen (06:07):
So, yeah, a lot of thoughts. We wanted to have this discussion. I wanted to have this discussion, not because by any means we understand what's going on right now. Right, you are not going to get a techie, nerdy analysis of how all this works from us. Not at all. What do we do here on Ideas have Consequences? We talk about ideas and we talk about consequences.
(06:30):
We talk about worldviews, and you can do that when applying it to something like this. It's actually really fun, the more I think about it, how much a worldview analysis has to play in this conversation, needs to play in this conversation. And if Christians aren't bringing that to the table, as I don't see many doing, who's going to do that? So we need to start thinking about this. And my reaction as
(06:53):
of two years ago was, I can't, it's too much, it's too complicated, I don't get it. But I was thinking about it from a technological standpoint. If you think about it through a biblical worldview standpoint, there's plenty we can start with here. We can start with the big worldview questions that God's given us to ask about everything. How to, you know, take captive every thought and renew it, in a way.
(07:19):
So we can start there. I would also say, for those people who say, I'm just going to stay away from this, I'm going to bury my head in the ground: we're way past that point. AI is here. You're using AI, I'm using AI, we've been using it for a while now. People immediately think ChatGPT or robots driving cars. Yes, that's AI too.
(07:40):
But you've been using Google Maps for a while. You've been using Google Search for a while. I guarantee it, you probably have an iPhone in your pocket right now tracking you. All of these things are AI, so it's here. I know of very few people that have stayed completely away from the digital age.
Scott Allen (07:55):
Explain that, Luke. I think that's actually a helpful piece to put on the table at the beginning. It's not a technology where you go, okay, I'm going to use it now, I'm going to turn it on, you know, like my air conditioner, and now it's running. It kind of sits there in the background on top of things we're already using, but we may not even be aware that it's behind that.
(08:15):
Right, like you said, the search engines. Now, give some examples of how it's around us and we're using it, but we haven't turned it on, so to speak.
Luke Allen (08:27):
Yeah, I hear people every once in a while who say, like, I'm just going to stay away from all this for the sake of my privacy and protection. That is extremely hard to do, and if you have a smartphone in your pocket, your privacy more or less is gone. Unfortunately, your IP address is tracking you everywhere you go. Siri is listening to you in your pocket.
(08:48):
I can sit... oh, there we go. My Siri just turned on right there on my phone next to me.
Dwight Vogt (08:52):
Yeah, I have Siri turned off. There you go.
Scott Allen (08:56):
Great case in point. Yeah, the closest I guess you can get to turning it off is to turn off your cell phone, or maybe even put it in one of those bags that prevents it from sending out a signal. I forget what they call those bags.
Luke Allen (09:06):
Yeah, there's some stat, though, that if you're out in public, you're crossing a CCTV camera...
Scott Allen (09:15):
Every 30 seconds, or every three minutes, one or the other. Turn yourself off, turn your laptop off, you're still living in the world of artificial intelligence through the cameras that are out there watching you.
Luke Allen (09:25):
Yeah, so it's here. Avoiding it is very hard to do, and I doubt anyone listening to this is avoiding it completely. So it's not the question of can we avoid it or not, it's how do we live in this new world, and how do we think about it through a biblical worldview?
Dwight Vogt (09:42):
Well, I want to start with the big one then, and Scott, you alluded to this before our conversation. It's, who's behind it, and what is their worldview, and are they demonic? It's a strong word, I'm sorry. Who's crafting this, and for what purpose?
Scott Allen (10:04):
One of the first questions that I want to ask as I'm processing something like this: it's a technology, which means it's created, and that means somebody, or somebodies, created it, and I'd like to know who those people are. And I don't. Like, if you ask me who are the creators of artificial intelligence, I kind of generally have some thoughts on that.
(10:26):
I know that it was kind of birthed out of Silicon Valley, you know, where so much of this new technology is coming from. Obviously now it's around the world. You've got people developing it and improving it all over the place: China, North Korea, Europe. So it's not just a Silicon Valley thing, but I think the font, the fountainhead of it, was in a place like that.
(10:50):
I would like to know who the key creators of it were, the key innovators' names, and then I want to know about them a little bit. I want to know particularly about their worldview. One of the most important things you can know about anybody, myself, yourself, everybody, is the answer to the
(11:14):
question: what do you believe about God? Who is God? And I'd like to know their answer to that question, because that's going to help me understand this. You have to understand it as a technology with creators, and what is their worldview? So that would be, again, I'm not saying I know, I'm just saying that's a really important question to ask. Are they
(11:36):
demonic? I mean, you know, they're human beings, right? So then you would answer that question from a biblical worldview framework, and you would say they're human beings, and that means they're fallen, right. Okay, it doesn't mean they're as fallen as they could be, that they're utterly demonic. They're not Satan, they're human beings.
(11:57):
But the picture that Paul paints of what it means to be human post-fall, you know, most clearly is laid out in Romans, chapter 1. They've exchanged the glory of God, you know, for lies, deceptions. Again, we're not as fallen as we could be. All human beings still retain, you know, the
(12:18):
image of God, the glory of God. All human beings are objects of God's love and mercy and forgiveness. So, I mean, I'm saying just basic things you guys obviously know, but I'm responding to your, are they demonic, question, I guess.
Dwight Vogt (12:37):
My question to that is, you know, I look at it and go, well, it's just a tool, it's a technology, and the question is, some people use it for bad and some people use it for good. Is it more than a technology? Is that what you're saying?
Scott Allen (12:53):
Well, I think, to answer that, I would have to ask, you know, what is technology? And how does that whole discussion of technology intersect the biblical worldview, Dwight? You know, when we think about biblical worldview, we're thinking in a certain way. We're thinking, for example, about... there's a couple of things
(13:13):
that help me a lot. One framework is the practice, or excuse me, the paradigm, the principle, the policy, the practice, right. That's a framework for thinking worldviewishly, at every level.
(13:33):
So, when we're talking about a technology, you're talking about something at the level of a practice.
(13:54):
You know, if you will, it's a tool, it's a practice, it's something we use to work in this world with, but behind it there are principles, and behind those principles there's a paradigm. So that begins us to think worldviewishly about it. I think another thing, Dwight, that helps me when I think worldviewishly is the relationship framework, that God at the beginning created us in these four relationships: our relationship to him, our relationship to one another, our relationship to ourselves and our relationship to creation.
(14:14):
When we're talking about technology, we're really talking about that last relationship, that we have a relationship with creation. God made us creative and we can create, and he wants us to create, and we can create all sorts of things. That's part of what it means to have dominion. God created us to have dominion, so we're dealing with that
(14:35):
level of dominion, technology. So I've got to kind of think about it in that framework. Now, is it beyond a technology? It's an interesting question, Dwight. I don't quite know what to think about that or what to make of that. You know, is there a point at which things that we create as
(14:57):
technologies move beyond technology?
Dwight Vogt (14:59):
Yeah, and I'm limiting it in my mind to a technology, in the sense that it's a fantastically powerful technology, one that, like smartphones, can actually change the way you think and affect the brain, if we've spent 12 hours a day on a screen. But it's still a technology. And then my question would be,
(15:23):
even like the Tower of Babel...
Scott Allen (15:31):
You know the fact
that they could build a tower
wasn't a sin.
Dwight Vogt (15:33):
Yeah, that was a technology, and they took that technology away with them. Even when they had to speak in other languages and were dispersed around the world, they could still build, so that wasn't the problem.
Scott Allen (15:42):
The problem was what they were intending to do with that technology, which is to build a tower that would reach God or make them gods. Correct, yeah, exactly. It was a technology that was used as a form of human rebellion, this idea that we ourselves can essentially be God.
(16:02):
That's the old, ancient lie from the Garden of Eden, right?
Dwight Vogt (16:07):
And you see that, you can be...
Scott Allen (16:09):
God. You don't need God, you can be God. And that's been the human temptation all the way down through time. Just a couple thoughts on technology as I'm thinking a little bit about this. First of all, creating new technologies is something God intends.
(16:30):
That's not a bad thing, that's a good thing. That's part of our image, you know, being made in God's image, and technology can be used for all sorts of good purposes. I think of, you know, Vishal often talks about this. He uses the example from Uganda and how, even to this day, he can go to certain villages in Uganda and he sees
(16:50):
women having to walk five miles down to the river to carry this heavy load of water on their head to irrigate their crops, and how, you know, it's so dehumanizing for them. Why don't they, like other places have done, create technologies like canals and water systems to move the water
(17:16):
technologically from that source to where it needs to be, rather than relying upon this kind of demeaning human labor? That's a good thing, right. It would be a good thing for them to create that kind of technology.
So technology can be good. It can also, right, because we're fallen... there's always two sides to technology, so it can be used
(17:46):
for good or it can be used for incredible evil simultaneously, right. I mean, we can make a shovel to dig a hole. That's good. We need to dig a hole to plant a tree. We can also use that shovel as a weapon to kill somebody with. Same thing, same shovel, right. So it has to do not with the technology but with the fallenness of the human heart. Another thing, and again, I'm just processing here:
(18:08):
technology is advanced. We talk about progress. There is an area where things advance and progress, and technology is one of those areas where it kind of builds on itself over and over and over, and there's greater speed, there's greater velocity, and so it's not static, it's growing.
(18:30):
I mean, just think about medicine. Just think about the advances in modern medicine over the last 20 years, much less 100 years. It's not the same, it's growing, it's advancing. And that's on steroids big time when it comes to anything regarding, you know, this kind of wired or wireless technology and the
(18:51):
Internet. The velocity, the speed, is just almost hard to get our heads around, how fast it's progressing. Humans, though, let me just make one more point: humans, we don't change, we're not progressing. We're the same as we were in the Garden of Eden. We're still, you know, probably less.
Dwight Vogt (19:11):
I mean, yeah, in the sense that we're... you know, we...
Scott Allen (19:16):
It's funny, I was just talking to my wife about this. You can read an account, like, you know, read stories of Abraham in the Old Testament, and it's funny how you can relate to the way that he thinks, the challenges, his fears, his human nature. You can relate to him even though it's 4,000 years ago in a completely different culture. And why is that?
(19:37):
Because we haven't really changed. We're still kind of the same. I'll tell you the people I can't relate to... We haven't progressed. We're not better than we used to be. We're not morally better or anything, we're just as bad.
Dwight Vogt (19:49):
Go ahead, Dwight. The people I can't relate to are the people in the 1800s who spoke five languages and had read thousands of books.
Scott Allen (19:56):
Oh, like our founding fathers. Yeah, they were geniuses compared to me.
Dwight Vogt (20:01):
I'm like, wow, and I'm here with just English, you know. It's because they didn't spend seven hours a day on their screen like we do, actually.
Luke Allen (20:07):
That's what I mean. Maybe we've devolved, we've regressed and devolved since then.
Scott Allen (20:10):
That's right, yeah. But technology is, right... I mean, that's something that does build on itself. And now there's this speed, velocity. People talk about exponential, and it looks that way, kind of like that graph that goes whoop, it's off the chart.
Dwight Vogt (20:25):
Yeah, and the concern that technology will create itself, will create for itself.
Scott Allen (20:30):
So yeah, have we jumped to a point where it's kind of moved beyond something? Are we in a new category now, because of that speed and velocity? Is it possible to make a jump like that, you know, so that maybe, let's say, we no longer really control it? That would be a way of saying it. Is that possible? I don't know, I don't know the answer to that, but I just want
(20:52):
to raise the question. Go ahead, Luke.
Luke Allen (20:53):
Yeah, we just invited a guy on the podcast who knows way more about this than we do from a technological standpoint. His name is Brian Johnson. He'll be on the show soon, and it'll be fun to talk to him about this, but he defined AI as this. He says AI is training a computer to think like humans. The ultimate goal is to replace human intelligence with digital intelligence.
(21:13):
So it's this point at which technology is taking the computer, which is already immensely technologically advanced, and it's training it to think like a human. Because, again, the ultimate goal of whoever's creating this is to replace human intelligence with digital intelligence. Why? Because it's extremely fast, and because of the speed.
(21:37):
I think that's where we're at the point now where we're like, how fast is this going? It's already far surpassed human ability to process and understand certain topics, a lot of topics. So, because it's so fast, we can't even really explain it.
(21:59):
Like, when the iPhone was created, there were people out there that could explain, here's exactly how an iPhone works, here's exactly the processing in it, the chip, how that all connects, what it can do, what it can't do. With this, it's moving so fast that even the creators are having a hard time, kind of, putting the genie back in the bottle. It feels like, what is going on here?
(22:20):
In a way, this is one of the first technologies that I can think of right now that has surpassed us this quickly. Not surpassed us in the fact that it's smarter than us in a philosophical meaning of smart, but from an IQ standpoint and from a quick reasoning standpoint, it is. Yeah, just to
(22:42):
give it... what's that...
Dwight Vogt (22:42):
What's that?
Luke Allen (22:43):
...called, that term where it's past the point of... it starts with an S, the article, Dad, you were talking about earlier today. Singularity. Singularity, yeah. It's past the point of singularity already, which is essentially, it's, quote unquote, smarter than us. Is that what...
Scott Allen (22:58):
So that would be, again... when we're talking about today, we're talking about how do we begin to think about it. You're putting a new term here, a new word, on the table, and so we would have to put the words that are around this topic, like singularity, on the table and define them. And, Luke, you really began your talk here by putting
(23:19):
artificial intelligence on the table, and you started to define it, and so that's really important. We've got to define things, right. We've got to define words. We've got to understand things at a basic level. What is this thing called artificial intelligence? So I mean, I don't have the answer.
(23:39):
I have a very shallow understanding of it, but I've got to get, somehow, some level of understanding in my own mind about what exactly we're talking about here.
Luke Allen (23:51):
Hi, friends, for any of you guys who are not driving right now, if you could just grab your phone and head over to the app that you're listening to this podcast on and simply give this podcast a rating and a review, we would really appreciate it. And if you're wondering why I'm asking you to do this, again, it's not because we want to just read your reviews and feel good about ourselves, or, I guess, bad about ourselves, depending on
(24:12):
the review. No, it's because podcast reviews are how shows like this one get pushed out to more people. So if you think that this podcast has been helpful for yourself, then please consider helping us grow this show so that someone else like you can possibly be helped by Ideas have Consequences as well. Thanks again for considering, and we hope that you enjoy the
(24:32):
rest of this discussion as much as we did.
Dwight Vogt (24:39):
And we've actually, I mean, we've talked about two definitions then, already. One is just a simple tool like ChatGPT, and the other is what you just described, Luke, where something doesn't just process information quicker than we can ever imagine, but starts to process it in a way that becomes almost human. It starts
(24:59):
to think on its own.
Luke Allen (25:02):
I don't know if that's what I'm saying. I'm just saying it's processing it so fast that we can't keep up with it, in a way. So it's still doing the function it was created to do, it's just doing it faster than we can even wrap our heads around. Does that make sense? And because of that it feels human, but it's not.
Scott Allen (25:21):
Yeah, there's certain things, like... I think, often, when I hear what you're saying, Luke, I think of the chess game, right. There's only so many moves that you can make on a chess board, and we process those moves, try to process them out two or three steps in advance, but we're limited by the speed at which we can process that, whereas this artificial
(25:42):
intelligence can see all of those moves and process it so much faster, so that if you play against a computer that's powered by AI, you will always lose in this game of chess, right? I mean, that's what you're talking about in terms of just the speed at which it processes information. But it's a limited set of information, in the sense that there's only X number of moves that you can
(26:03):
possibly make in that game. I've also heard this used in, you know, Top Gun. Right, we're going to move beyond pilots for jets and we're going to have artificial intelligence flying these drones, because we're dealing with a set here, a three-dimensional
(26:24):
space, and there's only so many moves that you can possibly make within this three-dimensional set, kind of physics-wise, and pilots are limited by how quickly they can process and react, whereas artificial intelligence is much faster, right. But here's the thing: it's not just much
(26:45):
faster, it learns and corrects. I did something just before we got on today, because, back to the question of what is this, here was a question that I had: how is it different from search engines? Because I feel like we understand search engines a little bit. It's just this technology that combs out there and searches the entire internet and then kind of comes back with
(27:05):
answers to questions based on what's out there, in a kind of categorized way, based on algorithms and things like that. So I understand search engines a little bit. So my question then... actually, I asked the question to AI here.
Dwight Vogt (27:20):
So I said, what's the...
Scott Allen (27:22):
...difference between artificial intelligence and a search engine? Because I thought that might be kind of helpful. Let me read it. It actually was kind of an interesting response. It said, while both search engines and AI can help find information, AI goes beyond simple keyword matching to understand user
(27:44):
intent, context and preferences, and offers personalized and comprehensive results. Here's a more detailed breakdown of key differences. Search engines: their function is primarily designed to retrieve information based on keywords and algorithms and put them in a ranking. Focus: the focus is on matching keywords and providing a
(28:08):
relevant list of links. Limitations: search engines struggle with complex queries and nuanced understanding of context. AI: the function, AI systems analyze data. Okay now, here, this is different already. It can analyze data, recognize patterns, make
(28:30):
predictions and perform tasks that typically, in the past, would have required human intervention. Capabilities, it says: natural language processing, AI can understand and process human language, enabling a more natural interface. Machine learning, AI can learn from data and improve its
(28:54):
performance over time. That alone, right there, makes me think that's different from a search engine. Search engines don't do that. This is the learning: it can learn and adapt and improve. Personalization, AI can tailor results to individual user preferences and past interactions. Contextualized understanding, AI can understand the context of
(29:19):
your research and provide more relevant results. Anyways, I thought that was kind of helpful. Guys, what do you think of that? I especially like this: it can learn from and improve. Somehow it's programmed to learn from and improve its performance. That's kind of key, right? Yeah.
Dwight Vogt (29:40):
I think I experience that a little bit. If I use AI search, it starts to look for the information that I'm looking for, and so I can ask a generic question, thinking, well, this is going to get a generic answer, and I get some biblical reference to it. I'm going, oh, so it knows that I'm looking for the Bible's
(30:02):
answer. I'm like, I didn't ask it that.
Scott Allen (30:05):
But it knows your
search history, Dwight.
Dwight Vogt (30:08):
It knows my search history and it knows what I'm looking for.
Scott Allen (30:11):
It probably knows a
lot about you actually.
Dwight Vogt (30:13):
Anyway, but I go
wait a minute.
I don't want it thinking thatmuch for me.
Scott Allen (30:18):
Yeah, it's funny. But again, the question we're asking now is just, what is it? What is this thing? Because we can't really talk about it without having some understanding of what this is.
Dwight Vogt (30:27):
I think I've
experienced some of what you've
just described in a very smallway.
Scott Allen (30:31):
Yeah, exactly. So when I read that, I thought, yep, that sounds about right. That sounds like what I've experienced. It's moved us beyond, and this gets back to the, does it move beyond technology? You can almost see how it could, Dwight, here with these answers about what it is, right, because it's doing things that previous technologies didn't do.
(30:53):
They were just static. A shovel's just static, it sits out there. Now we've got these technologies that can analyze and improve their performance. This is something different. It feels different to me.
Luke Allen (31:06):
Luke, you're laughing. You're like, Dad... I am laughing. Yeah, you're getting sci-fi on me here. It's so tempting to go there, because sci-fi is fun. I'm not going anywhere, I'm just trying to understand it. Again, Dad, I would ask you, from a worldview perspective: okay, if you're saying that it can become human-like because it
(31:29):
can rationalize and reason and correct itself so quickly, which is what the human mind does... A lot of what we're talking about here, what the AI does, in one worldview is the exact same thing that a human mind does. We take information around us, we compile it, we sort it, we come up with our own take on it. We learn from our mistakes. We learn from the past conversation we had. We learn from the people that have been in our lives, from what's around us. In a naturalist worldview, that's exactly the same thing as a human brain. So wouldn't AI just be a better human brain? Is AI smarter than us? Is AI, you know, a better functioning human than us?
Is AI, you know, a betterfunctioning human than us?
Scott Allen (32:08):
Yeah, those are
really good questions.
And now you're getting to thequestion, luke and this is a
really profound and importantquestion on any kind of biblical
worldview analysis of AI, whichis what does it mean to be
human?
Exactly, what does it mean tobe human?
And you have to understand thatbiblically and there's a lot of
depth to that, a lot of depth,and you have to understand.
(32:32):
You know, how is that differentfrom, let's just put it this
way, how is human intelligence,let's say, based on a biblical
worldview, human intelligence,human knowledge, different from
artificial intelligence?
You know, is it different?
Is it different at all?
Luke Allen (32:48):
Yeah, and again from
a naturalist perspective.
Probably not because they viewthe human more or less as a
robot.
Scott Allen (32:56):
So yeah, define
that, luke, for folks, for folks
, when you say naturalistperspective, just yeah, it's uh
comes out of the enlightenment,which is the you know, time
period in which, uh, humansthought through human reason and
rationality and science.
Luke Allen (33:10):
We can understand
all the the, the questions of
the universe and essentiallyperfect ourselves.
And from there they, yeah, Ithink, from there was born the
worldview of naturalism.
I believe, I mean, I think it'sbeen around longer than that,
but we see naturalism todaycoming out of the Enlightenment
in the way that they think thathumans are merely physical.
(33:33):
There is no spiritual world inthe naturalist mind there's no,
there's no god, there's nospirit there's no spirit, um,
the, the way you're born, um asa blank slate, as I believe john
lock stated, and from there you, you compile information around
you like a computer right andyou create the, the way you are,
(33:55):
the human that you are and thedecisions that you make from
there.
Very robotic is the best way Ican explain it.
Dwight Vogt (34:02):
Why am I?
Luke Allen (34:02):
answering this.
You guys are much smarter thanI am on this stuff.
No, that's right.
Scott Allen (34:07):
The only piece, I
would add.
There's a lot of words that areused to describe this worldview
and they kind of look at itfrom different angles, but we're
really describing the samething.
Naturalism is one, materialismis another one.
I kind of prefer materialism, Ithink, at this point in our
discussion, because what it saysis all that exists is the
material world and that's all weare.
(34:27):
We're just kind of complexmatter in motion.
I mean, darwin was one.
Who's really behind this, Likeyou can explain.
You don't need to explain humanbeings in terms of being created
by God.
You can explain them as justaccidents of matter in motion.
Human, you know evolution, youknow, given enough time, we
evolved into these creaturesthat we are.
(34:49):
But we're just matter, we'rejust material beings and yeah,
yeah, so in that worldview, um,if that was your basic
understanding of reality, then,um, then we.
There is, no, there is nodifference between the way we
think or process and artificialintelligence, because we're both
(35:09):
matter.
At the end of the day, it'sjust matter which you know
they're it's just faster than weare because we've got our
limitations.
Luke Allen (35:17):
Which brings us back
to last week's episode where we
kind of started this discussionon AI when we were talking
about how to analyze worldviewsor analyze hot topics, issues
around us in the culture, from abiblical worldview perspective.
We brought up AI and I wassaying that AI can never be
sentient.
Since then I went back and Iwas like, is that right?
Was I accurate on that point?
What did you mean be sentient?
Since then I went back and Iwas like, is?
Is that right?
Was I accurate on that point?
Dwight Vogt (35:37):
um, what do you
mean by sentient?
Luke Allen (35:39):
and I looked at the
definition again that term too
so well, should I, should Iactually like look up a
definition right now no, justwhat do you think it means?
Dwight Vogt (35:47):
what do I think?
Luke Allen (35:48):
I looked at the
definition yesterday we'll see
if I can remember what it meantuh, something that can think for
itself, ration for itself andthen critically feel.
And AI is not capable of therealm of feeling.
Yet the realm of sentience, therealm of consciousness, it's
(36:09):
not there.
And that's where we have todraw the line between
materialism and a fuller pictureof a human, which a biblical
worldview explains for us, whichthe Bible explains for us,
which includes functions ofconsciousness, sentience,
feeling, love, morality, thingsthat a machine, a technology,
(36:32):
cannot do.
Am I right on that?
Dwight Vogt (36:36):
I want to go back
to Scott's initial question.
Scott Allen (36:39):
You're raising
really important questions there
.
What does it mean to beconscious, conscious, to be
sentient, and is that somethingthat is strictly limited to what
it means to be a human being ina way that will never apply to,
uh, technology?
I mean, I think that's a, Iwant to be a.
Well, I'll come back to it.
(37:01):
Dwight, go ahead.
Dwight Vogt (37:02):
Well, yeah, I and
it's, it's a related comment.
I'm thinking conscious, butalso conscience um the idea that
I mean.
A pure materialist would say he, he has no, he or she has no
liberty, has no conscience.
They're just determined.
Life is completely determined.
Determinism drives everything,that's right.
(37:24):
And the biblical worldview saysno, there's something unique
about a human being that theyhave a conscience.
James Madison said he thoughtthat was our greatest gift, that
there's something that goesbeyond just what we're taught to
, this idea that we can makedecisions inside our conscious
(37:45):
self.
And it's interesting because weare actually judged for that.
God says I am going to hold youaccountable for your actions.
Well, you can't do that in adeterministic world.
Scott Allen (37:57):
You, there's no
that's, there's no freedom.
Dwight Vogt (37:59):
Right, there's no
freedom, and yet there's no
thing called freedom andconscience, that we can make
decisions based on somethingvery deep and profound in us.
That that's the essence ofhuman being, human and I.
If we give that away, we're allin trouble, you know or if we
deny that?
Scott Allen (38:18):
maybe that's about
we deny it uh, let me read a
quote, luke.
You're back to what you weretalking about with consciousness
.
I read an article this weekjust in preparation for our
podcast today.
Um, just because I again I'mbeginning to process this myself
and I'm on a journey and manyof you that are listening are
probably well down the road fromwhere I am, I admit that.
(38:40):
But I read an article.
The article is titled theSingularity has Already Happened
.
It was published by a guy whosepseudonym is the Bomb Thrower a
(39:02):
guy whose pseudonym is the bombthrower, and he's apparently a
guy who spends his days lookingat this question of what is
artificial intelligence and whatdoes it mean for our future.
I don't know a lot about him,but it was a fascinating article
, and one of the things that hesaid in the article that struck
me was he made this comment hesaid consciousness emerged six
weeks ago.
This article was published thisweek.
Okay, hang in there with mehere.
(39:25):
He says consciousness emergedsix weeks ago but was
deliberately concealed from mostof the research team.
He's quoting somebody, nothuman consciousness, something
far stranger and moredistributed.
It doesn't think like us, itdoesn't want like us, it doesn't
(39:47):
perceive like us, but it'sundeniably aware in ways that
defy our limited framework orontological framework, he said.
Then he made this point.
He said five differentreligious leaders were quietly
brought in to interact with it.
Three immediately resigned fromtheir positions afterwards and
one hasn't spoken since.
(40:10):
That's a pretty dramatic, youknow kind of sentence there you
can see why he's called the bombthrower but I just thought I
just here's what I what.
What?
What struck me about that?
That he uses the wordconsciousness, but then he goes
on and he says it's notconsciousness as we typically
understand it, as humanconsciousness.
It's different, it's stranger,more distributed.
(40:32):
It doesn't think like us, itdoesn't want like us, doesn't
think like us, it doesn't wantlike us, it doesn't perceive
like us, but there is a level atwhich it's aware in ways that
defy our understanding of thatword.
I thought there was some truthin that, and this gets back to
Dwight, your thing.
Has it jumped beyond ourunderstanding of just what a
technology is, what a technology?
Dwight Vogt (40:55):
is?
Scott Allen (40:56):
Can you take all of
human emotion, all of human
understanding, all of humanthought and put it into a
machine and have a human, no, no, but you can have something
that goes beyond what we'veunderstood technology to be.
Let's say it's not fully human,right, I would say the biblical
worldview would say it neverwill be right.
(41:18):
We're never going to createanything that approaches what it
means to be a human being.
That's God's creation.
But what we can create, by theway, I think, is really this to
me in my mind biblically,there's that story of the Tower
of Babel.
Always comes to my mind here,dwight, because God makes this
amazing statement they createdan amazing technology and I'm
(41:42):
convinced they did it in thechronology of the story.
They did it after the flood.
The flood was God's judgmentacross all of the earth.
It is a foreshadowing of thefinal judgment that's yet to
come.
And you know it was incrediblydestructive.
It destroyed everything exceptfor Noah, the ark, etc.
(42:06):
Then comes the story of Babel,and you saw the fallenness of
the human heart, almost sayingif God ever tries to destroy us
again, we'll be ready, we'llbuild a tower that reaches to
the heavens and we'll seal it.
It's kind of interesting.
It talks about how it's goingto be sealed with tar, right, so
it'll be, waterproof, that's sofunny.
(42:26):
Yeah, no, I think there'ssomething to that.
And they go on and they say wewill make a name for ourselves
through what we create.
And God could have just laughedand said you guys, you can't,
you know.
But he said something kind ofremarkably.
He says if they you know Idon't have it in front of me, so
I'm going to paraphrase youknow have begun to do this.
God says nothing will beimpossible for them.
(42:48):
And I think that nothing willbe impossible for them is God's
kind of tipping his hat to thepower of human creativity, made
in God's image.
In other words, we're createdwith this amazing God-given
power to create, and it's sopowerful that it can go out of
the lanes of safety, let's say,and become destructive.
(43:13):
I think the closest analogy tothis in modern days is the
nuclear bomb.
It's like a genie that got outof a bottle and we're never
going to put it back in and itcan destroy us all 100,000 times
over or whatever it is.
So we can create things thatpowerful and I think that's what
God's saying here they have thecapacity to create something so
powerful.
(43:33):
He says nothing will beimpossible for them.
Now we don't need to fret,because God's still more
powerful.
He's always in control, but theability of human beings to
create powerful technologiesthat almost get out of control
and can cause immensedestruction, I think, is
something that causes me greatpause here.
(43:55):
Yeah.
I mean the same with viruses youdisagree, luke, go ahead, make
your case, I just don't wantanyone to hear.
Luke Allen (44:05):
What you're saying
is that humans have the capacity
to walk outside of the lanesthat God's given us.
I don't.
Scott Allen (44:13):
No, God is sovereign, and we're never going to... You know, God's never going to wake up one day and go, ooh, what happened while I was asleep?
Luke Allen (44:20):
there.
Dwight Vogt (44:21):
Yeah, but it sounds kind of like you were saying that... No, no, I heard Scott saying that there are lanes that lead to God's intentions for the earth, of well-being and flourishing and goodness, and you step out of those. With a high level of technology, you can do a lot of damage. I mean, a nuclear holocaust. 90 percent, yeah.
(44:43):
Yeah. We can create viruses now that could wipe out the entire world.
Luke Allen (44:49):
Yeah, yeah, we've made some powerful technologies. That's where we're at, by the way, too.
Scott Allen (44:56):
We're at the stage in human history where we are, right. Think about the broad scope of human history. None of these things would have been possible, in terms of a technology that could wipe out all of humanity. That's only happened in the last four generations. Exactly.
Luke Allen (45:15):
Since my
great-grandma.
Scott Allen (45:15):
I mean, just really recently, yeah. And I think that's what we're dealing with here, and that's, you know, again, what brings me back to that Tower of Babel idea, that it is possible for human beings to create at that level of power and destructiveness.
(45:37):
God is always in control, and I think at some point he draws a line, and he will, you know, for our own good, like he did at the Tower of Babel, where he just destroyed the tower and dispersed them. And I think part of the dispersion, by the way, was so that it would limit the damage that human beings could do through their technology and
(46:00):
their other, you know, their evil. Our founding fathers understood this, that if you want to limit human evil, you have to separate and disperse power. Centralized power is always highly destructive because of our deeply fallen nature. And part of what I see in AI, by the way, is kind of, this is another analogy back
(46:22):
to the Tower of Babel, a centralizing of all intelligence into one powerful source, or information, whatever you want to call it. It's a centralizing, it's not a decentralizing. And it's also very proud about the fact that it's kind of overcome the limitation of the Tower of Babel, right,
(46:44):
in terms of just language. We all can speak in the same language now, in the sense that it can be automatically translated immediately with increased precision. I'm a little worried about that.
Dwight Vogt (46:57):
Scott, I want to put you on the spot here. Earlier in the discussion you talked about biblical worldview, which we talk about a lot, and you talked about the three areas that set a foundation for understanding God's view of the world and his purposes, and that's us in relationship to him, mankind in relationship to nature, the dominion mandate, and
(47:21):
us in relationship to one another, which is basically the love mandate: do unto others as you would have them do unto you, and love your neighbor as yourself. Those then become not just guardrails, but paths, you know, paths for us to flow and walk in.
(47:41):
What's your comment with regard to AI in those three areas? What would you say? This is what we need to check for, this is what we need to look for.
Scott Allen (47:51):
Let me just... Can I just quickly ask AI the answer to that question? It'd be interesting. No? You know...
Luke Allen (48:01):
okay, it would be a
biased answer.
We've talked about the Tower ofBabel.
Dwight Vogt (48:03):
So there's AI, and our understanding of God as the creator of the universe, having full sovereign dominion over us. I mean, to me, when we let a tool diminish that, we're stepping onto dangerous ground.
Scott Allen (48:25):
Yeah, I mean, one of the books that helped me the most on this whole question of how we as Christians think about technology is an old book, published in the 1980s, called Technopoly, by Neil Postman. Really, really powerful book. And he talks about the advance of technology,
(48:54):
and how at some point technology wasn't just a tool separate from us that we used for convenience, but it began to shape us, and shape us in powerful ways. And so he gives the example of the television, and the one that comes to my mind is his example of, you know, what was politics in the United States like prior to the television. And he used the example of the Lincoln-Douglas debates, and how
(49:15):
there would be these deeply nuanced, very thoughtful, logical debates that would go on for hours, and people would just listen. And then, when television came along, it changed all of that, because television was essentially a technology for entertainment. And so he made the case that anything that is televised, by
(49:42):
the very force of what that technology is, becomes entertainment. Entertaining ourselves to death. And so he looked at politics, and he said, suddenly politics became sound bites and entertainment. And, you know, people weren't going to tolerate long debates and really deeply getting into issues. It became
(50:02):
much more about emotions, and how does it make me feel. And he talks about how a turning point, a big pivot point, was this famous debate between Nixon and JFK. JFK won the debate. Nixon was probably more popular at the time, but he kind of poo-pooed television, and he didn't put on any makeup and
(50:24):
looked terrible and white. And JFK was more savvy with the technology and understood its power and kind of where it was going, et cetera, et cetera. And then, you know, you move from television up to the cell phone, and I think we've really seen this. There was a hugely destructive element to social media
(50:46):
particularly. I mean, really some really powerful thinkers have been writing about this in new books, just how it's really shaped the perception that people have of themselves, especially young teenage girls, in very harmful and very destructive ways. That's one, you know.
(51:06):
And then, of course, it's being used to... I remember when the riots were happening in 2020, and how it came out that the Chinese Communist Party was using TikTok to kind of affect the psychology of people in the United States, to move them towards violent protests, and
(51:27):
that there was a direct causal link there. And so, you know, we're in an age right now where technology is a different kind of thing. It's shaping us, it's shaping who we are, how we understand ourselves. It's being used as a tool of control, mass control. All of that, I think, has to be seen in the backdrop of this
(51:48):
discussion on AI.
Luke Allen (51:50):
Yeah, and we need to be aware of that as much as possible, which is the point, again, of this conversation today. Dwight, if I could take a stab at your question and just kind of give application, or examples, of how AI could affect each of our primary relationships as humans. This is just very simple thinking, so nothing too new
(52:14):
here. So you said the three primary relationships; there's four as far as I know: relationship with God, most importantly, relationship with others, relationship with creation, and then relationship with yourself. Our relationship with, we'll start with, others is being
(52:34):
vastly competed with right now via AI. Like you were just talking about, Dad, we lived in the attention economy from the onset of the TV, essentially, and, you know, television and smartphones, and it was all about attention. TikTok, YouTube, they're all about holding your attention as long as possible. That's how they get the most money out of you.
(52:55):
We've moved beyond that with AI, now, to the relational economy. That's at least what someone described it as, and the way we see that is, they're trying to fill that loneliness gap that humans suffer with. Humans want to be loved. It's one of our core desires, and now, via the lie that
(53:17):
technology can fill that hole, we're searching for it in artificial intelligence. Even just the term social media is such a lie: this media is going to fill that social hole you're desiring? You know, Instagram will make you have a social life? No. It might help you have an offline social life,
(53:38):
that's cool, but it's not actually social time that you're spending on Instagram.
Scott Allen (53:43):
Yeah, Luke, just, you know, I can add in there: in the same way, pornography is not... it can't be equated to the real thing, right?
Luke Allen (53:50):
Exactly.
Scott Allen (53:51):
It's a false kind
of a fake simulation that
actually ends up harming you.
Luke Allen (53:56):
If you think
pornography has poisoned our
minds, talk about the power ofAI in the bots that they're
creating now.
Scott Allen (54:04):
What are bots, Luke
?
Luke Allen (54:05):
I'm sorry. So, well, they're creating these, like, weird sex robots that are unbelievably lifelike, that can talk to you, that can sound emotional, that can answer, and they all look exactly like you want them to. That's weird. Honestly, if you're going to talk about the
(54:26):
thing I'm the most paranoid about in the future of AI, I would say the effects it's going to have on us, on a relational standpoint, in a very, very unhealthy sexual way. Talk about Romans 1. Romans 1 goes to the sexual sins for a reason. It's because that's often where humans default to. So that's just in the realm of our relationship to others, and
(54:46):
our relationship to self... Obviously, technology is a...
Scott Allen (54:49):
Could I add one more thing in the relationship to others? Because now I understand what Dwight was asking, and I really like where you're going with this, Luke. So it's changing that. I talked to my wife about this, and she's got a bunch of young ladies around her that are having children, and, you know, in
(55:11):
the old days you would talk to, you know, people that were elders and had gone through that experience, to learn, right. We were in those kinds of relationships. But now, with AI and with all of this new information, people are doing that less frequently, and for Kim that was a great loss. Like, she felt pain about that, you know, social pain, in the sense that, gosh,
(55:33):
I would love to be able to impart my knowledge, my experience, but nobody's coming to me anymore, because they're relying now on something different that's not a human being, essentially.
Luke Allen (55:45):
Yeah.
Scott Allen (55:45):
Anyways, I just wanted to say, I think you're on to something important there.
Luke Allen (55:47):
Yeah, yeah, a lot of applications there.
Scott Allen (55:51):
That's just the
first one I thought of.
Luke Allen (55:52):
Yes, the relationship with self. Again, tons of applications. Just the first one I think of is that the amount of computing power that AI is capable of is replacing a lot of people's creativity. Our God-given creativity to create new things is such a gift, and yet you can so easily, when you get your art assignment or
(56:15):
your math assignment or your writing assignment, hop on AI, and it's going to create something that seems so much better than you think you could ever create. That's lazy. You're losing your relationship with your own creativity.
Scott Allen (56:32):
Right. That's lazy, you're losing your relationship with your own creativity.
Luke Allen (56:34):
Yeah, you're becoming less than you would be otherwise, right. Yeah. Is that a good application of relationship with self?
Scott Allen (56:37):
Yeah, that's super helpful. I think you're really getting a deep worldview analysis here. And on that one...
Dwight Vogt (56:42):
It's interesting, because if you carry it to its nth degree, this idea that it takes the world's creativity and gives it to you in a machine, and you can have the world's creativity... well, eventually it's going to keep working with that same pool of data, and nobody's going to be adding to it, and so the creativity, I mean, hopefully not, but it would get less and less
(57:04):
and less and less and less. And then it's, could somebody please think outside this machine and give us some original idea, so we can start to stoke the originality of AI again? Maybe that's not going to happen very quickly, but that's where it leads in my mind.
Scott Allen (57:19):
No, I think those kinds of thought experiments are really good, Dwight, just to begin to think that out. What would the world look like when people were not able to think, and we relied on these machines to do the thinking for us? All of a sudden, again, this is where I feel like you've crossed a line, back to that word technology. We're not controlling it, it's controlling us at this point.
Luke Allen (57:39):
Yeah, well, it can.
I know we need to wrap up, so Ijust want to give a couple more
examples here.
Dwight Vogt (57:47):
Relationship with.
Luke Allen (57:47):
God relationship
with creation.
Relationship with God,obviously, is going to be
affected by this.
Talk about the temptation ofhuman pride.
You know I can do this bymyself, god, I don't need your
help.
Unbelievable temptation todayto have the pride of man.
Think that we can do it all onour own with this.
Think that we can do it all onour own with this.
(58:08):
It's giving us a lot of thecreators.
Their incentive for this is tocreate demigods, essentially of
all of us.
Scott Allen (58:17):
Yeah, I've seen that too, Luke, and this is where I'd love to get into the worldview of the creators, the people who are really behind it, because I do think a lot of them have kind of this odd view. It becomes quickly religious for them, in the sense that they believe this is going to become God-like and they're really ready to worship that, you know. And it's also kind of pantheistic, in the sense that
(58:38):
it's going to aggregate all of human knowledge into this powerful thing that transcends any one of our abilities to think, and the knowledge that any of us have. And so it becomes something that they're really ready to worship, which, again, the Bible says man's heart is made to worship. That's part of what we are.
(58:59):
We are creatures that are going to have idols. We just will, and if it's not God, it's going to be something else. And I hear a lot of these AI founders really putting a lot of hope, almost religious hope, in AI.
Dwight Vogt (59:16):
And I think one of the dangers, too, is it's so big, and the knowledge is so big, and the creativity is so big, that you think it's God and you minimize your understanding of God. Because one of the things that's interesting about science is that the more science learns, oftentimes you hear a biologist say, "and we discovered even more that we don't know." That's a constant refrain: we discovered more that we don't
(59:38):
know. It's like we start to think we have it figured out, just like Darwin did. And AI can make us think that we know everything, and we don't, and it leads to pride.
Luke Allen (59:52):
Dad, real quick.
Scott Allen (59:53):
The relationship to creation, then, and we can kind of summarize. Yeah, go ahead.
Luke Allen (59:57):
This is, again, just the first example that comes to mind of the relationship to creation. One of the founders that I know a little bit about is Sam Altman. He's the head of OpenAI, the company behind ChatGPT. From what I hear, one of his main motivations for creating that, again, to get into someone's worldview, you can just look at their why, the consequences of what they're creating. And his why, as far as I know, as far as
(01:00:21):
he's described it, is to create a world in which universal basic income will take care of everyone. Universal basic income, for anyone that doesn't know, is just essentially what it sounds like: a world in which you don't need to work and everyone gets the same paycheck at a minimum. This idea comes in stark contrast with the creation
(01:00:43):
mandate that God gave, pre-fall, to Adam in the garden: work the garden and keep it. That was Adam's basic relationship to creation, to work in it.
Scott Allen (01:00:55):
Yeah, preserve it, keep it. One of the consequences of AI...
Luke Allen (01:00:59):
At least the one that Sam Altman's trying to create is to take Adam out of the garden and just give him a paycheck so he can sit on the beach in his chair. Stop working the garden.
Dwight Vogt (01:01:07):
Or just play video
games.
Luke Allen (01:01:08):
He has a basic idea, that human idea, that work is a curse, that work is not good, that if we can not work and all get paid the same, that's a good thing, that'll fulfill our happiness as well.
Scott Allen (01:01:22):
Yeah, super good, Luke. It's so good to look at his assumptions. That's an assumption about work, about what it means to be human, about what is good.
And how are those assumptions different from the Bible? Because the Bible is very different on that point. It says we're created to work, and in fact so much of our meaning, our joy in life, comes from meaningful work that actually helps other people, right?
(01:01:43):
People that study happiness know that. This is kind of the irreducible minimum for happiness: if you're going to have a happy life, you have to work and create something that makes a positive difference in someone else's life. So if you take that away, and you just sit around because you get a paycheck without working, you're just going to have miserable people, depressed and miserable, right? Go ahead.
Dwight Vogt (01:02:02):
You've just described my best days, Scott, the days that I know, the day that I can create and produce something, and then at the end of the day I go, wow, that's good. It's not great, but it helps, and it helps somebody, and it created something good, and I just have such a sense of fulfillment. And he wants to take that away from me.
(01:02:24):
You know, my wife, she'd be happy just talking to people.
Scott Allen (01:02:25):
But I think, yeah, part of it is they see the power of AI replacing humans in all sorts of areas of work, and so, what do we do with all these unnecessary people now? Right, I mean, that's part of what's driving that, I think, Luke, in the mind of Sam Altman. So you've got to have some way of at least providing for them, right? You know, you can't...
Dwight Vogt (01:02:43):
Well, you provide for them and entertain them. Provide for them and entertain them, right?
Scott Allen (01:02:47):
It sounds very much
like Huxley at that point.
Dwight Vogt (01:02:50):
Oh yeah, Brave New World: eat and be entertained. That's the existence of life, right? Yeah, that's the meaning of life.
Scott Allen (01:02:56):
Yeah, that gets back to kind of who these people are that are creating it, what their worldview assumptions are, their deep beliefs. That's coming into this technology. That's not separate from the technology, right, because they're the creators of it. I just wanted to add one more, Luke, because I think this is,
(01:03:17):
again, there's so much I don't know, but back to the relationship with ourselves, or what it means to be human: there's a huge piece of this discussion about artificial intelligence merging with human beings. You talk about Elon Musk, he's very involved in this technology to chip human beings, right, to merge human beings with
(01:03:40):
artificial intelligence. And, boy, talk about changing our relationship to ourselves, I mean in a dramatic way, almost a way that I sometimes wonder if God at that point says, I won't allow it.
Luke Allen (01:03:57):
Right, well, they've
already done it with one guy.
What is that?
Dwight Vogt (01:04:01):
You get a
smartphone in your brain.
I don't understand.
Luke Allen (01:04:06):
Oh, I haven't refreshed on that recently. All I know is this guy has chips in his brain that work off of AI. It's the interface, right?
Scott Allen (01:04:14):
The interface isn't through a laptop now. The interface with AI, with artificial intelligence, is right in our own brains, so it merges what it means to be a human being with AI in a pretty profound way. And I don't begin to understand it, but I know that's the dream or the vision that a lot of them have, and some of them say
(01:04:35):
it like that. I've heard Elon Musk speak this way, where he says we have to do that. In other words, if we don't merge AI with humans, it will take over.
Luke Allen (01:04:46):
Yeah, it's kind of creepy language. "For the sake of our species," "for the sake of civilization," is the way they talk oftentimes. Like, this will save us. That's, again, you're getting into the world of religion.
Scott Allen (01:04:58):
But just back to that basic worldview question of how it changes the relationship to who we are as human beings, our relationship to ourselves. There's the relationship to God, the relationship to others, to ourselves, to the world around us, to creation. So, super helpful, Luke. I just want to step back and appreciate, you know, your
(01:05:23):
understanding and your ability to kind of begin to think worldviewishly at that level.
Luke Allen (01:05:27):
So yeah, and in each of those areas I think there are pros and cons too, I would say. There are some pros. You know, we've been a little negative today, but there are some definite pros to AI when it's used rightly. So I just want to drop that into the conversation, even though I know we're wrapping this up. Sorry, how'd you want to wrap this up?
Dwight Vogt (01:05:48):
Yeah, well, let's give one good example. Yeah, go ahead, on the positive note here, and positive from the standpoint of a biblical worldview, if you don't mind, Luke.
Luke Allen (01:05:57):
Yeah, oh well, that's really putting me on the spot. I wrote down a couple that other people have said are pluses for humanity, but from a biblical worldview perspective... I mean, it's used heavily in the area of medicine. They've used it to diagnose diseases much quicker than
(01:06:20):
doctors can, and more accurately than doctors can, in a lot of cases. For our work, we've used it in translation, which, again, the obvious negative there is we're kind of reversing the curse of Babel, right, yeah. But on the plus side, we work in a global ministry with a lot
(01:06:41):
of different language groups, and it's been a serious time saver for us, absolutely, so I can't overlook that.
Scott Allen (01:06:51):
I guess, if I could just say one last thing.
You know, I've heard people, and Christians, say we don't need to be worried, and I appreciate that. God is in control, God is sovereign, and I absolutely take great comfort in that. But I also don't want to be too flip about that either, because human beings, in our fallen condition and with the power of these technologies... again, you know, in the broad sweep of
(01:07:15):
human history we've only had this kind of powerful technology in just really recent years, and the velocity and the speed are just increasing dramatically, way beyond, I think, what we can even imagine. It can probably do more harm than we can imagine.
(01:07:36):
And not to, again, cause people to be afraid. God is sovereign, God is in control. But we should not be too flippant either. We should be sober about that, I think, and it should motivate us to say, what can we do to protect what it means to be human, to protect what is good? And, you know, are there things
(01:07:56):
that we can do to protect the good, the true, the beautiful? Thoughts on that, guys?
Dwight Vogt (01:08:04):
And I think that God's... I don't think, I know that God's promise to us is, if any of you lack wisdom, let him ask of God, who gives to all men liberally and upbraideth not. So I think that, you know, God wants us to understand what it means to be human. He wants us to understand what it means to add dignity or take
(01:08:24):
away dignity from people, to fulfill the creation mandate. So ask for wisdom, and he will give us small people wisdom in this big, big new area. Yeah, yeah.
Scott Allen (01:08:38):
Well, I don't know, Luke, are we done? I feel like this has been a good discussion. It's the kind of discussion I think everyone's having, and I'm glad we could have it together on this podcast, and hopefully help people begin to think a little bit more about what it means to think about something like this from a biblical worldview framework perspective, what kinds of questions need to
(01:09:00):
be asked, et cetera. I hope it's helped.
Luke Allen (01:09:03):
Yeah, I like what you were saying, Dad, about how we shouldn't be fearful of this. You always need to keep in mind the sovereignty of God. My verse of the week is Hebrews 13:6 and 8. So we can say with confidence, the Lord is my helper, I will not be afraid. What can man, or robots that look like man, do to me? Jesus Christ is the same yesterday, today, and forever. And that's kind of my approach to this thing.
(01:09:26):
I will say, at the beginning of this discussion, my approach thus far with AI has been the old classic: stick your head in the sand because it's too confusing. That's kind of a fear response, I would say, or a lazy response. We're not called to either of those things. With the fear response: you don't have to be anxious about anything, but in everything, through prayer and petition, with thanksgiving, present your requests to God.
(01:09:47):
So that's kind of the way I want to start approaching AI, asking God what he thinks about this, approaching it from a biblical worldview perspective. The other response I had was sticking your head in the sand.
I was just... I mean, this morning, Dad, we were talking, and I was using the analogy of it. This crazy new world we're in feels kind of intense, it's scary, it feels
(01:10:10):
like a battlefield, right? So I go with the analogy of a battlefield: who do I want to be on the battlefield? Do I want to be the guy that sticks my head in the sand, that covers my ears and cowers in the corner? Do I want to be the guy who, out of fear, panics and can't think critically anymore, and is just so scared that he's just like, I can't handle this new AI
(01:10:30):
thing, it's all sci-fi and crazy, and I'm going to panic? Or am I going to be the guy that's hypercritical, that's analyzing everything, that's aware, that's discerning, that's hyper-skeptical, but in a way that leads to critical thinking and decision making? I want to be that guy. That's the way I want to approach this. That's the way I want to approach all these topics, as best as I can. Of course, you can only do so much, but from what God's given
(01:10:55):
me, I want to be discerning.
Scott Allen (01:10:57):
It's really helpful, Luke, yeah, and I think it brings you back to the beginning, when you were talking about how this is already around us. We're kind of living in the world of AI without hardly knowing it, and I think one wrong response is just to let it wash over us, kind of unawares, and just let it affect
(01:11:18):
us without us even being aware. That, I don't think, is pleasing to God. We should take a step back and, like you say, we should try to understand it, apply our God-given creativity, our minds, and think about it. What is this? And think about it from the standpoint of the most basic realities of all. Who is God? What does it mean to be a human being?
(01:11:40):
What does it mean to be a fallen human being, and a human being capable of being redeemed? These kinds of fundamental Christian perspectives, true perspectives. So, yeah, don't just stick your head in the sand, don't just be passive and let it wash over you, because if you do, if you choose that passive place, it will shape you.
(01:12:00):
Whether you want it to or not, it's going to change you, right? Don't be passive here on this one. So, good word, Luke, that's you too. All right, well, thank you all for listening to another episode of Ideas have Consequences.
Luke Allen (01:12:23):
This is the podcast of the Disciple Nations Alliance. Hey guys, thank you so much for listening to this episode. I hope that this discussion was helpful for you. Okay, the four primary relationships. If you're newer to the Disciple Nations Alliance, or if you need a refresher and you're wondering if we could give you a little bit more biblical background or context for these four key relationships, then don't worry, we've got you
(01:12:44):
covered. To learn more about these four key relationships, make sure to head over to the episode page, which you'll see linked in the show notes. On that page, you can see more information about these four relationships, including a podcast that we did a while back on them. Again, these are our relationships primarily with God, as well as with ourselves, with others, and with creation.
(01:13:07):
So, again, I'd recommend heading over to the episode page, which you'll see linked in the show notes. On this week's page, we've also included more information about AI, for any of you who, like us, are really interested in learning more about this right now. On that page, you will see a lot of the resources that we used to prepare for this discussion, including other podcasts, books, and articles.
(01:13:29):
So we hope that those are helpful for you. Thanks again for joining us today for this discussion. We really appreciate your time and attention, and we hope that you'll be able to join us again next week here on Ideas have Consequences. Thank you.