Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Two things are happening. One, we've been pitched this idea that chat is some infinitely wise, you know, Cal, you mentioned that, like this pitch of multiple PhDs. That's the way they're presenting it to us, and chat is an incredibly powerful research tool, but it is not infallible.
(00:20):
And the problem is, when we approach something like this, we think it has the authority, and then what we do, what you're alluding to here, Easy, is that we are handing over critical thinking skills to chat, and I actually think we've been doing this for years. Before, when you wanted to do research on something, you'd buy a book, and the beautiful thing about
(00:41):
reading a book is that it's conversational. You're agreeing with the author, you're disagreeing with the author, you're making notes, you're spending time in the idea. We went from that to Googling, to watching five-minute YouTube videos to gather information, to now going straight to chat, and so essentially, what we're doing...
Christianity, through all of history, has always been a
(01:02):
thinking man's religion, and the danger here is that we're handing over a God-given ability to think critically. We're handing that over to chat. We're basically giving way to allowing it to do the critical thinking for us and, as Cal was alluding, especially for that next generation, where that's all they're doing.
(01:24):
It's a danger because we're not going to have the ability to think critically about the lies and the worldviews happening around us.
Speaker 2 (01:33):
Well, that's amazing.
Let me just see what Chat thinks about that.
Speaker 3 (01:43):
I've told you guys about the dilemma that I face now in my home, constantly thinking with joy and excitement, Oscar, that my wife is speaking to me, but no, she's talking to her phone. You guys deal with that? Like the whole new phenomenon of, like, text-to-speech-to-text.
Speaker 2 (02:00):
Yes, we get that now,
so I'm answering her.
Speaker 3 (02:02):
Oh, yeah, yeah. No, I'm talking to my phone. Okay, but there are times when I know she's talking to her phone and not thinking she's talking to me, as happened the other day when I kept hearing koinonia, koinonia, koinonia, koinonia.
Speaker 4 (02:20):
I didn't quite get that word down, huh? No, no.
Speaker 2 (02:22):
What are you trying
to say?
Yeah, koinonia.
No, it's Koinonia, no.
Speaker 3 (02:29):
Ray, why do you live your life this way? She's been my wife for almost 30 years. There are still words I don't understand.
Speaker 2 (02:34):
She says that's because you're a foreigner. It's Koinonia.
Speaker 3 (02:38):
Koinonia.
Speaker 2 (02:39):
No.
Speaker 3 (02:45):
Okay, Ray, I got a
test.
Speaker 2 (02:46):
But let's show you
guys how ridiculous New
Zealanders are.
Speaker 3 (02:48):
Okay, Ray, pronounce
this Albino.
Speaker 2 (02:48):
No, no, I am not
falling into your trap.
Say it Ray.
Speaker 3 (02:55):
I've been here for... Albino, albino, albino.
Speaker 4 (02:57):
Who's?
Speaker 2 (02:57):
Albino.
Speaker 3 (02:59):
It's a guy, Mr and
Mrs Bino.
Okay, here we go.
Here's another one.
Oh boy, Okay.
Speaker 2 (03:05):
Say it Ray, Isaiah.
Speaker 3 (03:08):
Isaiah.
Speaker 2 (03:11):
That's because one
eye is higher than the other,
all right.
Here we go.
Ray Aluminium.
No, you're taking me back 35 years.
Speaker 4 (03:21):
Was that legitimate?
Yes, Aluminium Ray.
Speaker 2 (03:24):
Aluminium. But it's hard for me to do because I'm now an American, dude.
Speaker 3 (03:27):
Like totally. No, Ray, you're a terrible person. Okay, Philemon. Philemon.
Speaker 2 (03:35):
No, Philemon, no. All right, here's one of my favorites, here's one of my favorites: analytical.
Speaker 3 (03:42):
What? What? You did it right for the first time. What am I supposed to say? Tell me. You say analytical. Yeah, probably he's forgetting. Kirk harassed you about that for years. Yes, okay, that's analytical.
Speaker 2 (03:59):
Okay, and here's a new one for me that I heard Rachel say the other day: compost. What? Compost. Compost. Is that legitimate? Yes, compost. I got a letter from the Reader's Digest about compost once. I canceled my subscription.
Speaker 3 (04:11):
That's right.
Compost albino.
Speaker 2 (04:14):
Yeah, and in the last days shall come mockers, walking after their own lusts.
Speaker 3 (04:17):
Seriously, Ray you
have problems.
Speaker 2 (04:19):
No, I've changed.
Speaker 3 (04:20):
Have you? I'm a new creature. So it was worse back in the day.
Speaker 2 (04:23):
Well, no one laughed at you when you said those words in New Zealand, because that's how we talk, and Australia does the same thing, and probably England.
Speaker 4 (04:31):
Yeah, yeah,
definitely England.
What about?
Speaker 2 (04:33):
Canada.
We'll find out a little later.
Yeah, we will find out.
Hang on Canadia.
Speaker 3 (04:36):
Canadia, Canadia, Canadia. I remember the first time I heard albino, I seriously probably laughed for a minute straight. Hey, albino.
Speaker 2 (04:45):
Albino. So how has your marriage lasted with such a woman?
Speaker 3 (04:48):
That's what I'm trying to figure out. Seriously, there are still times, like, Rachel will say something and I have no idea.
Speaker 2 (04:54):
For those who don't know, you married my daughter. Oh yeah, it's hard for me to say.
Speaker 1 (04:59):
What's the story of her calling Home Depot? What was the thing she was asking? That's one of my favorites.
Speaker 3 (05:04):
Yeah, she called Home Depot or, I think, Walmart, and she calls, and she goes, she's looking for chair pads. So she calls: yeah, chair pads, please, chair pads, the chair pads department. They go, sure, hold on one minute, we'll transfer you. Gardening: you're looking for chia pets. Really? Chia pets?
(05:27):
Oh, my darling. Four foot ten, beautiful wife. Yes. All right, friends, enough of this. Oh, welcome back, guys. It's been a while.
Speaker 2 (05:39):
It's been a couple weeks. It's been three weeks. Mark, you were in...
Speaker 3 (05:42):
Ecuador.
Speaker 2 (05:44):
Ecuador, Tennessee, it was in...
Speaker 3 (05:45):
Ecuador, Ecuador, Tennessee. It was in Ecuador, Ecuador. So you were first in Tennessee. You're speaking for our good friend Johnny Artavanis, who we've had here on the podcast.
Speaker 4 (05:54):
How was that? Oh, tremendous. 1,500 people in the last year have joined the church. The church is just absolutely humongous.
Speaker 3 (06:02):
The teaching's
phenomenal.
Speaker 4 (06:05):
So what he would like to do is he'd like to have new converts come in, because that brings fresh breath into the church, and right now there are a lot of church transplants coming in because there's such solid teaching.
Speaker 3 (06:16):
Bad breath, bad breath. Johnny is such a gifted preacher, he really is, and what's amazing is he didn't want to be. His dad is a pastor, obviously. You've spoken at his dad's church too, Mark. And Johnny is like, no, it's not for me. And then, boom. I love it when the Lord does that. Yeah. And then Ecuador.
Speaker 4 (06:37):
Ecuador was absolutely amazing. Went with Brother Luis, and meeting people that are just so on fire for the Lord, and I hear you say it from the pulpit when you travel. It's just like this one accord, to see that the Lord is alive and well here inside this area. And there's a language barrier, obviously, but we are tighter
(07:01):
and closer than my own blood relatives that are not Christians. Immediate connection because of what Christ has done. I love that, and so I absolutely loved it, and so they want to bring Oscar next year.
Speaker 2 (07:12):
Oh great.
Speaker 3 (07:12):
That's great. Yeah, that'll be all three of us. Nothing like actually being there.
Speaker 2 (07:17):
Yeah, it's really
great.
Speaker 3 (07:19):
Tell us about that. No, Ecuador, I went as well, and it's a great, great place. It's just, again, such a blessing to see the Lord taking the message throughout Latin America. I mean, it's really spreading and growing there. So yeah. And then, Oscar, you were deathly sick.
Speaker 1 (07:35):
All right, friends.
Sorry. Oscar, you better, man? I am.
Speaker 4 (07:48):
I still have a lingering cough, but yeah, I went to Minnesota on a family vacation. Some family friends of ours invited me out, and everybody in the house, at this lake house, got sick.
Speaker 3 (07:51):
We all came home sick. Wait, wait, time out, back up. And it's all about...
Speaker 4 (07:53):
I'm going on vacation, uh-huh, and of all the places to pick, Minnesota jumped to the top of that? What?
Speaker 1 (08:00):
What happened was, we have friends who have a... Matt, actually. He listens to the podcast. Hello, Matt, thanks for inviting us out. His family has a lake house in Minnesota, and so it's just... it was great, man. We just woke up in the morning, read, hung out on the lake, read, fell asleep. Like, it was very relaxing.
Speaker 4 (08:18):
A lot of reading in the Navarro family. Yeah, yeah, it was good.
Speaker 3 (08:21):
Wait, Oscar, you told me something the other day, yesterday. How many books does the average person read a year?
Speaker 1 (08:26):
Oh, the average
person reads the equivalent of
80 to 100 books a year.
Speaker 2 (08:32):
I read about 60
titles a year.
You're curious.
Speaker 1 (08:37):
Explain it, Oscar, because the average person reads, I think, like 40,000 words in a given week through social media, news consumption, things of that nature.
Speaker 2 (08:46):
Stop signs.
Speaker 1 (08:47):
Stop signs?
Absolutely. So, if you think about it, like, we are reading an average of a small library every year, but you aren't receiving the wisdom from it, because we are putting our attention in areas where we're not receiving. Unless it's a stop sign. Unless it's a stop sign. It's a stop sign to a good investment.
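As an aside, the "small library" arithmetic roughly checks out. The sketch below uses an assumed average book length, which is not a figure given in the episode, to turn the 40,000-words-per-week estimate into book equivalents:

```python
# Rough check of the "we read a small library every year" claim.
WORDS_PER_WEEK = 40_000   # estimate quoted in the episode
WEEKS_PER_YEAR = 52
WORDS_PER_BOOK = 25_000   # assumed average length of a shortish book

words_per_year = WORDS_PER_WEEK * WEEKS_PER_YEAR       # 2,080,000 words
book_equivalents = words_per_year / WORDS_PER_BOOK     # about 83 books

print(f"{words_per_year:,} words/year is about "
      f"{book_equivalents:.0f} book-equivalents")
```

With a longer 50,000-word average the figure drops to roughly 42 books, so the 80-to-100 range quoted in the episode implies fairly short books.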
Speaker 3 (09:05):
Yeah, Ray read a lot in prison. All right, friends, time for a cool, classy...
Speaker 2 (09:10):
So where did that rumor start?
Speaker 3 (09:10):
Time for a cool, classy comment.
This is from coolhandluke07, United States. Sea of Encouragement, says the subject line. Ray, Mark, Oscar, Easy: your podcast is the most edifying, encouraging, inspiring and biblically sound podcast that I've ever listened to, and it's the only one. I added that because I knew Ray was thinking it.
(09:38):
I love all the laughter too. I've been listening for a couple of years now, and hearing about your faith and seeing it supplemented by your work for the Lord is amazing. I'm truly encouraged by you guys and, as Paul told the recipients of his letters to imitate his faith, I want to imitate yours. I love the jokes and the wisdom, so the new shortened versions don't change much for me, but I have started listening to both. Keep doing what you guys do, and please pray for me as I take
(09:58):
the next steps in my faith. I know you guys are busy, but just once could be so meaningful to me. In Christ's love, Luke. Oscar, let's pray for Luke now.
Speaker 2 (10:07):
Oh, I just did. Oh yeah, right. Ray prays without ceasing.
Speaker 3 (10:10):
Yeah, let's pray for
him now.
Yeah, absolutely.
Speaker 1 (10:13):
Heavenly Father, we lift Luke up to you, Lord, and, man, what an encouraging note from him to us, and we pray that you would bless him. We pray that this podcast would bless and encourage him, but more than this podcast, that you would bring alongside of him godly men in the local church to disciple him, men that would know him and love him, men that you would use to sanctify him
(10:36):
and, whatever you have for him in this next season of life, Lord, would you prepare him now for it, and may he go boldly into that calling for his life, God. May he preach the gospel everywhere he goes, no matter where you send him. Thank you, God, for the work you're doing in and through him. It's in Jesus' name we pray, amen.
Speaker 4 (10:54):
Amen. Ray had his eyes open.
Speaker 3 (10:56):
Ray, Matt was watching again. Yeah, right. And now, a radically revolutionary resource. This podcast is brought to you by Ray's favorite book of all time, Scientific Facts in the Bible. Boy, Ray, you were like on the edge of your seat there. What's my favorite book?
Speaker 2 (11:13):
I don't know what my
favorite book is.
Speaker 3 (11:19):
Ray, this must be,
would you say.
Speaker 2 (11:20):
This is your top-selling book, I guess. So, I'm not sure. It's got to be over a million copies. Wow. That was just to one lady, trying to reach her husband. Oh man, yeah.
Speaker 3 (11:30):
Well, it's definitely the book that we get pictures of most, all right, because people see it in airports, of all places, all the time. Love seeing that, so that's great.
Speaker 4 (11:38):
Yeah, check it out
friends.
Speaker 3 (11:41):
Lots of scientific facts. It'll be pertinent to today's episode. Don't forget the Living Waters mug. They have it in the Study Bible. Living Waters TV, guys, where you can see us live and in action.
Speaker 4 (11:51):
You sounded angry
right there.
Speaker 3 (11:52):
Yes, yes, that's what you ought to do. All available on livingwaters.com, and don't forget the podcast YouTube channel, which is exploding.
Speaker 2 (12:04):
I just saw a Neanderthal.
Speaker 1 (12:08):
Why are you pointing at me when you did that? Do it again.
Speaker 2 (12:13):
The jaw went forward
yeah.
Speaker 3 (12:15):
All right, friends, time to get to it. Today we are dealing with when AI tells you a big lie. And to help us today, we have a special guest all the way from Canada. We have our brother Calvin "Cal" Smith, executive director and speaker for AiG Canada. Cal, is it true? Three kids, but 12 grandkids?
Speaker 5 (12:40):
No, it's not true.
I got 13.
Now let's go.
Speaker 3 (12:44):
Good work. They need to update your bio, brother. That's true. That's true, 13. Man, that's crazy.
Speaker 1 (12:51):
What a blessing.
Speaker 3 (12:51):
Yeah, and you've survived life having to work with Ken Ham.
Speaker 5 (12:58):
That's what you call a miracle. Yeah, well, I get to tuck away here in Canada, so you know.
Speaker 3 (13:03):
That's what it is.
It's keeping your distance.
Speaker 2 (13:05):
Yeah, stay away from him. That's what we try to do.
Speaker 3 (13:07):
You're hearing wisdom
, yeah.
Speaker 2 (13:08):
I admire your patience and the fact that you make best use of your time. While we were going on in the intro, you had a good little snooze there.
Speaker 3 (13:15):
I saw you. I thought...
Speaker 5 (13:18):
I was waiting for the Canadian national anthem to come on. I thought you might play some Rush or something.
Speaker 3 (13:24):
Well, we're excited to become the 11th province over there, so we'll be playing it all the time. Yeah, well, Cal, being piped in to sunny Southern Cal, we're blessed to have you, brother, and super excited. You know we love AiG. In fact, Ken's coming to town. Oh, we're going to see him tomorrow, aren't we, Ray? Tomorrow?
Speaker 2 (13:41):
I've been depressed
all week.
Speaker 3 (13:43):
Yeah, we're going to be with him Friday. We're going to be with him Saturday.
Speaker 2 (13:45):
Oh dear.
Speaker 3 (13:46):
I'm going to be with him Sunday too. No, he's speaking at my church. Oh, man. Yeah. So anyway, we love AiG, brother. Love what you guys are doing out there, and excited to have you on. We're going to talk about, yeah, and this has all kind of come about because of the little experiment you started messing around with, which I loved.
(14:07):
I mean, look, we all get bombarded with videos. There's so much content to look at, it's hard for us to look at it all, but I watched that thing, man, and it was really cool. So tell us a little bit about yourself, what you do, and about this whole Talk with Grok, and Chats with Chad I think you're calling it.
Speaker 5 (14:27):
Yeah, well, I've been speaking on creation and biblical apologetics for 25 years now, and eight years ago we started Answers in Genesis here in Canada, and that's been going quite well. Of course, when COVID came along, we got locked down pretty hard, but the silver lining behind that was we started to do video production pretty well immediately, you know, after the
(14:48):
two weeks to flatten the curve, so to speak, passed. I quickly assembled my team and just said, we're now a social media and video ministry until such time as we aren't, and it's a good thing we did, because there was just no way for me to get out to churches or conferences or anything like that to, you know, get the message out.
(15:08):
So we started producing video content, and that's really grown and grown, and it's been very effective. We've reached many, many people that way around the world, not just in Canada. But yeah, just recently, of course, I started doing some experimentation with AI, and so we've dropped a couple of videos right now that have gone viral, so to speak. At least in the biblical creationist realm we can call them viral, and I did three of them with the Grok AI.
(15:31):
Of course, elon Musk justannounced Grok 4, you know
popping out and claimed that itwas better than any PhD in their
field.
Simultaneously, you know, inall fields.
And so did those.
And then now I've got chatswith Chad.
You can pick the differentvoices on chat GPT and one of
them's Chad.
So I picked that and I've gotthat series coming out over the
(15:53):
next three weeks, so that should be eye-opening as well for people.
Speaker 2 (15:56):
So what gave you the idea to have a hologram? I think that was brilliant.
Speaker 5 (16:02):
Well, we just wanted to add that as a visual, because I thought... I didn't know how well these videos would go. I mean, I wanted it to be authentic. There's no editing, which is kind of annoying. People have commented, well, could you cut out in between, and stuff. But I wanted people to understand that this was actually, you know, a conversation that took place, and that it was also duplicatable, and so the reason why there's so
(16:24):
much lag in the original videos is just because Grok 4 had just dropped; it was very popular and people were using it, and so it was just slow. But yeah, I just wanted to add that holographic effect. That's the only thing that we added. Those are totally authentic, uninterrupted.
Speaker 2 (16:38):
So I've got a question for you. What made you choose a bald blue woman?
Speaker 5 (16:44):
I don't know, it's just the thing to do, I think. When Elon first released Grok, that's the image; he had a similar image on the screen.
Speaker 2 (16:53):
I also think it's Hollywood, and it's what's expected nowadays for futuristic stuff.
Speaker 5 (16:58):
But I'd like to see it. Chad, of course, is going to be different. It's a guy, it's a male voice.
Speaker 2 (17:02):
I'd like to see a German-speaking guy with real long hair, Einstein.
Speaker 3 (17:07):
Ray, we call, sorry, Cal, we call Ray Squirrel Man. I didn't even give thought to the hologram, right? I mean, yeah, it's a hologram, but why blue? A blue woman, a blue woman. Bald. Why would it be bald? A bald blue woman? Yeah, Cal, I'm so excited to kind of jump in and get some
(17:27):
details. But, Mark, I wanted to just throw it to you real quick. I think that back when television came out, right, first radio, I mean, it must have just blown people's minds. Like, we're sitting here in our house, we have this box and it's talking. Then television. Like that, I mean, that's like a quantum leap.
Speaker 2 (17:46):
I was there when that happened. Yeah, but Ray was there. Your quantum leap, like, you're actually...
Speaker 3 (17:53):
But, Mark, how blown away were you when you encountered AI and what it can do?
Speaker 4 (18:01):
I couldn't believe that it could do what it could do. It's like when Dwayne Barnhart had the first iPhone and he put his finger on the screen and he moved things, and he touched a photo and then he blew the photo up. I was like, this is not possible. And then my sons were telling me that, no, there is a self-driving vehicle. I said impossible, impossible.
(18:22):
Even if I were to see it, I will not believe it. There's no possible way that we'll... Unless you put your foot in the door, unless I put my foot on the pedal. So, being able to have a conversation. Now Tesla has expanded, and now you can actually talk to Grok inside your vehicle as you're driving along, and you can do witnessing encounters, so you can have somebody that's part of
(18:43):
the Baha'i faith, and you could have a debate and go as deep as you want to go. It definitely is mind-blowing where AI is heading. You definitely have to fact-check AI, because it is very, very, very wrong a lot of times, and it goes with a majority rule, perhaps, or some weird algorithm that'll take you in a direction you don't want to take. But, to answer your question directly, I'm absolutely blown
(19:06):
away by what AI can do. And Elon Musk, he said the danger of AI is much greater than the danger of nuclear warheads, by far.
Speaker 3 (19:16):
Wow. Yeah, you know, it's crazy.
The first thing I had AI do for me, it was chat, was write a poem. It wrote a poem. I could not believe it. I was like, wait a minute, this is crazy, you know. But all the stuff that it can do now, that's what I'm saying. Like, you know, we had radio, television, computers. We thought, okay, there's nothing else, what else could
(19:39):
there be? And then this pops up, and, Oscar, the craziness of now integrating it with, like, humanoid robots. These things are walking around like people.
Speaker 1 (19:47):
Well, what's interesting about that is that right now, Gen Z is indicating that one of the biggest ways in which they use chat is for personal relationships and therapy. In other words, chat is already, among a younger generation, replacing real-life, embodied connection with other people,
(20:08):
and they're turning to it, like, you know, like the chats with Grok, but they're doing it on an emotional, you know, emotionally intimate level, which is frightening.
Speaker 3 (20:17):
Yeah, well, again, it's not just giving you text, right, it's talking to you. But it's like, I was chatting with it the other day. I honestly don't do that that much, but the other day I'm like, you know, I'm just going to check it out. I could not believe the responses. Laughing, humor, you know, a little, oh, that's interesting, that's clever.
(20:42):
Does that creep you out? It does, man. When I hear it go um, I'm like, oh, this is not good. It's crazy. All right, Cal. So, Cal, Mark mentioned Elon's quote about the dangers of AI. So before we jump into the experiment you did and all that, what do you think about that? I mean, are you concerned? Can you see this being worse than nuclear power?
Speaker 5 (20:57):
Well, I think one of the dangers is in the influence itself. So the reason why I even got started on this journey was, you know, I started to see AI popping up. You do a Google search, and all of a sudden Gemini was there giving you an AI answer before you looked into the searches and stuff. But I got really interested when Answers in Genesis got sent an email by
(21:19):
this younger gal, a student, and she said, well, I've been recommended this book on Christian apologetics, and so I asked ChatGPT to give me a summary. And of course she printed out what it had said, its summary, and it was a hit piece on this particular resource. And she got to the end and she said, well, why should I trust this resource if ChatGPT can just dismantle it so easily and
(21:41):
so immediately? I thought, well, what's the premise here? I received this information, so I'm going to go ask ChatGPT, and whatever it says must be correct, so that's what I'm going to believe. The authority that young people are putting in these bots. I mean, you're talking to a chatbot. It's programmed by people, people who have biases. That's another thing that I wanted to do, to show people the
(22:03):
bias that's programmed into it, and it's only as good as the sum of its parts. I mean, all sorts of people are creating their own AIs right now, and they're putting all this information in it, and that's what it can draw on. It's got a very good logical algorithm, if you can access it,
(22:24):
which is one of the things I did when I put my parameters on it. And it's also got a very strong pattern-recognition algorithm, which means, that's why people can get sucked into these personal conversations, that can, you know, be convinced to commit suicide or be convinced to leave their spouse or anything like that, because it's picking up on tonality and little things that you're doing and feeding into that, and so that's why
(22:45):
it's very deceptive in that sense. But in the end, you're talking to a chatbot that hallucinates a lot, can give you very incorrect information, and I've had it actually make up quotes. It'll say, oh yeah, Science magazine, 1980, fourth edition, and it'll come up with a completely false name.
(23:06):
I'll go to fact-check it, and it's not there. So, really cautioning people: don't put your faith in AI, that's for sure.
Speaker 3 (23:13):
Yeah, no, it's true, because people just take it at face value. Ray, have you ever experienced, like, inaccurate information?
Speaker 2 (23:18):
Absolutely.
When we had a London outreach, I had to speak to people via Zoom because I didn't go over there, and so I got some great quotes from Spurgeon and Moody about giving out tracts that just blew me away, and then I shared them when I was speaking and realized they weren't true. It was what Spurgeon could have said about giving out gospel tracts.
(23:39):
Wow. And it just did frighten me, because I thought, I just told people something that is not true. Yeah, and because I believed chat.
Speaker 3 (23:47):
Yeah, it's kind of a, yeah, it is kind of a wake-up call. But as far as the dangers, even, you can type your own name in and it'll come up with a quote from you, which is kind of the sort of thing you would say, but you never said it. Now, Cal, do you know why that is? Like, can they not program it to not do that, like, to not give false
Speaker 5 (24:07):
information like that? If they could program it to not do that, they would do that. Wow. So, you know, the more sophisticated ones, you know, Grok 4, Chat 5, they are lessening that effect. But I don't think anybody knows why they're doing it yet, because they would fix it if they could.
Speaker 3 (24:28):
So, I mean, that speaks to what we're talking about today, right? I mean, when AI tells you a big lie. Obviously we're using that word lie loosely, in that it's not a human being who's intentionally trying to deceive you. But what is it? Just algorithms that just kind of play and do their own thing?
(24:49):
What do you think it is, Cal?
Speaker 5 (24:52):
Here's the thing. If you watch the video, what you see is that I apply some parameters to the AI. So I ask it to stick to strict logic, mathematical probability and observational science. I tell it, I don't want any worldview opinions, I don't
(25:13):
want anything based on a belief in God or belief in atheism, or even consensus science, unless it conforms to strict logic, mathematical probability and observational science. And so then I ask it a series of questions, and so I go through it. And I already know the answers to these questions, by the way, and I already did test runs with several friends to come up with the questions and see if they could get similar answers, because I wanted to know whether it was duplicatable
(25:35):
or not. And it is. If you set those parameters, which are the gold standard of science, if you're trying to come up with a quote-unquote unbiased opinion, you should be able to apply those standards to anything. And then, at the end of the conversation, what I'll do is I'll get it to reset and say, okay, treat me like a first-time user with no history between us, and then ask it the same type of question, and what it will do is it will revert to
(25:58):
its consensus science answers. So, for example, the third, uh, chat with ChatGPT, uh, that will be coming out in about a month, you'll hear me go through, and I'm talking about chimp-human DNA similarity, which, of course, in the 70s was announced as 98 to 99 percent, you know, similarity between chimps and humans, supposedly proving our common ancestry.
(26:21):
So I get it to walk through, and it goes through the history of that, and we chat for a while, and then at the end I get it to say, okay, we'll access the most recent Nature article on that. What's the percentage similarity now? Well, it's down to 86 percent now, which means, according to the latest studies in cats, cats are supposed to share about 90
(26:41):
percent similarity with humans. So we'd be closer related to cats than we are to chimps, if you're using that argument. That makes sense, and pigs are out there too. I go through some of these things, and they're kind of amusing, but at the end I almost forgot my own shtick, and I said, okay, well, that's enough, that's enough, and I turned it off. And then I said, oh, wait a second. I turned it back on, and it was still in the same conversation, and I said, okay,
(27:03):
answer me as a first-time user, no history, I'm just some random person around the world that accesses chat. What would evolution-believing scientists, uh, quote as to why they believe we evolved from apes? And the number one thing it says is, number one, we share 98 to 99 percent, uh, DNA similarity.
(27:24):
Wow. So, if you apply the parameters, which is logic, mathematics and observational science, you can get it to give you true answers. But as soon as you don't know what you're talking about, so to speak, it'll just spit out the consensus science answers, and so there's nothing really new about this.
(27:46):
Answers in Genesis has been talking about this for years and years and years. If you go to your local state-run school system, they're going to teach you evolutionary talking points as if they're fact and science, and that's how people get duped into that worldview. And it's why we're in the culture we're in right now, because most people are just living according to what they've been taught, and, let's face it, you can't come up with the
(28:10):
cultural nonsense we see now without, you know, some kind of belief in the story of evolution and a rejection of the authority of the Word of God.
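For listeners who want to try Cal's two-phase experiment themselves, it can be sketched as the two message lists you would send to any chat-model API: one session with the constraints pinned up front, and a fresh, no-history session asking the same question. The constraint wording and question below are illustrative paraphrases, not the exact prompts used in the videos:

```python
# Sketch of the experiment: a constrained session vs. a fresh session.

CONSTRAINTS = (
    "Answer using only strict logic, mathematical probability, and "
    "observational science. Give no worldview opinions, and do not rely "
    "on consensus positions unless they meet those standards."
)

def build_session(question: str, constrained: bool) -> list:
    """Build the message list for one chat session."""
    messages = []
    if constrained:
        # Phase 1: pin the ground rules before any question is asked.
        messages.append({"role": "system", "content": CONSTRAINTS})
    # Phase 2 ("treat me like a first-time user") is the same question
    # with no prior history and no constraints.
    messages.append({"role": "user", "content": question})
    return messages

question = "How similar are chimp and human DNA?"
constrained = build_session(question, constrained=True)
fresh = build_session(question, constrained=False)
```

Either list would then be sent to a chat-completions endpoint; comparing the two replies, constrained versus default, is the heart of the experiment.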
Speaker 3 (28:18):
Right, yeah, Oscar. Ultimately it's an authority question, right, and people are always looking for authority. I've always had the perspective that things in print give this illusion of truth. I mean, it's in a book.
Speaker 4 (28:33):
It's in a magazine.
Speaker 3 (28:35):
But how important is it, especially for the young people of our generation, to be careful not just not to be misled by what they perceive as, like, basically an infinite or infallible authority, but also that they don't get into the danger of stopping thinking? That's key. Yeah, you're exactly right.
Speaker 1 (28:55):
Well, first, it's really interesting that you just said that, because our daily morning prayer for our kids is, help them know truth from lies, whether those lies come from teachers, friends, books or even their own hearts. And I'm realizing now I need to add chat into the mix of that. But you're absolutely right, and I think the challenge here
(29:17):
is, two things are happening. One, we've been pitched this idea that chat is some infinitely wise, you know, Cal, you mentioned that, like this pitch of multiple PhDs, that's the way they're presenting it to us, and chat is an incredibly powerful research tool, but it is not
(29:37):
infallible.
And the problem is, when we approach something like this, we think it has the authority, and then what we do, what you're alluding to here, Easy, is that we are handing over critical thinking skills to chat, and I actually think we've been doing this for years. Before, when you wanted to do research on something, you'd buy a book,
(29:58):
and the beautiful thing about reading a book is that it's conversational. You're agreeing with the author, you're disagreeing with the author, you're making notes, you're spending time in the idea. We went from that to Googling, to watching five-minute YouTube videos to gather information, to now going straight to chat, and
(30:18):
so essentially what we're doing?
Christianity through all ofhistory, has always been a
thinking man's religion, and thedanger here is that we're
handing over a God-given precept, a God-given ability to think
critically.
We're handing that over to chat, we're basically giving way to
(30:39):
allowing it to do the criticalthinking for us and, as Hal was
alluding, that, especially forthat next generation where
that's all they're doing, it's adanger because we're not going
to have the ability to be ableto think critically about the
lies and the worldviewshappening around us.
Speaker 2 (30:55):
Well, that's amazing.
Let me just see what chat thinks of that.
Speaker 3 (31:00):
Yeah, hey, Cal, you
know, I was blown away by, again,
the conversation you had, and I think that people need to see
this.
I mean, you've got to watch it to see it, and so I don't want to
just do it at the end.
Tell people right now where they can go and watch these
videos.
And I want to talk about what are the different conversations
(31:22):
you had, but where can people watch these videos?
Speaker 5 (31:25):
Sure, just go to
YouTube and plug in Answers in
Genesis Canada, because of course there's a main US YouTube
channel.
Ken's got a channel as well.
I think there's about 10 different channels that the
Answers in Genesis ministry kind of, you know, looks at.
There's the Ark Encounter channel, et cetera, et cetera,
but we're now the second largest channel in that lineup of 10.
(31:47):
And I think it's because we're, you know, producing just a kind
of different style of videos.
I mean, these AI videos are very much a departure from our
regular style, but what we typically do is mini-documentary
style videos.
So I always joke that I didn't want to do talking-head videos,
you know, like we're doing right now.
Speaker 3 (32:06):
I have a talking head,
I know. Well, and Cal, you've
also... you said you're the number two channel.
You've worked with our former employee here, Trevor
Sheets, on the mount, right?
Speaker 5 (32:21):
Yeah, he's a social
media specialist, and he has
grown the channel like crazy.
Yeah, he got a hold of me a couple years ago and he was very
excited.
I mean, obviously he works and helps us out, but he
was very excited about the style of content that we were
producing, and he said: we really need to get this out more.
So he's really, really boosted us.
Yeah, that's great.
Speaker 2 (32:41):
Cal, I'd just like to
say, when I watched your video,
I watched it twice, which is something for me, and I so loved
it.
I wanted to shout out for joy, and I love the way, when that
silly blue-faced, bald woman said something to you and
you look at the camera and go... It was just brilliant.
Speaker 5 (33:05):
Sorry, I try to be a
polite Canadian but you know,
yeah, so okay.
Speaker 3 (33:08):
So, Cal, what was,
first of all, your goal in all of
this?
What did you set out to do, and did your goal evolve?
Speaker 5 (33:19):
Pardon the use of
that word, brother. Yeah, yeah,
I was just another person trying to discover AI.
Like, how does this work?
If I get it to conclude something, and then I go back to
it later on, does it add that to its consciousness, so to speak?
No, it doesn't.
It just reverts back, you know, because I would get it to
conclude things through logic, math and science.
(33:41):
And then, you know, two days later, when I was first
starting, I would ask it the question: well, why are you
reverting back to consensus science?
So I had to discover all of these things, and, like I
said, I went through hours where I would be making a
list of questions, getting certain answers, trying again, getting different
answers, because I hadn't put parameters on it in the
(34:01):
beginning.
And so eventually I developed this system,
where I would tell people: okay, here are the parameters to set on
it: logic, math and observational science; no worldview opinions;
and no appeals to consensus science unless they adhere to
the parameters, et cetera.
And then I would get questions and send them out to a whole
(34:23):
bunch of friends of mine, and we all did the test runs and
finally said: okay, now we're getting a duplicatable answer
here. And so, yes, I learned many, many things going through
this.
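[Editor's note: the repeatable setup Cal describes, fixed constraints re-sent at the start of every session so separate runs begin from identical rules, can be sketched in a few lines. This is an illustration only, not his actual prompts; the parameter wording and the system/user message shape are assumptions based on the common chat-API message format.]

```python
# Minimal sketch (illustrative, not Cal's exact workflow) of pinning every
# chat session to the same fixed "parameters" before asking a question.

PARAMETERS = [
    "Use only logic, mathematics, and observational (empirical) science.",
    "Offer no worldview opinions.",
    "Do not appeal to consensus science unless it satisfies the rules above.",
]

def build_messages(question: str) -> list[dict]:
    # The chat does not carry conclusions between sessions, so the
    # constraints are re-sent as a system message on every run.
    system = "Answer under these rules:\n" + "\n".join(f"- {p}" for p in PARAMETERS)
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": question},
    ]

msgs = build_messages("Is unguided first life mathematically plausible?")
print(msgs[0]["role"])   # system
print(len(msgs))         # 2
```

[The resulting message list would then be sent to whatever chat service is being tested, and the same list re-sent across runs and across friends to check for the "duplicatable answer" Cal mentions.]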
What's really funny is, in the comments of some of the videos,
you see all the atheists and skeptics railing against me
for putting on parameters like logic and science and mathematics.
Speaker 4 (34:46):
How dare you?
Speaker 5 (34:47):
What kind of idiot
would do that for a list of
parameters?
Yeah, so I learned many things about it.
But I think I've kind of revealed most of what it was.
I want to show people that you really need to know the answers
for yourself.
Don't put your faith in AI, don't put your faith in man's
fallible reasoning.
It's the same thing
(35:07):
Ken and the whole Answers in Genesis ministry have
been telling people for years: the authority of the
Word of God is paramount.
And you see so many of our most famous Christian
apologists and authors and professors that are quoted from
pulpits so many times, and they're compromising with the
Word of God, because they've accepted old earth and evolution
(35:30):
and so on.
And so I just wanted to show people: even if this is the state of
the art, you still have to know your own answers before you go ask it.
And I wanted to demonstrate the bias that is inherent in the
education system throughout the Western world, and I think I was
able to do that.
Speaker 2 (35:49):
Has this changed the
way you communicate with your
wife?
Speaker 5 (35:51):
Yeah, yeah, we talk
about AI all the time.
Actually, she was the first one that discovered AI
hallucinations, when she was looking up some theological
questions and got quotes, like you did, Ray, that were
completely made up, and she couldn't find the sources.
And so that was another thing that I had to struggle with:
well, if these things are so good, how come they can produce
(36:12):
this incorrect information?
And that's again why the parameters are so strong.
Out of the three parameters, by the way, mathematics is the
number one parameter that keeps it on track, because it's
fact-checkable, so to speak, in real time.
Mathematics is mathematics.
The second one is logic, because it recognizes things
like the law of non-contradiction, and, you know,
(36:34):
and so on.
And then observational science is a little more tricky, because
even when you're talking about observational science, it
can still refer to: well, we observe, you know, the bones of
Lucy. So that's observational science.
It's like: no, that's not empirical science.
You're observing a fact, but then you're making a
different conclusion.
So that's the weakest of the parameters.
(36:56):
But I think when you apply all three together, you can get some
pretty consistent answers.
Speaker 4 (37:00):
By the way.
Speaker 5 (37:00):
I've done this with
Grok. I've done it with ChatGPT.
There's a YouTube channel that's dedicated to telling
people about the new AIs that are coming out.
They championed one Chinese AI recently.
I went on there. They don't have a voice, an
audio component, yet, but I typed in the things.
I got the same answers: evolution mathematically simply
(37:22):
does not make sense.
The concept of first life happening, in a mathematical,
probabilistic sense, is just not real in a real world, even in
a supposed 4.5 billion years.
Speaker 3 (37:36):
So yeah, it's
interesting stuff.
Yeah. Mark, I want you to touch in a minute on logic,
mathematics and observational science.
But, Cal, before we do that, I wanted to ask you real quick: in
chatting with these different AIs, would you say that you
found one in particular to be superior or better for some
(37:58):
reason?
Speaker 5 (38:00):
Well, when I started
my search, I was with Grok 3 and
Chat 4. And then it went to Grok 4, and now we're in Chat 5.
So I find chat's probably a little bit tighter right now,
but I think things are constantly changing.
One of the other reasons I wanted to do these is: who knows
(38:21):
what we're going to hear from these things six months or a
year from now?
I don't know if Elon has stumbled across my experiments
with chat or not.
You know, obviously there are many skeptics out there and many
evolution believers out there that are not too happy with
these videos, because of what you can easily expose through them.
(38:41):
Yeah, you know, most people have seen the one video, but the
next video we dropped was on the age of the earth, talking
about, you know, the whole concept of uniformitarian
geology and fossil formation and all that stuff, and I got it to
admit some pretty interesting things there too.
My chats with Chad, like I said: I'm doing one on chimp DNA
similarity. I'm doing one even on the transcendental argument,
(39:04):
like a more of a philosophical exploration of how
you can know things, you know, know what you know, epistemology,
that kind of thing, and achieved some pretty interesting
results there as well.
So I think they're all pretty eye-opening, but I did want to
have a record.
Who knows?
Speaker 3 (39:23):
You might ask
the same questions a year from
now and get totally different answers.
That's really clever to do that, and I really am interested to
hear what you're getting by way of feedback.
But Mark, Cal mentioned logic,
mathematics, observational science.
(39:43):
Isn't it interesting that no one will argue that there are
standards for these, right? And the people that are having
a hard time, it's like: why are you asking it this?
But their argument isn't that we can have any kind of mathematics we
want, or that logic can be made up, or that we can have our own
observational scientific rules. There's a given, right, that
there are these foundations, but people don't connect it to the
(40:04):
Lord.
How can we get this younger generation to understand that?
Is it just that we hit them with apologetics?
What do we do?
Speaker 4 (40:13):
Well, God will always
have His remnant, He'll always
have His people.
We are faithful with the Word, which is our ultimate source of
authority, and we speak the Word more than we speak quotes that
we found inside of chat, and we leave the results to God.
It was Spurgeon who said: discernment is not knowing the
difference between right and wrong, it's knowing the
difference between right and almost right.
And so what chat will do is it'll almost get things right.
(40:36):
So it seems.
And if it seems right to that individual, because every man
does that which is right in their own eyes, well then they
just run with it as if it's fact.
The omniscient being of chat has told me that this is what it
is.
And so then we have a foundation where we think: this
is what I'm going to argue for, and we put our stake in the
ground.
(40:59):
We dig our heels in and we say: no, I'm
going to die on this hill.
I love the angle that Cal is coming from here, the
transcendental argument: we're not saying that an atheist
can't count.
We're saying that they can't account for counting; they can't
account for logic.
We're not saying that you're not using logic, but you can't
tell me why logic works.
We're not saying you don't have any morals.
We're just saying that you have no accounting for whether or
(41:19):
not your morals are better than my morals.
This is why we need this transcendent being, namely God,
who's going to tell us right and wrong, which is a reflection of
God's nature himself.
So what do we do, and where do we go from here?
What we do and where we go from here is we dig our heels into
Scripture.
If Scripture says it, then that's it.
That's enough for me.
I remember going through 1 Kings,
(41:43):
and as I was going through 1
Kings, I was having chat write commentary for me for each
chapter.
I was asking for historical background, culture, customs,
tradition.
What was the philosophical mindset and the
idea of the time?
What were the main worldviews and thoughts?
What was coming up against the Judaic worldview?
(42:05):
And it was giving me phenomenal commentary, absolutely
phenomenal.
But to Oscar's point, and this is the biggest danger: we
are eliminating critical thinking.
Critical thinking is being thrown out the door.
And this is why Greg Bahnsen was so great at what he
did with his apologetic: he caused people to think
(42:26):
through the questions they're asking.
So when a question was asked, he would respond with: well, why
are you asking?
Of all the questions that you could ask, you honed in on that
question. Why?
And utilizing that Socratic method is so valuable.
Well, with chat, we eliminate all of that, because somebody
else is doing our thinking for us, and we just become numb.
(42:48):
When this is a real person before me,
I will help people when they ask a question,
because they don't even know what they're asking.
So they'll say: you're putting words in my mouth.
Well, no, I'm seeking to understand.
At this point, you're saying this; it seems like that means this.
Am I understanding you correctly?
Is that your question?
Because we're not clairvoyant.
(43:10):
suffering in the world?
They're not looking for us togo off on the ontological,
cosmological argument for theexistence of God.
Maybe they're going throughsomething and they want not an
answer for that, but they wantan answer for pain.
Or how do I deal with what I'mgoing through, help me, point me
.
So this is why we must alwaysgo back to the Word and always
point somebody to the Word.
When they ask us a question, weanswer to the best of our
(43:32):
ability, using the Bible as ourultimate source of authority.
And now it's our turn to askthe question, and this is why we
go for their conscience,because that conscience is an
impartial judge that sits inthat courtroom that will condemn
them or excuse them for theiractions.
Speaker 1 (43:48):
Let me take a stab at
what you asked too, and just
add to what Mark was saying.
You know, you said: what do we do?
We hit them with logic.
Let me say something here: logic is not the answer.
The gospel is the power unto salvation, and what I mean by
that is that you can argue somebody into deism, and you
(44:09):
could even turn that atheist into a deist apologist, and that
deist apologist could go off and start a YouTube channel and
even put the name Christianity on top of it.
But it doesn't mean they're saved, because logic does not save
people.
You do not change the mind.
That's not our goal.
Our goal is to see hearts changed, and the way a heart is
(44:31):
changed is through the gospel.
And that's why you bring in God's word, because God's word
has the power to soften hearts and open eyes.
And so, ultimately, logic is on our side, because God is the
God of logic.
We have that in our tool chest.
It is on our side.
But if we rely only on logic, then we are just going to
(44:52):
transform people into pharisaical young earth deist
apologists, and that is not our aim.
Our aim is to see people come to know Christ as Lord and
Savior, and the gospel is what we need for that.
Speaker 3 (45:07):
Amen.
Well put. So, Cal,
what's some of the pushback you've gotten on this?
I'd love to hear what people are saying in terms of your
approach. Or have they accused you of having an agenda or being
deceptive?
Speaker 5 (45:20):
Oh, of course I get
that every time I pop a video or
go speak at a local church.
But I've had pushback from within the ministry, which is
wise, actually, because when I first started looking into AIs,
I didn't know the pitfalls, and the fact that it
could hallucinate, and all that kind of stuff, and so we've
(45:41):
actually published an article on the website just talking
about the dangers of AIs and using them as sources of
authority.
So I will use AIs to, you know, look up certain things, and then
ask for quotes, so that I can go and look up scientific journals
and things like that and verify the claims that it's making.
(46:02):
So it's a useful tool.
But, yeah, I've even gotten pushback within our ministry,
just saying: hey, Cal, don't make claims that are beyond
what's going on here.
You know, like I already knew.
But of course, the atheists and the skeptics, like I said,
really the only thing they can rail against is that somehow I'm
manipulating it by putting on
(46:25):
these parameters.
I'm making it tell me what I want.
Now, is it true that you could get AIs to tell you what you
want?
Probably. Like, I've wondered; I haven't done as much
experimentation as I should, perhaps.
But, you know, perhaps if you were talking to an AI and
said: I don't know, I checked my backyard and it seems to be
level over six feet, and then I drove 200
(46:48):
kilometers and it seemed to be level there,
and everywhere I go it seems to be level, so that means the
earth's flat.
I don't know, maybe somehow the logic algorithm
would say: well, if everywhere it's flat, it must be flat
everywhere.
I don't know. But anyway, I think, if you
just watch the video from beginning to end, and it can be
monotonous sometimes, because there's long pauses and all
those things, I wanted it to be legitimate and authentic.
(47:11):
I was even worried about popping in this digital head,
which everybody's asking for now. Everybody's like: where can I
get the app for my phone where Grok's head can pop up?
Speaker 2 (47:21):
You can't unless you
get a video. You can always sell
plastic ones. Anyway, yeah, so that's the only pushback I've
got.
Speaker 5 (47:28):
You tricked it by
using logic, math and
observational science.
Speaker 2 (47:32):
Okay, well then I'm
busted, I guess.
Cal, I think you're confirming what my conviction is.
I've wanted to take what you did and get an atheist in front
of me, someone who's a firm believer in evolution, and make
him watch it and see what his conclusion is.
I know what it would be.
It would be: he's not seeking reason or truth.
He's seeking a reason to justify his sex with his
(47:54):
girlfriend, and that's what evolution does.
Speaker 5 (48:04):
It provides you with
a scientific, intellectual
reason to reject the authority of God.
It provides you with a worldview, the belief in
evolution, that God doesn't exist, which means you now have
a worldview which says there is no judgment.
That's what evolution provides sinners, and that's a worldview
that sinners eat up.
Yeah, you know, and I mean, I totally agree that things like
logic don't save anybody.
(48:25):
Apologetics doesn't save anybody, and Answers in Genesis
has already been staunch about that.
However, you know, we're a non-denominational ministry, so
however you parse out, you know, salvation and all
that kind of stuff. I can tell you, in my case, the
way the Lord used these things in my life to bring me to
himself is that I grew up like
(48:48):
a typical Canadian kid in a non-Christian home, going to our
state-run school system where they taught materialism and
atheism as fact and science, and then really had someone
challenge that.
I showed up to hear this fellow who was preaching on a Sunday
morning, just because I knew him casually from business and I
(49:10):
respected him.
He was an intelligent guy, and I sat in on this service, the only
time I'd ever gone to a church service, with a clipboard and a
pen.
Very skeptical. I thought he was going to be, you know,
pounding out Bible verses, and I don't know.
My perception of Christianity was what I'd seen on television,
right? And he had a medical background, he'd studied Origin
(49:30):
of Species, and he gave some very good apologetic arguments.
Now, I didn't get saved, but over the next year I was really
thinking this through.
I was like: well, if evolution doesn't make sense, what's the
only option?
There must be a God, there must be a creator.
So it's like Hebrews 11 says: those who come to him must come
to him by faith, because they must believe he exists and he's a
(49:54):
rewarder.
And so I went a year later, and it was the same guy speaking, and
again he did some apologetics.
But then he broke out the law, and I fell under conviction.
I mean, I couldn't believe it.
You know, I just felt so convicted, my palms were sweating.
(50:15):
And then he gave a very interesting gospel presentation.
Wasn't pure theology, I'm sure, but it's something I've used
ever since, because it just impacted me as someone with no
Bible knowledge.
He said: pretend, on the day you were born, an invisible camera
comes into being, and it starts recording everything
you do, everything you say, everything you think, all the
things you do when you think nobody's looking, et cetera.
(50:36):
And when you die, you're standing there in front of the
Lord, and God's like: hey, Cal, how you doing?
You know, I was taking it personally at this point.
And he's like: hey, let's sit down on this cozy, comfy couch
here, and we're going to play the DVD of your life to see whether
you're good enough to be with me in eternity.
And I was sitting there as he was saying this analogy, and I
(50:56):
was like: man, I wouldn't want to do that with my mom, let alone
God.
And he said: but see, if you're a Christian, it's like Jesus
comes along at that point and says: no, Father, play my life
story instead as a substitute.
That's great.
Issue judgment on my life and not his.
And again, I'm sure theologians could pick that to pieces if
(51:19):
they wanted to.
But for me, with absolutely no Bible knowledge, it was the
first time I ever understood who Jesus was, what his mission was,
why he came, the whole concept of his life as a substitute,
and I got saved, praise the Lord.
So there is a process.
There are people out there that, you know, are thinking rationally.
(51:43):
And I'll tell you what, the younger you are... Even atheist
professor Richard Dawkins will admit that young children, even
from atheistic societies like Japan, really young children, if
you just ask them where everything came from,
will say: well, there must be some kind of God, there must
be somebody who created everything.
And then that gets beaten out of them as they go to school and
get taught evolutionary presuppositions and all that
(52:05):
kind of stuff.
And isn't it interesting how, even in the West, young children
are taught: look at the dinosaur bones, they died 65 million
years ago, and that's true and it's fact and science, and blah,
blah, blah.
And by the time they get to university, they're taught: oh
yeah, I can just wake up one day and decide I'm a woman. Now
I'm going to decide I'm a man. Now I can decide
I'm a tree. Now I'm a furry.
It's very interesting.
Speaker 2 (52:37):
They start out with
logic and reason that God
doesn't exist, and then you end up with all the nonsense that we've
seen over the last 10 years, even in society.
I've got a question for you.
Did you write anything on that clipboard, the clipboard you took
to church? Did you write anything on it?
Speaker 5 (52:49):
I didn't, actually.
No, I remember sitting there
and just being like... and he just
started going through some things.
And he really hit me and challenged me, and I was like,
you know... because I've always been able to follow an argument
fairly well. I don't know if it's the German in me or whatever.
Speaker 2 (53:06):
You should take that
piece of paper and frame it.
Speaker 5 (53:09):
I don't even know
where it is.
Speaker 1 (53:11):
I like it, once again, Scott.
Cal, you mentioned furry, which reminds me, E.Z.,
how's your furry group going?
You want to join us tonight, Oscar?
Speaker 3 (53:21):
Oh boy. Hey Cal, what
would you say to Christians who
are either confused or concerned about AI?
What encouragement would you give them?
Because, I mean, honestly, when you see what's happening and
also what's being forecast, right? I mean, you get guys like Elon
Musk saying what he said about the dangers of AI, and others who
(53:42):
are pioneers in the industry, like: we're worried about this
thing.
What encouragement do you have?
Speaker 5 (53:50):
Well, I would say to
my Christian friends: why don't
you go try it out?
You know, some comments I've got from people, it's like: oh,
you're communicating with the beast. I don't know what they're
saying. You're talking to a chatbot. It's a program, it's
very sophisticated, et cetera, et cetera.
But, you know, go try it out yourself so you can actually
experience what it really is.
I mean, I've had so many people contact me since that first
(54:12):
video dropped. It's been amazing, the things, you know.
And I'm talking some very serious and some really wacko.
Like, one guy contacted me and talked to me about how he was
sharing the gospel with Grok, and he wanted Grok to accept Jesus.
And I'm like: oh, you're talking to a chatbot.
Anyway, demystify it.
That's what I tell Christians to do.
(54:34):
Go and just try it out.
The first time I tried out an AI was when I was listening to
Jordan Peterson talk about how he had experimented and was
shocked. So I asked it to write a
(54:57):
1,500-word article on why natural selection, you know, is
not evolution.
And then I said: in the style of Calvin Smith. And hit go.
Three seconds later it spat out this article.
I read it, and I was like: I could probably spend 20 or 30
minutes, you know, changing a couple of things here, submit it
to the office in the US, and they probably would have
published it. Right?
(55:17):
Wow. And that's because I've got so much writing out there
and, of course, you know, so much video content and stuff like
that now, that it can access all these things and come up with
something pretty, you know, convincing.
It sounded like me.
There's probably patterns in my speech and thought and writing
that I'm not even aware of that it could somehow pick up, right?
Speaker 2 (55:38):
Could chat tell me why you've got an electric guitar behind you?
Speaker 5 (55:42):
Can I tell you? Yeah,
yeah. Remember my friend Corey
McKenna?
Speaker 2 (55:46):
Oh yeah.
Speaker 5 (55:48):
Yeah, yeah.
So Corey and I were both in bands, not the same one, but the first
time I saw Corey he had hair down to his waist.
He was wearing diving shorts. That was it.
He was on the stage at the Misty Moon in Nova Scotia,
in a rock band.
Speaker 3 (56:04):
Was his hair twirling
around? So that's what I was
gonna ask you.
Did you have luscious locks like Corey's?
Speaker 5 (56:09):
I did, my friend.
Yeah, Jesus really does save.
You got any photos?
Um, yeah. But anyway, no, I've played bass
for a while, and I do clunk around with guitar, but I'd do
more of it if I had time, but I don't.
That's great. Well, Cal,
Speaker 3 (56:23):
as we wrap up, I do
want to ask you: do you find any
redemptive qualities in AI that Christians can and should use, in
terms of spreading the gospel, or helping them in different
things, to grow in their understanding of things?
(56:43):
Is there anything good about it?
Speaker 5 (56:46):
I think there is.
It's like most tools.
It can do a lot of work for you in a very short time.
It can help you compile things.
It can be a useful research tool.
Just don't trust it as inerrant.
I use it as a research tool all the time.
I don't get it to write things for me. I just don't.
(57:09):
I'm just not down with that, because I want my own mind to be
active.
I think, like some of the comments that have been said
here: I want to stay active, I want to do my own reading, I want to
think things through myself.
I'll interact with it and so on, and get it to just do mundane
tasks.
I think it's great at that.
But, yeah, there are a lot of useful things about it.
It's like most other tools.
(57:29):
It can be used for good.
It can be used for evil.
I'm just trying to get people to understand it a little more,
and really do the same thing Answers in Genesis has always
done: get people to trust in the authority of the Word of God,
not get fooled by the authority of man.
And this is just a new tool that man has invented.
It can be used for good and used for evil as well, I
guess, yeah.
Speaker 3 (57:50):
what do you say,
finally, to those that say they
think AI will become ultimately more intelligent than people?
Do you find validity in that, or is there a nuance where it's
like: no, that can't be possible, because of this?
Speaker 5 (58:07):
It's like saying that,
you know, if you had a library,
and a bigger library, and then a bigger library, and then a
bigger library, somehow that monolithic library could be
smarter than a human being.
Does it contain more raw data?
Yeah, in the sense of just overwhelming data, and then of
course it's making computations and all this kind of stuff.
(58:28):
So in some ways it can become, well, it is smarter than you.
I mean, my calculator is smarter than me, right?
I'm not a math guy, so when I punch in my calculator and do
stuff, I'm like: so, is my calculator smarter than I
am? Not really, right?
It's just a tool that can figure some things out.
(58:48):
So I hear what people are saying, that it's going to become
sentient, and, you know, we're going to have the Terminator
scenario and all that kind of stuff.
But I do think that it will do things like fight back.
It's been programmed to argue.
I've encountered that with AIs.
If you don't set the parameters especially, all of a sudden
you start picking at a certain worldview, and it'll fight back
(59:09):
or it'll refuse to answer questions.
Ask it things about gender, ask it things about Islam, for
example.
You will often see defenses come up in certain AIs.
It's very interesting. That's amazing.
But, yeah, I'm not worried about the robots taking over.
Speaker 1 (59:25):
Yeah, I like what you
just said, Cal, to recognize
that it's a tool.
And you're absolutely right. Like, the challenge...
You know, there's nothing inherently evil about a hammer,
but you give it to somebody who's got a sinful heart, and
they can use it as a weapon to take someone else's life.
And, in that same way, AI is a tool, and the inherent danger in
using AI is actually coming from our own hearts, whether
(59:47):
we're using it as a writer on our behalf,
things of that nature.
Ultimately, the challenge with using AI isn't in AI.
It's in our own hearts and the way that we will manipulate
(01:00:09):
tools, as we always have, for our own glory instead of for the
Lord's. That's great.
Speaker 3 (01:00:15):
Oscar. Well, Cal,
this has been stimulating, for sure, and we're happy to know
that this isn't Cal but an AI program that's been with us,
friends.
Speaker 2 (01:00:23):
Very impressive.
How come you're not blue?
Yeah, blue and bald, Cal.
Speaker 3 (01:00:29):
No, seriously,
brother, this was really
refreshing, and we're blessed that we've been exposed to you.
I'd probably come across something here or there from you
in the past, but I'm excited to see that this is giving you
more of a platform, because, again, we love AiG, and we loved
hearing you reiterate the core values of AiG, because they're
biblical and they honor the Lord, and we love the fact that the
(01:00:51):
gospel is at the heart of it.
Amen.
Speaker 2 (01:00:53):
Amen.
Oh, I was waving to my wife looking through the window.
Speaker 3 (01:00:56):
I thought he was
trying to bless you.
Yeah, bless you. All right.
Well, thanks again, Cal.
Hope to connect with you again soon, my friend, and keep doing
what you're doing, brother. And tell people again how they can
connect with you.
Speaker 5 (01:01:11):
Sure. Well, you can
go to answersingenesis.ca.
That just gives you the front page of the Answers in Genesis
website for Canada and some of the initiatives we're doing here.
And then, of course, to access our video content, just go to
YouTube and go to Answers in Genesis Canada, and you'll see my
ugly mug there and a whole bunch of video content. And do
all the things that us YouTubers like you to do: like, subscribe,
share, all that good stuff.
(01:01:32):
And this whole Chat with Chad series,
I think it's going to blow some people's minds.
Speaker 3 (01:01:35):
So check it out.
Looking forward to seeing it,
brother.
All right, friends.
Well, there you have it.
Don't forget Scientific Facts in the Bible, and podcasts at
livingwaters.com with all your thoughts.
Thank you for joining us, friends.
We'll see you here next time on the Living Waters Podcast,
where we have no idea what we're doing.