Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Luke (00:00):
Welcome back, John.
Great, hey, thank you.
Chuck (00:03):
It is good to have you
back on, John.
Luke (00:07):
Absolutely. So, what's been going on here? We've got, you know, it's May: May Day, the end of the school year, all those activities that come along with it. My namesake, Luke, graduated from Aurora University.
John (00:20):
Congrats to him.
Degree in business and marketing.
Chuck (00:25):
That's quite an
accomplishment, nice yeah.
John (00:27):
He doesn't have a job yet,
but you know?
Chuck (00:29):
Well, that's the next
step, right?
John (00:31):
Yeah, it's baby steps, that's what he said, and I said, well, let's make those baby steps a little bigger. Let's see what we can do there. For sure, let's try. It was fun, though. It's good to see that. You know, you're always proud of your kids when they reach those milestones. It's cool to go through that, and, you know, I got to spend some time with my other kids too, so it's
(00:53):
always good. It's good to get together, spend time, you know, release from work a little bit and get out the door. I missed a couple things here at school, but you know what, you guys did a great job of carrying the weight, absolutely, keeping it going.
Chuck (01:06):
Yeah, um, I saw a TikTok. Oh yes, you're a TikTok guy, I'm a TikTok guy. And I saw a TikTok that used AI to put Jordan, Michael Jordan, up against the stats of all the greats of today, and even in
(01:29):
this AI-generated type of story, he was head and shoulders above everyone else. Oh, 100%, 100%.
John (01:38):
There's no comparison.
Chuck (01:40):
I don't care what anybody says. They use the type of game that's played today, the different technology, type of things, when it comes to fitness.
John (01:51):
Well, if you think about it, if you watched any of that era when he was playing, and again I'll just bring up the Pistons, it was the Jordan rules: you go to the hole, he's getting hit, he's getting fouled. The way they touch somebody now, he would have gone off.
Chuck (02:09):
This thing said that even back then he went off getting tackled. Yeah, for sure. And this thing said that with the way they use space these days, you know how it's more of a flow game, yeah, space is broader.
John (02:20):
Whatever, they said that he would have absolutely, oh, he would have eaten them up. He would have torn it up.
It is kind of funny that you brought that up, because, as we segue in, the one thing of AI that I will watch, if I'm in a mindless mode where I just don't want to think, I just want to watch something and laugh, those little AI things crack me up.
(02:42):
I don't know if you've ever seen them, where it's an AI-generated thing where there's a car and it's going to crash into a tree at 20 miles an hour, at 50 miles an hour, at a hundred. Yeah, the one I saw the other day, it was a school bus versus a large speed bump, okay, at 10 miles an hour, and so this bus
(03:02):
goes over it and it scrapes the bottom, like the driveshaft, and then by the time it gets to like 80 miles an hour, here's this school bus plowing down the road, hits the thing and it hits the overpass on the highway. Oh my, I was laughing so hard. I'm like, first of all, seeing a bus going 150 miles an hour
(03:23):
just made me laugh by itself, for sure, but that shows you what mood I was in at the time. But some of that stuff with AI, it does make me laugh, just because it's funny.
But I think the direction we were heading today initially with AI was kind of a question that we get here, because of what we're involved with in education: is AI a tool that
(03:44):
supports students in their learning, or is it a crutch that holds them back? And I think it was something that I was talking about with somebody over the weekend. Is it cheating, is it not cheating? That's kind of the question. So I thought maybe we'd chew on that a little bit and see, because I know that we have people that listen that are
(04:06):
parents that probably have views about it, but we also have other educators that listen, or just people in general, that may not really quite understand and know what it is. But I think the biggest question that I had is that people are concerned about trying to limit student use of AI.
Chuck (04:25):
All right. Can you explain, for people who may not know, what is AI and how is it used among students?
John (04:35):
I think that's part of what the answer is, from what I've looked at: it depends on how you actually use it. All right, if students were using it, let's just say it's ChatGPT or something, and they're going to put in a prompt or a topic that the teacher gives them.
Luke (04:54):
Right.
John (04:55):
For a paper, an essay, whatever you want to call it, and AI generates something for them. And then they use it as their work. Okay, right, that's one thing, that's one way.
Luke (05:14):
Example way over here.
Yeah, okay.
John (05:16):
There's another example where a student might say, okay, here's this topic. What I'd like to see is research on XYZ, on this. Show me research about this. And it'd be just like doing an internet search. It's very similar. It's not doing much, except for the fact that the AI, the ChatGPT, is generating information for them.
(05:40):
So you might even look at that and say that was like using an encyclopedia back in the day, or a dictionary or a thesaurus, something where you're gaining more information somehow to then write this essay, right.
Another way that they would use it would be: I have this essay
(06:02):
that I wrote, they're going to put it in, and they're going to say, hey, could you check this for grammatical errors? Could you check this for better ways that I could write it? Could you check this to make it flow better? Whatever you want to do. I think that's another way, and there are more ways than that, I'm sure, but those are the three ways that I kind of focused on in this conversation that I had
(06:23):
over the weekend. But on the flip side of it, how are the teachers using it? Exactly, okay. So that's kind of where I wanted to jump in and see what you thought.
Luke (06:35):
So, John, what do you think about using AI in education?
I think it can be a wonderful tool. I think you have to learn how to use it, just like you would any other tool that you've got. I think that teachers can use it to make education more
(06:59):
individualized for students. Teachers can use it to help them maybe get some questions put together, or to check for someone's understanding, so there are all kinds of tools that teachers can use it for.
(07:21):
It seems like we're a little bit hesitant to use it as a tool. Look at the calculator. When it first came out, they're like, well, you're going to let kids use a calculator? Really? A calculator? No, they've got to think of it on their own.
(07:43):
They've got to do this mental math. They've got to use a slide rule. They've got to do it the way we did it, yeah. And we eventually get to a point where, you know what, it's okay.
John (07:57):
I think part of it is, and I'm not going to lie, I've fallen into that thought process before, where I don't know that we should use things, or invent things, that think for us, yeah, or that do the work for us. But, as we know, our society has completely changed toward
(08:20):
convenience. But I think with the technology, like you mentioned, the calculator, some of those things, I think people said, oh, we don't really want kids to do that, because when they get into the workforce they won't be able to do that. It's like, well, there's probably not a whole lot of jobs
(08:40):
that were involved in numbers that wouldn't allow some sort of computer use or calculator use for people to do work, because it would be faster and they wanted production, right, yeah. So I think I kind of fell along those lines for a while. As a teacher, I would always say, I want to see your work, right, I want to see how you got
(09:05):
there, which I do think is important, absolutely. They need to understand the process. But is it wrong to say, okay, if I know that you know how to do the process, okay, now you've graduated and you're able to use this, or whatever it is? It's some sort of a progression.
Chuck (09:17):
Yeah, is it even necessary to learn some of those things? Because you can Google that answer pretty quick. Like, if you're building a house and you need to know a cut, because you're building a 6/12 pitch and you need to know what the cut is for a truss or something, you can find that answer out pretty quick. Do we,
(09:39):
do we
Luke (09:39):
need to get your slide
rule out and look it up, or look
it up on the computer, right?
Yeah, that's what you're saying.
Sure, I think it's important toknow, yeah.
John (09:49):
Now you might not use that
in your day to day like as a
carpenter or something in thatregard, but I think in certain
ways there's benefits to knowingwhat got us there, wherever
we're at, whatever that is, ifit's a thought process, if it's
some sort of a project, if it'sa geometric equation, if it's a
(10:11):
financial somehow of an equationthat gets us to gross national
product or whatever it is, Ithink it's important to know
those things and the process aswe go, just because I think if
you're just working for theanswer at the end which I
realize that's kind of the endproduct I don't think that has
(10:32):
as much value as it does knowingwhat it took to get there.
I feel like we asked a lot more questions when I was in high school than kids do now. I think we sometimes have to pull those out of kids now. We have to lead them to that trough, get them to that point of query where they say, I'm going
(10:53):
to inquire about this. You're getting them to the point where they start to nibble at it, and then they want to take it to the next step and say, hey, how does this happen? Or hey, where does it go from there? And sometimes that's like pulling teeth. Yeah, it is, that's difficult. Yeah, especially if they know, like you said, they could Google something.
(11:14):
But I feel that it's taking away from the process of being able to say, hey, how does this happen? Or, I'm really interested in that, can you tell me how it works? Some of those things I think are important for students to know. So, yes, I would say I feel like we need to teach more of that, which is why, when we talked about the AI thing and
(11:37):
those three examples, I don't necessarily know, with the one example that I gave, where a student puts in a topic and they get something out, excuse me, and they claim it as their own, that that's ethical.
Luke (11:53):
That's probably not right.
John (11:55):
In the middle example, where it was, I'm going to use this and it's going to help me get some information, I'm going to search some research and then I can come to my own conclusions, write something up, right, I don't think that's cheating. That's research, just using a tool. That's smart.
Chuck (12:09):
Use your time.
John (12:10):
Okay, the other one, which was, here's what I wrote, check this for errors, or check this for, you know. It's kind of like a glorified spell check, if you want to say, as long as it doesn't change the content too much, which I think is something that we need. And we have tools for that, right? Teachers have tools to be able to put a research paper in and
(12:32):
say how much of this was generated by AI. They can give you a number. I don't know that I'm always sold on the numbers, because you can put it into two or three different models and it comes out different. But I think, as long as it's... and I don't know what that threshold number is, I don't know if I would say, is it?
Luke (12:52):
10%? 15%? 20%?
Chuck (12:55):
I heard somebody say today it's actually 20%.
John (12:58):
I mean, it just depends on what you're comfortable with, as far as what the school district would say or what people would say. I wouldn't say more than 20%. I mean, if it's more, that'd be like, 30 years ago, writing a fifth of the paper yourself and getting the rest from somebody else. Yeah, that doesn't fly.
(13:19):
I think, you know.
Chuck (13:21):
Somebody I was talking to today said, if you're citing some of those sources, it's actually probably not such a bad thing. Well, yeah, just like in a normal paper: if you're citing your source and you know where it's from and you did the research to do it.
John (13:35):
I mean, they directed you
to it, but you're actually
looking through it and pickingout the stuff that you think is
important.
It's not really any differentthan what we used to do.
Chuck (13:43):
I think we need to teach
our students more ways that they
can use AI effectively.
John (13:49):
I think the biggest thing that came up, and it goes to what you're saying, is that we're in a school system and we're trying to, I don't want to say discourage, but we hold kids accountable for using AI. They use it over a certain percentage, or the teacher looks
(14:11):
at it and gives them another chance and says, hey, this is really heavy AI, you need to redo it, put it in your own words, do your own thing. Mm-hmm. What message does it send when we use AI to grade the papers? If we're telling them not to use it, but I, as the teacher, am using it to grade their papers, and I'm not putting in the same amount of work to do it, what is that saying?
(14:32):
It's kind of like telling a kid to put his phone away while the teacher's on their phone the whole class period. Does that ever happen? I mean, not here, but I'm just saying in general. That would be the same thing. It's like telling your kid, hey, don't smoke, while you're sitting there chain-smoking in front of them.
Chuck (14:50):
Yeah.
John (14:51):
It's by your actions, right, and you see that, and it's difficult for high school kids to look past that.
Chuck (14:57):
Yeah.
John (14:57):
When you, as the educator or as the teacher in the room, are telling them they can't do something, but yet you are. Because that old saying, well, I can do it because I'm the adult, that doesn't fly, you know, and never did.
Luke (15:11):
No, you know, uh, reminds me of a quote: be the example
(15:35):
that you want them to be. [It's like if you gave an] assistant your paper and had that assistant look over that paper and maybe make some corrections, or look at it and add some content, but you still have to be that author. You have to be the person who's going to look at that content
(15:56):
and say, yeah, this is the way I would write it, or, I'm going to change it so that it is the way I would write it, rather than saying, oh, AI generated all this and I just gave it to you, and it really isn't anything, you know, I put in some ideas, but it gave me what it gave me.
(16:19):
I think you need to be the author, you need to be the person that is making that content. And I think that's where the first one that you talked about, where you just put in a topic and you get one back, and the second one, where you'd
(16:47):
put in and get research, I think that's fine. I think the third one, you have to look at it and say, well, how do I make this my own, rather than, I put in what I put in and I got back what I got back and I'm just going to turn that in. You still have to read through it and say, okay, this sounds like me, this doesn't sound like me.
John (17:08):
So I do think it's important, just like any other technology that comes down the line, that it's our responsibility as educators to teach students how to use it correctly, and that students know how to use it correctly, and what those boundaries are.
(17:29):
You know, what is it, if I was using this, what are the moral or ethically correct ways to use it, that follow down the right way or the wrong way. I think that, to me, is probably the biggest thing that comes out of the conversation for me: it's just like any other technology that would
(17:51):
come down the pipe for kids.
Chuck (17:53):
What about this little spin? How many people are afraid to introduce AI into the education system because we think it might replace some of our efforts in the classroom?
Luke (18:09):
Might replace teachers? Is that what you're saying?
John (18:12):
Maybe we're going down a rabbit hole here. We could, I mean, and that's a pretty good thought to go down and talk about. The one that I had is: what's one of the main things that people were worried about with AI? It's that AI starts to learn, right, and not
(18:33):
only does it learn, but it's taking whatever the majority of the thoughts are that are somehow put into it, and it's learning from that, and it goes a direction, it goes an opinion way, right. It's not just factually based. It starts to form opinions, okay, and a lot of that is based on what input it gets, right.
(18:53):
People worry about AI learning, and as it learns, right, it starts to influence people back as they're using it, because it directs them a certain way, the way that the AI has learned.
Chuck (19:08):
Isn't that what teachers
do in the classroom?
John (19:10):
Oh sure.
Chuck (19:10):
Anyways.
John (19:11):
But you know as well as I do that students don't always listen to teachers. Sure, you could tell them something as a teacher and they'd be like, eh, whatever. And then a kid tells them in the hallway 10 minutes later and they're like, oh, that's a great idea, because it's a peer. All right, in our age, in our generation, for this technological generation that's here, this
(19:38):
means more, and I'm pointing to my computer at the moment, sure, than what we're always saying. I mean, look at how hard it is already for them to say, oh, I saw it on the Internet, it's got to be true.
Luke (19:50):
Well, yeah, many, many people are getting their news off of TikTok or Instagram or that kind of thing, and some of it is AI. And what we need to start teaching kids is how to discern that too, and figure out what is good, what is not, and to think
(20:17):
on their own. And I think, as you said, we need to get to a point where we're teaching kids to think.
John (20:27):
And how many times have we heard the term fake news? Right, yeah. It happens all the time, it does, okay. Now you really have to be vigilant, okay, like, if you're really going to go look for your news, you're going to go through your stuff. I mean, there are trusted news sources that people get their news from, and a lot of people a little younger than us, through
(20:47):
our age and older, get it from certain sources, and then from a certain age, maybe it's like the mid-30s into the 40s, they get it more from other sources like TikTok or Instagram or wherever these other things are, where a lot of times you'll see it's individuals that aren't in a trusted news source, if you want to say, telling you the news.
(21:08):
I think it's crazy sometimes, because if you ever go down that road, like I've done before, where I'll just be, like today, a colleague told me that Sherrone Moore, the Michigan coach, got suspended for two games. Today. I hadn't heard. Now, I wasn't on my phone either. I was working, right, so I didn't know. I think it's one of those things where we need to be
(21:30):
careful, because some of that news, like one of you two said, some of it's AI-generated, and that's where that fear comes in. And again, I'm not promoting that fear. I'm saying that that is a fear that people have about AI: once it starts to think and it grows and it learns, who is
(21:52):
controlling that? Who's controlling its growth and what information it's disseminating to us?
Chuck (21:59):
One of the things that I react to more strongly than I normally would is fear-generated reactions, and that's kind of what I feel like some people have when it comes to AI: I'm afraid it's going to do A, B or
(22:21):
C. That's why I was saying it, yeah. And I kind of understand that a little bit, but at the same time, I don't want to be driven by fear. But I do understand, we've got to guard against this. So there's this fine line of how do we react. Is that something that can actually really happen?
(22:41):
I don't know.
John (22:43):
I think there's a difference, and what you said is true: there's one thing, to react to it or be driven by it, and another thing, to be aware of it. Yeah, okay. So if you started to see certain things that maybe brought that to light, like, hey, this wasn't happening before, or whatever you would see. I just think that it's important to be educated about
(23:06):
it, about possibilities. You're not driven by that. I mean, I'm not, yeah, but I know some people are, right. Some people live in that world.
Chuck (23:13):
Yeah.
Luke (23:20):
I don't live in that world
.
John (23:20):
Yeah, I don't live in that world, because if you did, you'd never get anything done, because you'd always be worried about what's happening around you and behind your back. I mean, forever. So I try not to, and I don't really have to try very hard to not live that way, because I think there's just too many things that I want to enjoy and accomplish, as opposed to sitting there waiting for the bad things to happen that I'm worried are going to happen.
Chuck (23:39):
Yeah, I do wonder. I was reading an article, um, I don't know, it was recently, about AI and which age groups it could actually affect more than others, and it was an article, I want to say it was in the Washington Post or one of those types of big papers,
(24:02):
national papers, and it said that the potential for it to affect people our age, particularly people who are professionals and who do technical jobs, is pretty realistic. Pretty high, pretty high, yep. And so,
(24:22):
they gave the example, for instance, of somebody who gets an x-ray done, and the doctor or the specialist is trying to work through all these different scenarios based on medical journals, pictures of their x-rays. That actually could be a job that could be
(24:45):
replaced by AI, because all it needs is information and pictures in order to make a diagnosis, and I thought that was fascinating.
Luke (24:53):
Yeah, there's a book, I think it's called Reimagine, that was talking about those jobs that we have done over time. Are they going away because of technology? And this was a while ago, it was before AI.
John (25:16):
I think technology is one thing. AI, which is actually built to learn and to expand itself, is a little different. Technology has replaced a lot of jobs. I mean, you can't even... I just laughed because I was in Illinois again and there's nobody at the toll booth anymore. But the technology can walk right in there and take care of
(25:39):
it, because they program it. But who are the guys that are programming it? The guys that really know what they're doing, right? That's where the problem comes in with the AI thing: who's programming it? We are, everybody is, because it's taking information from anybody that it gets it from, yeah. And do I want my broken femur to be, you
(26:02):
know, diagnosed from a bunch of information that some crazy high school kid was typing in one day about something?
Chuck (26:08):
I don't think it works
like that.
Luke (26:10):
I don't think so. I'm just saying, I think that it takes information from...
John (26:14):
From what I've understood, there are a lot of different sources. It's obviously not just one thing, sure, but there's information that's there, and at some point AI needs to make a decision as to what it's moving toward, because a lot of times, if you put something in, you'll get something out
(26:35):
that's complete gibberish. It makes no sense whatsoever. It's because it hasn't really gotten to that topic or that area of expertise that it thinks it can.
Luke (26:47):
I don't know if you guys watch medical shows. You have to refine it with the way that you put it in, yeah. You can't just put in this. I'm not sold on it, let me just put it that way.
Chuck (26:58):
I think it could potentially decrease the cost of health care, if something like that were to happen, because instead of paying a specialist, you know, $1,500 to read an x-ray, you got that done through AI. And the
(27:19):
way that this article explained it, it's fishing through dozens and dozens of medical journals that the specialist would probably be fishing through anyway, but it would probably take three times as long. I'm game. I'd be the guinea pig on that.
Luke (27:36):
Another way that it's affected jobs: legal research. Yeah, you don't have to spend all that time and have a legal library and all that. You can put your query in, oh
(27:59):
boy, see where it's going, you can put your query into AI and get that back. Now, as a lawyer or as a legal aide, you have to read it and look at it and say, yes, this case was this and that, you know. You still have to do the research, but AI is going to do
(28:23):
a lot of that for you.
John (28:24):
So there have been a couple of times where I've put something in and I've read what it spits out, and then I'll go and do it another way, not necessarily through Google, but I'll go to an online library research
(28:45):
clearinghouse, right, and look through some topics, and a lot of times they're very different, because it paraphrases a lot. So in some of these things, when you're looking for something specific, it's difficult, because it doesn't necessarily have the specific knowledge that you need, which you would definitely need in a medical sense. It can't be so
(29:08):
broadly general. It has to be very, very specific, because one small thing could change the entire diagnosis of anything. Really, you're dealing in very small units of whatever it would be. I think it's just something that, and we've done this before in our country, where things get going super fast and then we have to try to reel it back in,
(29:28):
and I think there are moments where we probably need to try to do that, because the technology piece, well, I shouldn't say still, it's always going to be a work in progress, okay. So in the areas that we think it might help the
(29:50):
most, I would think we would need to be a little careful, just to make sure that we don't let it go too far, right, without enhancing it somehow to make sure that it's right.
It sounds like a little bit of fear there. Oh, it's not fear, I think.
(30:11):
To me, it's reality. It's just like anything else. It's like, am I going to bring in? Let's just say that our principal left the building. Am I going to bring somebody in who's not had any experience as a principal?
Luke (30:28):
Probably not.
John (30:30):
So the idea of taking something and putting trust into something that you think might have had some experience with it, but you don't really know for sure, you're taking a risk, depending on what that is. I mean, if I'm doing it to write a paper, obviously there's not a big risk, except for me getting an...
Chuck (30:50):
F right.
John (30:51):
But if you're making a diagnosis, a medical diagnosis, or it's to build an airplane or build a better bomb or whatever it is, I mean, clearances and all these things, there's a lot of stuff. And I don't know if I said this before, but I don't like it. I mean, it's there, it's not like it's a problem, because it's not hurting anybody.
(31:11):
But I can't stand watching some movies anymore, because it's computer-generated.
Luke (31:17):
Yeah, it's not real.
It drives me nuts.
John (31:20):
But I watch it and it's like, no, that's just... and you know, and you've seen them, some are really good.
Chuck (31:26):
Oh yeah.
John (31:27):
And some are really bad, and it's like, man, you dumped billions of dollars on this movie.
Chuck (31:31):
Well, if you look at Star Wars. Yeah, if you look at...
Luke (31:35):
Star Wars, one of the last
Star Wars movies.
Chuck (31:36):
No, it was one of the
last Star Wars movies.
Princess Leia, what's her name?
Luke (31:41):
Carrie Fisher.
Chuck (31:42):
Carrie Fisher had died, right, and they actually did an AI-generated model of her to place her in this particular movie.
John (31:50):
Well, didn't they do that
with the guy from Fast and
Furious too, Paul Walker?
Chuck (31:54):
Yep, they did that too.
John (31:58):
Yeah, some of that stuff is a little creepy to me because, like Claymation, even that kind of wigs me out a little bit. Really? That's ancient, I know, but it always does kind of wig me out.
Luke (32:07):
But some of those movies,
yeah, I just watch some of that
stuff and it's like man, I wantto see some Three Stooges action
.
John (32:13):
I want to see some Three Stooges action. I want to see some real action. Nothing generated there, except, you know, ass-hattery, I would say, when it comes to AI.
Chuck (32:27):
I've always been pretty much an early adopter of technology throughout the years, and I have adopted AI pretty heavily, to the point where, when I use it, um, I'll have it summarize articles, I'll have it...
John (32:44):
Uh, give me suggestions of topics. But do you read the article first? Not always, no. So you're getting Cliff's Notes, pretty much.
Chuck (32:53):
But what I would do in the beginning is I would read the article and then have it summarized.
John (32:57):
Okay.
Luke (32:58):
So you built some trust.
John (32:59):
Yeah, I built some trust. Built some trust, yeah.
Chuck (33:01):
Yeah, and it was.
I mean it's been good for me.
Luke (33:05):
So you gave a talk at church. I don't know if you call them sermons. Yeah, we call them sermons. So you gave a sermon at church.
Chuck (33:16):
Did you use AI?
I did use AI for the purpose of summarizing different articles.
Yeah, that's pretty cool.
Luke (33:21):
I don't think there's anything wrong with using AI tools properly, in ways that enhance their learning rather than replace their learning.
(33:42):
And so I think this AI thing, it's going and it's going to continue, yeah, and we need to be willing and able to help people learn how to use it, and how to use it well, yeah. And I think, even beyond kids, people in their 30s, 40s
(34:05):
and 50s need to learn how to use AI so that they're not left behind and they're not useless when it comes to different things in the workforce. Hey, we might want to wrap it up here. I think we really came to some good conclusions with it too.
Chuck (34:22):
I don't know about any
conclusions.
I still have tons and tons ofquestions.
There's a lot of questions.
Luke (34:27):
But you know, I think
learning about it and not being
afraid of it are two greatthings.
Chuck (34:35):
Yeah, I think, not being afraid of it. Kind of going with that.
Luke (34:39):
Luke, you got a quote for
us.
John (34:41):
I got one.
Luke (34:43):
Is it AI generated?
John (34:44):
I don't want you to think that I'm all afraid of it. Some people worry that artificial intelligence will make us feel inferior, but then, anybody in his right mind should have an inferiority complex every time he looks at a flower.
Chuck (35:01):
Deep. Think about that one, chew on that a little bit. In my opinion, that is pointing us toward a creator.
John (35:09):
Yeah.
Chuck (35:10):
And every time you look at a flower, you should feel inferior, because you know there's a creator behind the flower.
Luke (35:26):
Absolutely, amen. That's good stuff. Absolutely, there you go. Well, hey, thanks for hanging out with us here at Half Century Hangout. We appreciate you listening to us, and make sure you like us wherever you listen to your podcasts. So happy Cinco de Mayo.
John (35:37):
Peace out.
Chuck (35:42):
Peace out.