Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome to Amplify, the Chesapeake Public Schools podcast.
Speaker 2 (00:13):
Chesapeake Public Schools is located in the Hampton Roads region of southeastern Virginia. We proudly serve over 40,000 students in 45 schools and three centers. Join us as we share the stories behind our story by celebrating the people and programs that make us one of the premier school districts in Virginia.
Speaker 1 (00:33):
Hey listeners, this is Matt Graham, and I am here with Chris.
Speaker 2 (00:36):
Vail.
Speaker 1 (00:37):
And we have yet another hot topic to discuss today, right, Chris?
Speaker 2 (00:41):
Yeah, we're going to dive into artificial intelligence, AI, in the classroom.
Speaker 1 (00:46):
In previous episodes, a lot of teachers and a lot of administrators have mentioned how much technology has changed over the course of their years in education, and right now AI is at the forefront.
Speaker 2 (00:59):
Yep. Imagine, Matt, when you and I started teaching together at Great Bridge Middle. We were excited if we had a whiteboard in the classroom and an overhead projector. Now we're talking about AI.
Speaker 1 (01:10):
I know, man. Times change. It's so much of a hot topic that, even while we're making this episode, our governor announced an artificial intelligence task force that is going to meet to discuss policy standards, IT standards and education guidelines.
(01:33):
We just recently spoke with our Chief Technology Innovation Officer, Dr. Jeffrey Faust, on this topic and, big picture, what we're doing as a district to start to incorporate AI, its best practices and so forth.
Speaker 2 (01:44):
Hey, Dr. Faust not only brings great hair into our studio, but he's our expert for Chesapeake Public Schools on AI.
Speaker 1 (01:51):
Absolutely. Today we have with us our Chief Technology Innovation Officer, Dr. Jeffrey Faust, who has been with us since 2020, right? That's correct. So welcome to the podcast. We are happy you're here.
Speaker 3 (02:08):
I'm glad to be here.
Speaker 1 (02:09):
I've been looking forward to this, genuinely. Well, today is all about AI, artificial intelligence, and who better to help drive this conversation than yourself?
Speaker 3 (02:18):
Well, I can think of a lot of people, but they were probably busy winning Nobel Prizes and other things that are happening right now.
Speaker 1 (02:24):
Correct. Earlier this month the Nobel Prize in Chemistry, I believe, was awarded to three individuals for work on predicting protein structure. I think that's right. What I was reading on that was that it takes, like, years to navigate this stuff, and they built an AI program to help get that done, obviously a lot faster.
Speaker 3 (02:42):
It is a buzz. You know, what AI offers for medical sciences is something I think there's a lot of optimism around. Right.
Speaker 1 (02:50):
For those of you that don't know, you've been with us since 2020, correct? Right. So tell us a little bit about yourself, your background, how you got here. Sure.
Speaker 3 (02:57):
So you know, I'm not a Virginia native, and I own that, but I have lived in Virginia longer than anywhere else. So what happened for me is, out of college, I graduated from IUP and was going to be a teacher. I come from a family with many teachers and many educators, and what was going on in Pennsylvania at the time was contraction, economic contraction. Jobs were tough.
(03:20):
And I had a principal of a school that I interviewed with say, hey, listen, I'm going to tell you what I wish somebody would have told me when I started my career, and that was: you're going to be a phenomenal teacher. I know you're a phenomenal teacher. There's no way I'm ever going to be able to hire you in my school district, because you're applying against a thousand other applicants, most of whom have been doing this exact job for years. Wow.
(03:44):
So he was like, you want my advice? Go somewhere where they need teachers. And I said, sure. I was pretty adventurous, young, full of lots of energy and optimism, and found Virginia. So I ended up in Culpeper. I was a teacher in Culpeper, Virginia, for several years. I moved to Fairfax, where I was a teacher and then became a tech support specialist in Fairfax, and then I left education. Okay, so funny thing.
(04:07):
So during my time, while I was, you know, teaching and working as a tech specialist in Fairfax, I discovered technology. And when I say discovered, obviously technology already existed. But working in graduate school at UVA, Wahoo, there was a...
Speaker 1 (04:23):
I'm sorry, we have some Tech people in this room.
Speaker 3 (04:26):
Hey, listen, I am a Hokie-Hoo, so I have a degree from Tech and I have a degree from UVA.
Speaker 1 (04:32):
There you go.
Speaker 3 (04:33):
So I proudly wear the maroon and orange, and also the orange and blue. Okay, that's acceptable. Yeah, and there was work that I was doing at UVA that required me to actually get into software development. I got into coding, and it was the first time where I built something, a technology product, that wasn't just, I'm going to use a piece of technology to do what it does.
(04:53):
It was, I'm going to build a piece of technology to do what I need it to do, because there's nothing else out there to do it. And that changed my whole worldview. I built some relationships in Charlottesville and ended up working for a technology company in Charlottesville, an LMS company specializing in medical CME and targeting doctors and medical professionals who need continuous training.
(05:14):
So it was still education, but it was the technology side of education, and specifically adult learners, and I did that for the better part of six or seven years. Eventually, somebody came to me and said, hey, there's this job opening in the city schools in Charlottesville, you need to apply. And this was a friend of mine; our daughters swam together. And I was like, nah, you know, I don't think so. I think I got out of education, and I don't think that that's where I need to be.
(05:34):
And he's like, no, he's like, you know, we need you, we need your vision, we need you. And I said, you know, okay, why not? So I threw my hat in the ring and, long story short, whatever it is now, 14 years later, I came back into education, but in a technology role and in technology leadership, for Charlottesville for eight years and now here in Chesapeake for four-plus years, and along the way I think I've learned a few things.
(05:55):
But yeah, so it's a strange path, not what anybody would call the traditional or prescribed path to how you get into this seat and into this position, but I think it's providing me with a perspective that makes my worldview, my view of education, my view of technology, fairly unique.
Speaker 1 (06:12):
Yeah, that's great. Well, we're happy to have you, and today's episode is all about AI. At one of the superintendent's community engagement council meetings, I believe last year, you spoke about sort of the history of AI, how it is moving, what direction it's going and how we're going to incorporate that into our schools and learning.
(06:34):
So do you mind giving a brief background on some of that information that you shared at the community engagement council?
Speaker 3 (06:42):
Sure. So I think the place we want to start is this: I'm going to equate it to cloud. If you remember, four or five years ago, everybody was saying cloud, cloud, cloud. And so I think one of the first things we want to do is tear down some of the loaded aspects of the term AI, and I think we do need to think about it as just being technology and a way to augment human capabilities.
(07:07):
Obviously AI has become the moniker that we're all familiar with and that we use out there when we're talking to friends and we're hearing the news stories and everything else. But the truth is, it's the next iteration of technology that's designed to support the work that we do. And so, to me, the first thing is sort of going, okay, well, let's relax, because we're not talking about I, Robot, we're not talking about mechanized warfare, the Terminator. Right.
(07:28):
But I do think for some people, AI invokes that. What I want to think about, though, is that it's not new. We've just come to accept this newer application of technological systems, one that is providing opportunities to us that we weren't able to have before, and we're lumping that all into this group of AI. So we have to talk about what kinds of AI there are.
(07:51):
You know, there are neural networks, and there's generative AI, which is the one we're most often talking about and the one that I think has the most implications for us. And some people don't even use the word AI; some people are just calling everything ChatGPT. The idea here is that we have these tools that are enhancing our abilities. They're not necessarily new, but I think widespread usage and adoption of them, and the rapidity with which they're advancing, is absolutely new.
(08:15):
Two years ago, nobody was talking about ChatGPT. Now everybody's talking about ChatGPT, and that's just one of many, many publicly available tools. And so to me, the landscape right now is uncertain, but full of optimism.
Speaker 1 (08:35):
It's like the wild, wild west with AI a little bit.
Speaker 3 (08:38):
That's fair. I think one thing I want to give some credit and credence to is, I hope as a society we learned a little something from the social media trajectory. Social media came out, we all embraced it. We were putting it everywhere, we were spending all our time on it.
(08:59):
And now here we are, 20 years later, or 15 years later, going, whoa, maybe that wasn't all good. Maybe some of this social media fanaticism actually has been destructive towards society at large. What's great to see right now is that some of the conversations that we probably should have had around social media are in fact happening around AI: ethical usage, bias, exposure to things that are untrue, misinformation, disinformation campaigns and AI's role in that.
(09:20):
And we see, you know, even Congress and politicians in Washington taking an interest and asking for people to come and speak about this already, which is great, because it took 10 years into the social media adoption before those conversations were being had.
Speaker 1 (09:38):
You brought up a good thing. You were saying how, kind of globally, we're addressing the ethical use of AI. What are we doing as a district to help navigate that?
Speaker 3 (09:47):
Yeah, so our start has been, I think, cautious; that's probably the right word to use. Last spring we formed a committee called the Disruptive Technology Committee, and we have teachers and administrators and community members all represented on this committee. Where we started was, let's talk about what is and what isn't, let's have conversation.
(10:09):
From that we spun out some small pilots: hey, let's check out this tool. And so what we've been doing is having healthy conversation around it. And then one of the things that we were able to do this year for our teachers was to enable them to utilize AI that's built into our productivity suite here in Chesapeake, so that they can begin to ask it for supplementary help and support.
(10:31):
To me, everybody knows that our teachers are overwhelmed. Everybody knows that we keep on asking more of them in spite of knowing that they're overwhelmed. And one of my favorite things around AI that we're really encouraging for our teachers: teachers spend, in some of the studies that I've seen, an enormous amount of time just searching the web, and let's all agree that search is broken, right?
(10:53):
When I search, if the first page is all sponsored... It's all sponsored, again, paid, right, no doubt. Exactly. That's not real search. That's me pretending to participate in marketing, unwittingly participating in marketing in some cases. But if, instead, what I really need is, like, three questions about a topic, appropriate for a third grader, in both English and Spanish, and I can get those three formative questions from an AI product and platform, and I don't have to search the web for other teachers who have written them, that's a huge thing, because a teacher can get those in 30 seconds.
(11:15):
So where we're starting is with cautious piloting and trying and trialing, and we have made Gemini available. And again, that's not because it's better or worse or whatever, but the fact is that we've embraced our relationship with Google.
(11:36):
Google Workspace is our productivity suite, our students are on Chromebooks and our teachers are on Chromebooks, so the Gemini tool being integrated with our platform and our ecosystem makes a ton of sense. And so teachers being able to say, wow, I can save myself time, make myself more efficient, do some analysis, get some feedback, is a great place for us to start.
(12:01):
And getting them comfortable with: hey, this is a personal assistant built into your device, built into our resources, that you can leverage to help you be more efficient in some of the work that we know is taking teachers hours and hours every week.
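For context, here is a minimal, purely illustrative sketch of the kind of request Dr. Faust describes: three formative questions on a topic, at a third-grade reading level, in both English and Spanish. It uses Google's generative AI Python SDK; the package, model name, API key and topic below are assumptions for illustration only, since teachers in the district work with the Gemini assistant built into Google Workspace rather than calling an API directly.

    # Illustrative sketch only: a teacher-style prompt sent to Gemini through the
    # google-generativeai Python SDK. The key, model name and topic are placeholders,
    # not Chesapeake Public Schools' actual configuration.
    import google.generativeai as genai

    genai.configure(api_key="YOUR_API_KEY")  # hypothetical key, for the sketch
    model = genai.GenerativeModel("gemini-1.5-flash")  # assumed model name

    prompt = (
        "Write three formative check-for-understanding questions about the water cycle, "
        "at a third-grade reading level. Give each question in English and in Spanish."
    )

    response = model.generate_content(prompt)
    print(response.text)  # the three bilingual questions, ready in seconds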
Speaker 1 (12:10):
So obviously, productivity is one of those things that we're excited about with using it for the teachers. At the same time, what is being done to help sort of train those teachers in that use, so that they can implement it in the classroom and then also just for themselves?
Speaker 3 (12:26):
Yeah, so this year, during pre-conference, our weeks before the school year, we had a number of training sessions for our teachers. We've also tried to get representation across our technology innovation coaches, who are present in all of our schools, helping them to understand what's available, helping them to understand how to work with their teachers, and to say to teachers, it's okay to ask, you know, Gemini to provide you with some questions.
(12:49):
It's okay to ask Gemini to look at something that you've written or used before and ask it to reword it for you, or to take it down to a third-grade reading level instead of an eighth-grade reading level. Those are all very, very useful tools.
(13:10):
So, between our technology innovation coaches, as well as the trainings that we've offered in the weeks leading up to school, both for administrators but also for teachers, and then the ongoing work of the Disruptive and Emerging Technologies Committee, that's how we're approaching it right now. The other thing we're doing is we are keeping our eyes open to partners that we work with.
(13:30):
So one of the questions is, do you need a new product? Do you need to go out and buy one of these boutique products from a company that is selling something to schools, the school AI that is yet another product, yet another platform? Or do we want to look to our big partners and vendors, the Microsofts of the world, the AWSs, the Geminis? For us, Instructure is a huge partner with our LMS.
(13:51):
They're all pursuing and have AI strategies that we're watching very closely, to see how those get integrated into tools that we're already using. One of the most recent articles that I read suggested that 75%, or better, of the current existing AI companies will be out of business in less than a year. Like, this is the cycle; the cycle is really fast.
(14:13):
So you've got these AI companies. They stand up, and if we buy a product and then that company is out of business in nine months, and we've spent time investing in training, and then that product's just gone, they're out of there, right? I think that's one of the things that we also do need to avoid.
Speaker 1 (14:24):
Okay, something that comes up. I believe... I'm a parent, you're a parent, yeah, and there's definitely going to be some challenges and there's probably some concerns. I know one of the biggest, as a parent and someone using technology on a daily basis, is data privacy.
(14:46):
What are we doing to help protect our students and teachers, our staff, with data privacy, and how does that interplay with AI?
Speaker 3 (14:48):
Yeah, so it interplays with AI the same way it would interplay with any other technology system. Going back to what I mentioned, one of the reasons Gemini is convenient for us is that it's integrated with our ecosystem. The other thing is that it's covered under the agreement that we have with Google, so one of the promises we have from Google is that they're not going to train their AI off of the data that we've entered.
Speaker 1 (15:08):
Okay, that's good to know as an education customer.
Speaker 3 (15:17):
We're very, very cognizant and aware of those concerns, and that's one of the reasons why we would not just go open up product X, the next greatest AI, if they wouldn't offer us the same guarantee, which says: when we type in a question, or when we type in an idea, or when we upload a document to them, they're going to keep that as our data and not their data, and they're going to let us use it, but they're not going to use it. If they're not going to promise us that, then that wouldn't be a good partner for us.
(15:37):
So yeah, that's one of the promises we have. The other one, with student privacy, gets really, really interesting, and I think this is one of the areas where I see some overlap with the cell phone policy. Part of the problem with cell phones and apps is the current EULA, E-U-L-A, End User License Agreement. It says you have to be 13, but there are no checks that say, are you actually 13?
(16:01):
And so one of our jobs is to read through the agreements and read through the privacy policies and practices of the company. So we read through that and we make a determination as to whether or not they're adhering to the Student Privacy Pledge, Project Unicorn, some of the standards that are out there from IMS Global, now 1EdTech. It's front of mind, for sure.
(16:22):
It's front of mind for us to think about what's happening to the data that we're putting in, and the reason we're careful about which ones we encourage and/or enable for staff is that we want to be assured of those kinds of things before we would roll it out any more widespread than that.
Speaker 1 (16:42):
Gotcha. In the classroom, it's very simple for someone to type up maybe something through chat... Well, I don't know if ChatGPT is open yet in classrooms.
Speaker 3 (16:51):
So I mean, here's the thing: if we talk about our devices and our network and students, no. But at the same time, like, if a student goes home, they're on their device, exactly. Right.
Speaker 1 (17:04):
And they write up an essay. That's right. How do we know if it's their work or the AI's work, or how does that even come into the conversation with education?
Speaker 3 (17:15):
This probably is one of the most challenging topics right now. A traditional mindset, traditional perspectives, would say they're cheating. Okay, what if they're not cheating? What if they're just using the tools that they have in front of them? And I'm going to give you a good example of this. When I was in high school, graphing calculators were just coming out.
When I was in high school,graphing calculators were just
coming out.
Speaker 1 (17:34):
Right, this was not
the TI-81 or 82,.
Whatever that costs like an armand a leg, that's right.
Speaker 3 (17:42):
Back then, I remember my family literally being really concerned about that requirement. I had a teacher in high school who required us to learn how to use a slide rule, because calculators were cheating. I know that teacher was genuinely doing what he thought we needed and was best for us.
(18:03):
I can promise you that the last day of that unit when we used a slide rule was the last time I ever touched a slide rule, and that's, you know... I went on to earn a physics degree and never touched a slide rule again.
So to me that's a wonderful comparison for what is cheating. In a day and age when we have augmented tools to support my abilities and make me more able to do these things, is that cheating any more than using a wrench?
(18:24):
Is that cheating any more than using refrigeration to keep food from spoiling, or turning on the lights in a room rather than having to stoke a fire and light a bunch of candles? So it's a tough conversation, because we bring to that conversation our own biases and our own perspectives.
(18:47):
And if I ask an English teacher, a language arts teacher, those teachers value reading and writing at a level that nobody else does, for good reason, and they want everybody to be as adept and passionate about reading and writing as they are. So to them it's cheating. They're not wrong. But to a person who has struggled with, maybe even has, a learning disability around reading and writing, and they're now able to do something they weren't able to do before, none of us would call that cheating.
(19:07):
So I do think this is the crux of the conversation, and there's not one right answer to it. But I do think whenever something new comes out that makes our lives easier, there's a perception that somehow we're cheating by using it, and that's a natural human response. But if what I'm actually able to do is write 10 times more in half the time, then maybe that's really good.
And I'm going to put one more caveat on this: taking credit for somebody else's work is wrong. Okay. So one of the things we have to, I think, normalize is, if you're using AI to augment, enhance, create, add to the work that you're doing, we need to talk about what the proper treatment or citation for that is.
(19:50):
It's not that high school composition classes shouldn't use AI. But what we should do is say, well, if you're using AI, here's best practices, including citing which AI you used, what your prompt was, how much of that content you used.
(20:12):
If a student goes home and generates 12 pages of writing and just turns it in without in any way reviewing it, changing it or adding their own perspectives to it, then I would argue, yeah, that's taking credit for somebody else; you didn't do that.
Right, right, absolutely. And so these are the conversations I think we need to have. And I think, with AI, the reality that we face is that critical thinking, and those, I'll call them, uniquely human traits, are more important today than they've ever been, and will be more important tomorrow than they are today.
(20:34):
So, as education institutions, we need to go, okay, I want to make sure that I'm not focused on the mundane. If we talk about Bloom's taxonomy, you know, knowledge and understanding, we need that analysis and synthesis, we need the high-level stuff, because humans do that really well and machines don't, you know.
(20:54):
And as much as AI is getting better, it still doesn't do those tasks really well. So we want to challenge our learners to embrace those more difficult levels and really get into that synthesis, analysis, evaluation, high-level thinking, what we often collectively refer to as critical thinking.
(21:17):
It's more important now than ever, and it's going to continue to gain in importance.
Speaker 1 (21:24):
I wonder if people were having these same sorts of conversations when that TI calculator you mentioned was first being introduced. I wonder if this sort of discussion took place.
Speaker 3 (21:34):
I can promise you. There's a wonderful photograph from, I believe it was, the New York Times, of teachers picketing out in front of their schools: no calculators in class. It literally created protests. But more interestingly, let's go back to the turn of the century, not this century but the previous century, like the late 1800s, early 1900s. There were major, major concerns about introducing paper into school. That was technology that was disruptive at the time.
(21:55):
There's a moral panic aspect, but I do think it's important for us to work through that.
Speaker 1 (22:00):
So let's talk about the students. Why is it important for our students to start leveraging and using AI appropriately in our schools?
Speaker 3 (22:09):
Current students, current K-12 students, are going to be moving into a world where AI is augmenting professional pathways. What that looks like, I'm not a futurist, right?
Speaker 1 (22:21):
We don't know what sorts of jobs are going to be out there, right, but I can probably guarantee you it's going to incorporate AI. Absolutely.
Speaker 3 (22:28):
Kids are definitely going into a world with AI, and that AI is going to augment human capabilities. But here's one of the statistics that I've seen recently that got me excited.
(22:48):
And that is that augmentation of humans provides better efficiencies and profits to a company than replacement of humans. We saw that in some of the auto jobs, right? Machines were going to replace autoworkers, and then what we found out was that machines were just very good at getting the job done, but they weren't very good at assuring the job was done right.
(23:09):
So Detroit and other manufacturers around the world have gone, okay, well, yeah, we've got more robots than ever and, I would say, AI-driven, but at the same time, there are humans that work with the robots to make sure that things are in the right place, the joints are where they need to be and all that kind of stuff. So that augmentation, and the idea that kids are moving into a world where AI and skills with AI are important, is real.
(23:32):
So for our students, we, as Chesapeake Public Schools, are looking to make AI tools available to students. We also are being very purposeful about that, because we want to make sure that we choose tools that truly are beneficial to all students and enhance their experience and make them better learners and make them better citizens. For students, they also just have to accept the fact that it's here, it's real. They should be playing with it, genuinely.
(23:55):
And I think the last survey that I saw was, you know, 74% of kids admit to using AI. So we know they're using it, whether it's on our devices or on their personal devices; they acknowledge it. And I do think that is going to be a really critical skill.
(24:15):
Prompt engineering is a job that pays six figures, and there are not enough prompt engineers out there for the jobs that are open right now. And we know that large companies are saying, hey, in the next 10 years, one of the top five skills we're going to be looking for is the ability to work with AI for productivity.
Speaker 1 (24:29):
So it makes sense for us to not just have this discussion, but to figure out ways, like what you just said, to incorporate it in our schools that are purposeful, to help prepare our students for these future jobs.
Speaker 3 (24:46):
Yeah, and one of my favorite things is to amplify those things that are uniquely human and to enhance the humanity of the experience. And those are, to me, the overarching...
Speaker 1 (25:00):
Well, that's great to hear. As a former technology instructional coach, I'm into this stuff, so I could sit here and talk about this all day, but unfortunately we're going to have to go. We are going to speak with a teacher and a technology innovation coach on the next episode, and they're going to share how they're starting to incorporate AI in the classroom.
(25:24):
Thank you so much for coming in today to share where we're moving as a district with AI in our classrooms.
Speaker 3 (25:32):
Fantastic. Thank you. It has been a pleasure, and I'm glad you're talking to a teacher and a TIC as well, because their perspectives are going to be invaluable.
Speaker 1 (25:38):
Yeah, it's going to be great. Awesome. All right, thank you so much. We hope you enjoyed the stories behind our story on this episode of Amplify, the Chesapeake Public Schools podcast. Feel free to visit us at cpschools.com/amplified for any questions or comments, and make sure to follow us wherever you get your podcasts.