Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:15):
Hi,
everyone, and welcome to another Voices of Literacy conversation. Today, my special guest is Dr. John Spencer, who is not only a personal friend but also a former middle school teacher and current full-time professor at George Fox University in Salem, Oregon. John teaches preservice teachers how to leverage educational technology for creativity,
(00:38):
design thinking, student engagement, and in all sorts of ways that make the world better. In addition to being a huge fan of libraries and a library advocate,
John is also a reader and creator.
He shares his vision for creative classrooms through his blog, spencerideas.org, and through his illustrated videos at sketchyvideos.com, which you will see throughout our semester and which I'm a huge fan of.
(01:05):
You can connect with him on Twitter or on Facebook.
Hey John,
thank you so much for being a part of this conversation.
I really appreciate it.
Thank you.
I'm excited about it.
I can't wait for my students to learn from you today in our conversation around AI. We are recording this in the spring of 2023, and I know for sure that you are getting a lot of questions about artificial intelligence, generative AI in particular, right now. I don't want to assume anything, but the conversations I'm mostly having, or the conversations that educators mostly want to have with me about this,
(01:44):
revolve around cheating and academic integrity, et cetera.
So I wanna start our conversation in a different place and ask you what you want to talk about in terms of AI,
because I feel like that's a very limited view of these tools and the impact they're about to have on our lives.
Yeah.
I mean, I would love to talk about what we mean by AI, because I think, really,
(02:09):
if we don't understand what AI is, then it becomes either really scary, or you miss the actual things that are scary, right?
So I'm really interested in the nature of AI, and then also what this means for everything from
(02:30):
literacy, to content creation, to the curation process, to its use as a learning tool.
I mean,
I think any of those things,
excite me, because I do think we need to get beyond the "how are students gonna use it for cheating and how do we catch them" mindset.
Absolutely.
So,
ok,
this is a great place for us to start. Break it down for my students.
(02:53):
What are you thinking about, and how would you describe AI to someone who's maybe heard the term but doesn't really know what it is?
So I think the biggest thing to remember is that this is not an AI revolution, it's an AI evolution. And so AI is simply any time that you are using algorithms —
(03:15):
steps, processes — any time you're using algorithms to facilitate the thinking that you as a human would do,
right?
So, for example — the term AI really developed back in the 1950s. I think we need to remember that AI is that old, and that AI is totally pervasive throughout our lives.
(03:44):
So I will give an example, and the example I'm gonna give is: let's imagine that there is a couple, right? I'll just have a little fun — they're married, and they're both teachers. They get up in the morning, the phone rings,
(04:06):
the alarms go off,
they grab their phone,
look at it,
open it up.
And she texts her mom — there's a quick autofill. They laugh because her mom texts her and autocorrect has swapped in the wrong word instead of "pandemic" — you know, "the pandemic has been hard on us," right? And she lets it go and wishes her mom
(04:26):
well. Her mom's gonna have a screening later on, and she's worried that it might be cancer.
Her husband gets up; he's checking his fantasy sports, gets the recommendations on what to do. They get up; the thermostat auto-adjusts in their house.
(04:47):
They walk to the fridge, they look at the temperature — it's gonna be pajama day at school. They have matching pajamas because they both got the same recommendation that popped up on Instagram, and they bought each other the same gift for their anniversary. They get the recommended curation on Spotify, and they're gonna do their
(05:10):
weekly dance party in the kitchen that they love to do. And it's 10 minutes into the day, and algorithms are everywhere.
So AI is the facial recognition that allows them to unlock the phone. It's the voice-to-text that she's using as she texts her mom.
(05:34):
It's the autocorrect on the text. It's the screening process — the AI that's gonna screen her mom for cancer. It's the spell check that we use on our computers every day. It's the algorithms that allow her to find the optimal price to fly home when she finds out that the diagnosis isn't good,
(06:00):
but it's also the autopilot on that flight. It's, you know, all of these things. It's the Nest thermostat that they have in their smart home. It's the weather app and the way we track it; it's our city planning. So the key thing to recognize is that AI is already everywhere.
And so, even in just the first 10 minutes — it's the supply chains and the efficiency of the supply chains.
(06:23):
It's the fact that they ordered something last night and it arrived via Amazon. You know, like, when I was a kid, we had ice cream trucks; when my dad was a kid, they had the milkman in the truck. And we have the Amazon truck, right? And it's everywhere. It's the recommended songs on Spotify and that curation process.
(06:48):
And so AI, I think we need to recognize, is any time that we are using algorithms to do the thinking for us. It's the recommended players on fantasy football. It's all of these things.
So, the evolution that we're talking about having right now —
(07:08):
why, how is that, I guess, more disruptive? Or is it more disruptive? Why are people so concerned about it? How is it different?
So I think the thing to remember about what's different is this: the current AI that we see with things like ChatGPT, or all of the different image generators that we've
(07:33):
seen — I mean, so many different types of generative AI — is that, for years, the algorithms have been pretty much preprogrammed. And I'm not doing a very good job of explaining it; it's a little more complicated, but they've been preprogrammed.
And now, what we have is: instead of programming it how to do a task,
(08:00):
you are programming it to learn how to do a task, right — in a way that mirrors the neural networks of our minds, by mining massive amounts of data.
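[Editor's note: to make the "programmed how" versus "programmed to learn" distinction concrete, here is a minimal, purely illustrative Python sketch. It is not from the conversation; the toy word-frequency model below just stands in for the large-scale pattern mining John describes.]

```python
# Illustrative toy example: programming *how* to do a task vs.
# programming a system to *learn* the task from data.

from collections import Counter

# 1) Pre-programmed approach: a human writes every rule by hand.
HARD_CODED_FIXES = {"teh": "the", "recieve": "receive"}

def rule_based_correct(word: str) -> str:
    """Apply only the corrections a programmer explicitly wrote down."""
    return HARD_CODED_FIXES.get(word, word)

# 2) "Learned" approach: behavior comes from data, not hand-written rules.
#    A word-frequency Counter is a stand-in for the massive pattern mining
#    that real generative models do -- and it inherits its data's biases.
def train(corpus: list[str]) -> Counter:
    return Counter(corpus)

def learned_correct(word: str, model: Counter, candidates: list[str]) -> str:
    """Guess the candidate the training data makes most likely."""
    return max(candidates, key=lambda c: model[c]) if candidates else word

model = train(["the", "the", "their", "there", "there", "receive"])
print(rule_based_correct("teh"))                            # -> the
print(learned_correct("thier", model, ["their", "there"]))  # -> there
```

The point of the sketch: the first function only ever does what a programmer wrote down, while the second function's behavior — and its mistakes — come from whatever data it was trained on.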
And so now, when we think about things like generative AI, it is different than other forms of AI in the past.
(08:20):
Autocorrect is different than a spell check, right? Because the pattern recognition system is different — spell check is much more objective. Autocorrect now introduces things like bias, right? Bias — you know, there are all of these sources of data that are themselves biased.
(08:41):
So it's introducing bias. And there's the potential of generative AI to lie — like you and I were talking about before: ask it to write your bio and see what it gets wrong about you.
Yeah.
And it's a lot because it wants to please you,
it wants to please you.
So — you know, obviously I'm
(09:01):
attributing human characteristics to it,
but it wants to complete the task for you.
And so its mission is to do that, and it will fill it in with whatever it can find that feels like it works to answer your prompt.
And I think it's really important to understand: if we think about generative AI, it can do creative work. And I think that's truly what's different — it can create something original, and that feels so different to us, and it is so different.
(09:30):
I think the key thing to remember is it's not thinking, right? Because thinking is affective; thinking gets tired. When you leave ChatGPT, it's not continuing to think, right? It's just doing nothing.
So it's really important that we remember that these are metaphors.
(09:51):
And if I can just nerd out on this for a second —
I love it when you nerd out.
It's my favorite thing.
I mean,
as a history nerd,
one of the things I am struck by is how powerful the metaphors are in shaping the way we think about things,
right?
(10:12):
So if we go back to this idea — conceptual metaphor theory — in any given conversation, there's an implied metaphor, right? So when people are talking and they say "the conversation's getting heated," "he shot down my argument," this and that — you know that a conversation has become an argument when you start using war metaphors in your language,
(10:34):
right?
It happens all the time.
So what's interesting is we now have these new metaphors that shape the way we view things, and it's really easy to miss how that is shaping our thinking about the technology, right?
So tell me more about that.
What I mean is this:
(10:54):
So if we think about AI — social media is AI, right? So Facebook is an AI. It's not just using AI — it is an AI, because it's using smart algorithms to filter content.
Social media itself has two conflicting metaphors — social and media — and it's a fusion of those two things. And in a way that — sorry if you can hear my dog moving in the background,
(11:23):
like, minor barking downstairs.
So if you think about it,
you have this fusion of social and media in terms of features. And so people will talk about Twitter: it's not what it used to be, it's changing as a place. They're treating it as a social network, as a community, as a place. But then when we say things like,
(11:44):
"I don't know how to use Facebook," "I don't know how to use Snapchat," we're treating it as a tool,
right?
And this fusion of tool and space is very weird. We have never applied metrics to relationships before.
Yeah,
I would never tell a joke, ask people how many of them like it — give me your reaction — count the thumbs up that I get,
(12:06):
walk into the next room and broadcast that to everyone.
Yeah,
but we base — I mean, and I say "we," I'm not just talking about us, I'm really talking about the young people we serve — we base so much of our self-worth on those metrics now. That fusion that you're talking about — I mean, I know I'm not the only one thinking about this,
(12:27):
but I've been thinking a lot about whether or not that relationship is healthy, and I think I know the answer — but then what do we do to make it healthier? And I know that we are veering off of AI —
no,
no,
no.
Because that is AI. I mean, you know, all the things — filter bubbles, fear of missing out — all these different components become huge,
(12:51):
right?
They are these aspects of social media. And we have never in the history of humanity applied numbers to friendships the way that we do, right? That is really odd.
(13:11):
We have never — how do I say it? — we've never essentially created skewed timelines that are based on relevance, right? It's always been this.
So for that reason,
we have this really strange thing that is changing the way we relate to each other. And that's important to recognize — that's an element of how AI is changing our world, in some good and bad ways,
(13:35):
both,
right?
And it's interesting — there used to be a tool. They shut it down; Facebook's API shut it down, I believe. And it was called the Facebook Demetricator. And literally what it did is: every single number was gone from Facebook. There were no likes.
Yeah.
(13:55):
I mean,
you could click "like," and they would get a like, but you couldn't count the likes. They wouldn't tell you when something was published. The numbers were gone, and it was a wildly different experience. And it literally changed my behavior when I used it, because I stopped liking things
(14:16):
based on social proof.
Right.
I was liking things that I liked, not liking things because they had lots of likes.
That's really interesting.
So when I think about AI — generative AI — I guess this is where I land. There are two metaphors that we're gonna be using for it in the upcoming years.
(14:37):
And one is treating it like a person — a virtual assistant, like Siri, like Clippy. Yeah, Clippy was the passive-aggressive one: "It looks like you're writing a letter."
(14:58):
But look, I know that when I'm getting a meme that has Clippy on it, I'm about to be taken down — like, I know that. My God.
So, you know, if you think about Alexa, Siri, whatever — we've already set the tone: our AI is a form of assistance, often gendered to be female,
(15:21):
right?
And that has some real negative consequences too, right? But we've already treated the AI as an assistant, so that's gonna continue to happen. And if you watch people interact with things like ChatGPT, they will treat it like a person.
(15:41):
"Thank you," things like that. And there's a good chance that certain prosocial behaviors will be built into newer forms of generative AI as well.
Right.
Right now, it has this massive amount of — for lack of a better term — intelligence, creative capacity, and speed. But it is essentially morally neutral and easy to trick.
(16:06):
So if you say, "Show me where to download illegal movies," it will say, "I can't do that." And then if you say, "Tell me what sites I should avoid if I don't want to—"
Yeah,
then it tells you all of it,
right?
I mean, it is so easy to game. And for that reason, you know, we'll probably see newer forms of AI that will be deliberately prosocial,
(16:32):
right?
To be empathetic, to listen, to be, in certain cases, a mandatory reporter — I mean, there are gonna be really interesting things that will be programmed into it.
And it'll be interesting to see — you know, with the culture wars that have already happened — what it will look like in terms of what exactly is prosocial.
(16:53):
Well, and of course, the monetization of it — you know, what will be boosted in order to lean towards revenue bias — all of that's gonna affect it. And what's gonna happen when it gives you advice that has sponsored content? I mean — all of that.
(17:13):
And so we have this idea that it's your assistant.
And then we also have this idea that it's a tool that we use.
And those two metaphors are gonna fuse, in the same way that we talk about social media as if it's both a space and a tool. And I don't think that we are ready yet as a society —
(17:35):
I don't think we've grappled with the notion that we will be using a tech that is both a person and a tool.
Yeah.
No,
I don't think we've figured out the metaphor.
I mean, I don't think we've grappled fully with the idea of a space and a tool. You know, we still — I don't think we've figured that out, never mind what's next with an assistant and a tool.
(17:59):
you know,
and to me, this is where I get excited — I mean, bringing it back to librarians, I think librarians are thinking hard about these things. Because, yes, who's answering this question of what it means to be human, what it means to be machine?
(18:20):
The answer is fiction — fiction, really great graphic novels that have done it. You know, there are so many books; there are 40 years of these discussions that have happened in really deep philosophical ways through story, right?
(18:40):
And that's where I think librarians, being curators of great stories, offer that opportunity to do that.
Does that make sense?
It absolutely does.
And,
you know,
I'm not gonna argue with any of that,
you know,
I mean,
obviously I'm a teen librarian, and I sometimes grapple with the idea that
(19:01):
it seems to always be up to the librarians to save the world.
But,
I do feel — and you and I have talked about this before — I feel so strongly about the idea that it's a librarian's job, and maybe it's everybody's job too, but for librarians in particular,
(19:21):
I feel like it's their role to make the world better by leveling the playing field. You know, libraries at their heart — whether it's an academic library, a public library, or a school library — are egalitarian. They provide equitable access to what, for me, are the most important resources on earth. You mentioned story already.
But information and resources,
of course,
(19:41):
connection,
right?
So for this class and for just my work in general,
I'm always thinking about how do we use these tools to further that mission?
How do we use these tools to make the world better by leveling the playing field for the most vulnerable among us, using them to help connect — to be conduits for our communities to these resources?
(20:03):
And I just wonder like,
what do you think,
how are these tools going to be most useful for my students as they become librarians?
Oh my gosh.
So lots of different ways.
Wait,
I'll take notes.
So one of the ways is purely about making information accessible — just from an accessibility standpoint,
(20:28):
it's huge.
So, for example, you can go online and do research. But if the text is written in passive voice, makes heavy use of the past perfect progressive verb tense, and has multiple clauses,
(20:50):
someone learning English is not gonna have the same access. So the ability to simply, you know, copy and paste the website, use it as a tool with just the text, and then say, "simplify this for this reading level, for this text complexity" — those prompts are going to be huge.
(21:11):
It's not so much that librarians will do that for students, but they will become people who know how to use the prompts well in chatbots,
right?
And that I think becomes huge.
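[Editor's note: for readers who want to see what that kind of "simplify this for this reading level" prompt could look like outside the chat window, here is a minimal hypothetical sketch. It assumes the OpenAI Python client and an OPENAI_API_KEY environment variable; the model name, function name, and example variable are placeholders, not anything mentioned in the episode.]

```python
# Hypothetical helper for the "simplify this for this reading level" prompt.
# Assumes: `pip install openai` and OPENAI_API_KEY set in the environment.

from openai import OpenAI

client = OpenAI()

def simplify(text: str, reading_level: str = "6th grade") -> str:
    """Ask a chat model to rewrite pasted text at a simpler reading level."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder -- use whatever model you have access to
        messages=[
            {
                "role": "system",
                "content": "Rewrite the user's text at the requested reading "
                           "level without changing its meaning.",
            },
            {
                "role": "user",
                "content": f"Rewrite this at a {reading_level} reading level:\n\n{text}",
            },
        ],
    )
    return response.choices[0].message.content

# Example: article_text would be the text a student copied from a website.
# print(simplify(article_text, reading_level="4th grade"))
```

In practice, a librarian could paste the page text into a chatbot and type the same prompt by hand; the sketch only automates that request.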
So accessing the information, summarizing information, obviously engaging in the information literacy piece — that is huge — helping them know when things are a deepfake, and things like that.
(21:40):
But really,
I think using it as a tool for clarifying conceptions — you know, misconceptions. I have used it; my own kids have used it in math to ask a question like, "What is a p-value? Can you give me an example? What's the difference between this and this?" And they use it as this concept-building machine.
(22:03):
And I think that's really powerful.
So I see a lot of potential in those areas. For nonfiction, I think it will become a curation tool.
I don't think people are treating it as a curation tool yet,
I think it will become a really great curation tool.
(22:24):
And I will say one thing that I think is really important in research for a second. I was talking to a librarian, and she said what she wanted from a works cited page. So at their school, the seniors do this big senior project;
(22:46):
they have a huge works cited page that is, you know, APA format and all that kind of stuff.
And she advocated at her school. She said, "I want two different forms of works cited." And this is basically what she said: look, when you engage in information literacy, you have a formal and an informal version,
(23:08):
right?
And you know, you have lateral reading — very formal, and it treats it as if it's all on browsers and stuff. And she'd read your book, right? So she described your work as informal information literacy.
And in other words, when we're on mobile devices, when we're mindlessly scrolling,
(23:30):
that's also research, that's also information literacy.
And when we text, we don't text the same way that we write an essay. When we read — there's a difference between close reading and pleasure reading.
That's right.
Why would we have one universal information literacy?
Right.
Right.
Which, again — preaching to the choir here.
(23:50):
So she was a fan of your work, talked about it; we nerded out on it for a second. But she said, for that reason: "I want a formal and an informal works cited page, and my informal works cited page — I want it to be chronological."
I want you to tell me: who helped you learn?
Was it people?
(24:11):
Was it relationships?
Was it someone you ran into?
Was it AI?
Mhm, I love that.
How did the AI shape your perspective?
So she launched this idea of the informal works cited page, and the teachers said it clicked. The alphabetized works cited page was just a hoop they jumped through,
(24:34):
but the informal works cited page became this journey. And I think the same thing will be true: it will help with a different version of a works cited page; it will help with a different version of curation. So I think it's gonna change a lot of those things in a really powerful way.
I mean,
(24:54):
I think the ability to generate new ideas and compare them with your own ideas... You know, again, going to the library — a lot of libraries have makerspaces and are engaged in creative work.
I just did a blog post all about how I would use AI with students for project management.
(25:16):
Sure,
sure.
Plans, adjustments, checklists,
things like that.
So I think there's tons of potential, in positive ways, and it's not always just gonna be cheating,
you know?
Right.
I mean, I think it's natural — especially for somebody in the field who's barely juggling all the plates that they've been told to spin,
(25:40):
you know, who, I think, in today's climate already feels like their job is probably pretty overwhelming — to suddenly feel undermined by a tool that makes maybe some of their assignments feel irrelevant.
And I think there's some learning to be done there,
you know — if your assignments can be completely undermined by a chatbot,
(26:00):
well, then, you know, maybe it's time to rethink those assignments.
But in the immediate term, in the short term, that feels very — you can feel very defensive, feel unsure about what to do next, overwhelmed, et cetera.
So I understand those reactions.
But we have to navigate those emotions and then think more logically about how these tools really can be a help to us, and support us in learning, and allow us to do things with kids that
(26:30):
maybe we've always wanted to do in the past but didn't have the capacity to.
So I get it.
I get it, but I'm ready to move this conversation towards some of those later discussions rather than focusing on the cheating.
So,
thank you for helping us do that today.
Yeah,
thanks.
I appreciate it.
It's been so much fun.
I want to end this with one last question.
And that is: are you reading anything right now that you would recommend to my students, or, even if you're not reading it now,
(26:56):
something you've read recently that you would recommend?
All right.
So two things: one I just finished, and one I'm rereading. So I am rereading right now the Fablehaven series.
Oh,
yes.
I'm just rereading it, having fun. My kids all read that when they were younger, and I was like,
(27:18):
you know what, I'm just in the mood for fantasy — and I love it.
I love that series. And then — I forgot the name of it, but I just finished a book by a linguist named John McWhorter. And he did this book on the history of cuss words, curse words. And he argues that there are three phases of what is considered a taboo word,
(27:44):
right?
And one is that it's profane — that it is denying the sacred. And then the second phase that happens after that was a phase where it's all about the body — it's anything relating to sex; like, you don't talk about that. And then,
(28:04):
as we become more secular, it's really normal for us to say "oh God" or "what the hell" when you drop something. Those aren't really bad words, but they offend a few people. They're a little — the F-word and the S-word, you still aren't supposed to say those.
(28:25):
But they've become — those words are common.
Yeah.
Much more common, and they're much more used in speech and in writing. And so they're more improper.
Sure.
But the taboo words are now words that marginalize people. You utter the word,
(28:47):
for example — even giving it as an example, you squirm a little bit saying it here, or you want to take it out of an old book, you know. It's "how do we read Mark Twain?" — that kind of thing.
And so it's a really great read.
So fascinating.
I'll put it in the show notes.
I'll put both those books in our show notes and yay.
(29:09):
Well,
thank you, my friend, again for this conversation and for your work. I'm super grateful. In the show notes, there'll be links to John's website and other ways that you can follow him and learn from him constantly.
We really appreciate you.
And until next time, thank you for tuning in.