
May 16, 2025 58 mins

Guest: Sarah Morris, Assistant Director of Academic Engagement at the University of Georgia Libraries

First broadcast May 16, 2025. Transcript at https://hdl.handle.net/1853/77588


"AI has become the elephant in the room."


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
[MUSIC PLAYING]


SARAH MORRIS (00:04):
Weirdly enough, I feel like I'm a little bit of an optimist with a glimmer of hope.
There is something that we do in libraries, which is just flat out critical thinking skills.
So even if you have something like a deepfake, a lot of times we talk with students about, don't try to get your magnifying glass out, metaphorically, and unpack this picture, but think about, what is the context in which I'm

(00:26):
seeing this.
Is this a topic that's a really hot-button issue?
Can I look this up and see what others are saying about this?
And almost going old school with it, and maybe not always relying on certain detection tools or whatever, but your own critical thought process, to at least be aware.
And you're like, what-- I'm not certain if this image is real or not,

(00:47):
but I know enough to be a little bit wary.
And I'm not going to share this or take this at face value until I learn a little bit more.
[MUSIC PLAYING]


CHARLIE BENNETT (01:16):
You are listening to WREK Atlanta, and this is Lost in the Stacks-- the Research Library Rock and Roll Radio Show.
I'm Charlie Bennett in the studio with everybody, and everything's been rearranged.
There's Fred.
There's Alex, there's Marlee.
Cody is on the board and he doesn't have a mic, so I can't ask him how he's doing.
Oh, he says he's doing great.

(01:37):
Each week on Lost in the Stacks, we pick a theme and then use it to create a mix of music and library talk.
Whichever you tune in for, we hope you dig it.

MARLEE GIVENS (01:45):
Our show today is called "Putting the A and the I
in Information Literacy."

CHARLIE BENNETT (01:50):
But there's three I's
in information-- oh, I get it.

MARLEE GIVENS (01:53):
Uh huh.

CHARLIE BENNETT (01:54):
AI.
You're trying to sneak artificial intelligence into another show.
This is what Fred would do, Marlee.

MARLEE GIVENS (02:02):
Well, hold on, I'm not being sneaky.
I'm telling you what the show is right now.

CHARLIE BENNETT (02:06):
Yeah, you're right.
I'm sorry.
I overreacted.

MARLEE GIVENS (02:09):
Yeah, well, I hear you, but AI has become the elephant in the room.
So I've decided it's time to stop ignoring it and face it.

CHARLIE BENNETT (02:17):
Oh, boy.

FRED RASCOE (02:18):
Because it's not just the elephant.
It's many elephants.

ALEX MCGEE (02:21):
In many rooms.
I don't know that I'm ready for this AI circus.

MARLEE GIVENS (02:25):
Well, while we can agree to disagree on addressing or ignoring the AI elephant-- or elephants, I should say-- the reality is that for the most part, our students are using AI more than we are, and libraries need to decide how we're going to deal with that.
One way to deal with it is the way we dealt with the wave of fake news a decade ago,

(02:46):
by applying our information literacy skills in a new context, which is why we brought today's guest back on the show after her episode last year to continue the conversation we started on media literacy.
And that's a conversation which, like so many things nowadays, inevitably leads to a discussion of AI.

FRED RASCOE (03:03):
And our songs today are about doing something new, uncertainty, and telling what's real from what's not.

CHARLIE BENNETT (03:10):
Is that official?

FRED RASCOE (03:11):
Let's start with a song about trying to find reality in an environment of false information and hallucinations.
And since our guest today works at UGA, let's make it a song from an Athens, Georgia, artist--

CHARLIE BENNETT (03:25):
Settle down, Alex.

FRED RASCOE (03:27):
Called, funnily enough, Of Montreal, but this is "Penelope" by Of Montreal right here on Lost in the Stacks.
[MUSIC PLAYING]


MARLEE GIVENS (03:38):
"Penelope" by Of Montreal.
This is Lost in the Stacks, and our show today is called "Putting the A and the I in Information Literacy."
And we are pleased to welcome Sarah Morris, assistant director of academic engagement at the University of Georgia Libraries.
And I should say welcome back because you were here

(03:58):
last summer to talk to us about media literacy.

SARAH MORRIS (04:00):
Yeah, thanks for having me back on, everybody.
It's good to be here.

CHARLIE BENNETT (04:03):
Whenever you change jobs, we have to have a guest back.

SARAH MORRIS (04:06):
That's true.
It makes sense.

MARLEE GIVENS (04:08):
Yeah, yeah, yeah.
So just keep that in mind.
But--

FRED RASCOE (04:12):
If you want to come back, get a different job.

MARLEE GIVENS (04:14):
Yeah, exactly.

SARAH MORRIS (04:14):
It's a lot of pressure actually.

MARLEE GIVENS (04:16):
Sorry.
Right.
So the last time you were here, we were talking about media literacy, and that was your focus at your previous job.

SARAH MORRIS (04:24):
Yes, absolutely.

MARLEE GIVENS (04:25):
Yeah, so how did this lead to your current interest in AI literacy?

SARAH MORRIS (04:29):
Yeah, I refer to media literacy as the on-ramp into this AI-- I think the circus theme is somewhat apt, as a bit of a herd-of-elephants moment.
But I've had a long-standing interest in media literacy, and I know we talked about fake news at the top of the show, and misinformation, equipping people

(04:51):
to think critically about these things and help them better navigate these increasingly complicated information environments.
And AI is now complicating things even further and having implications around biased results, misinformation, and things of that nature.
So I felt like it sort of was a natural entry point into this new world of AI literacy

(05:12):
and figuring out how this fits with both media and information literacy.

CHARLIE BENNETT (05:16):
Do you have any simple and succinct answers to how it fits?

SARAH MORRIS (05:20):
I--
I'm not sure of the right analogy sometimes, but I do think of information literacy as either a sort of umbrella that can house other things or a foundation that you can build upon.
So for me, AI literacy is now part and parcel of information literacy.

MARLEE GIVENS (05:38):
Do you have your own working definition
of what AI literacy means?

SARAH MORRIS (05:44):
I think I'm--
I feel like it's clunky right now.
I might need ChatGPT to clean it up for me, but-- oh no.
No, I think there's-- I guess there's different elements of it.
So for AI literacy, there's a component of being able to use AI tools effectively and have some understanding of how they work so that you can use them

(06:05):
to best effect.
But I think it's also really important to consider an ethical throughline here-- so a lot of issues around how can we use these tools ethically, how can we understand their implications and impact, how can we mitigate issues that could be coming up with them.
So those are, I guess, three strands that I would identify

(06:27):
and hopefully develop that into a nicer tagline or something in the future.

CHARLIE BENNETT (06:32):
That connects to what we've done at the Georgia Tech Library when it comes to podcasting and audio, because not only do we teach courses about composition, because podcasts are an assignment in English Composition courses, but also we're hoping that people understand how the effects are achieved-- like, here's how you cut, but then

(06:55):
also here's how other people can cut audio and change what someone said or change how fast they said it, et cetera.
But with AI, it seems much more daunting to teach how it works and what it can do.

SARAH MORRIS (07:08):
Absolutely.
I like the podcasting analogy you just drew, the idea that you're using these things, but you're thinking about it.
You're having maybe a deeper understanding of certain decisions you're making.
I think AI can feel incredibly daunting, and there's certainly, I think, a lot of feelings of being overwhelmed at the moment.
A lot of the issue can sometimes stem just

(07:30):
from the language surrounding artificial intelligence.
You start talking about things like neural networks and machine learning, and a lot of people just throw up their hands and say, I'm not a computer scientist.
I don't even know where to get started with this.
So I think a question for a lot of people going forward in education spaces and libraries is how can we clarify these things for users

(07:54):
or what's most important to know to equip people to think critically and use these things effectively, without perhaps wandering off and getting a master's degree in a computer science discipline.
Though if you want to do that, that's fantastic.

CHARLIE BENNETT (08:08):
Marlee, you were talking about composing with ChatGPT off air.

MARLEE GIVENS (08:13):
I was.
Yeah.
And it's interesting because I think one thing that feels different about AI literacy from media literacy is-- and I know that media literacy does encompass social media.
So there's some user-created content, and you're teaching people to be good consumers as well as good creators.
AI also feels like that, that there's

(08:36):
this element of, I'm creating some things with an AI tool.
It's not just being fed to me, which is maybe a change in the conversation.
I don't know.

SARAH MORRIS (08:49):
Yeah, I think that's a really good point, that media literacy can encompass both consuming information, but also being a producer of information.
And that can even be as much as liking a post on social media.
You're contributing that way to conversations online.
With these AI tools, I do think it's bringing up

(09:10):
a lot of new elements of what creation entails and what it means to be both a consumer and user and producer with these different tools.

MARLEE GIVENS (09:18):
But I think one thing that feels different is AI just feels sneaky sometimes.
It feels like I'm not really sure if it's there or not.

CHARLIE BENNETT (09:28):
A sneaky elephant.

MARLEE GIVENS (09:29):
There's this-- part of the literacy is really questioning, is this re-- and obviously it's real.
Even though we say artificial intelligence, the content itself is real.

CHARLIE BENNETT (09:42):
How my stomach just clenched when you said that.

MARLEE GIVENS (09:45):
I'm sorry, Charlie.

CHARLIE BENNETT (09:45):
And I'm not--
I'm not making a joke about how I don't like it, but "it is real" made me feel like, oh, gosh, that's where the real trouble comes from.
Here's an actual media object, and the creation of it we don't even quite understand.

SARAH MORRIS (10:00):
Absolutely. I think there's critiques around the black-box nature of a lot of these tools.
And we've heard of this in relation to other aspects of, say, social media and how search engines work, and it's something we've been grappling with.

CHARLIE BENNETT (10:15):
The algorithm.

SARAH MORRIS (10:17):
The algorithm, the dreaded algorithm.
But, yeah, I agree that AI can feel-- it's weirdly in your face a lot of times.
I've been joking about the omnipresence of it.
All these tools now are saying, look at this shiny new AI feature, but there's also something a little sneaky about it sometimes as well.
You're not sure what it's doing or how it's

(10:38):
produced something for you.

FRED RASCOE (10:39):
Well, this is Lost in the Stacks, and we're going to talk more about AI literacy after a music set.

MARLEE GIVENS (10:46):
And you can file this set under Z6930.3.E76B33.
[MUSIC PLAYING]

You just heard "The Electronic Insides and Metal Complexion That Make Up Herr Doctor Krieg" by Riders of the Mark.

(11:09):
I don't know if I want to say any more, but I will persevere.
Before that, "I Don't Dig Your Noise" by Barrence Whitfield and the Savages.
And we started the set with "Real or Not" by French Vanilla.

CHARLIE BENNETT (11:21):
There's a lot of stuff going on in that set.

MARLEE GIVENS (11:24):
Yeah.
Songs about discerning real from fake, the robotic, the hallucination.
[MUSIC PLAYING]


FRED RASCOE (11:35):
This is Lost in the Stacks, and today's show is called "Putting the A and the I in Information Literacy."

ALEX MCGEE (11:43):
Our guest is Sarah Morris, assistant director of academic engagement at the University of Georgia Libraries.
Go, Dawgs.

MARLEE GIVENS (11:51):
Yeah.
So what is happening with AI literacy at UGA?

SARAH MORRIS (11:56):
Oh my goodness, I think we're probably tracking where a lot of other libraries are, which is just trying to figure out what we're doing right now and figuring out what people need to know, how to best support both faculty and students.

CHARLIE BENNETT (12:09):
When you say what we're doing right now, do you mean the world-- what's going on in the world?

SARAH MORRIS (12:13):
Yeah.
I think the whole world.

CHARLIE BENNETT (12:14):
OK, yeah.

SARAH MORRIS (12:15):
Just the universe even.

CHARLIE BENNETT (12:17):
Because it also makes total sense that a library would be saying, what are we doing right now.
We don't know, but I just wanted to clarify that.

SARAH MORRIS (12:23):
For everyone.

CHARLIE BENNETT (12:24):
Do you feel completely overwhelmed by the speed at which AI is developing and inserting itself, or being inserted, into daily life?

SARAH MORRIS (12:33):
It is incredibly rapid.
It's something I hear from other colleagues and people I've spoken with in the profession, too, about how just overwhelming this moment really feels.
It can be really difficult to keep up with everything.

CHARLIE BENNETT (12:46):
It doesn't seem like a mnemonic is going to fix anything, like RADAR or the classic CRAAP test.
I just don't feel like there's a, oh, whenever you're confronted by a media object--
I don't even know where to start.
Distrust it.

SARAH MORRIS (13:01):
Exactly.

FRED RASCOE (13:02):
It's inserting itself so much, in fact, that just during the music break there, you were telling us there is actually, in your institution, a new position related to this.
Can you tell us about that?

SARAH MORRIS (13:15):
Absolutely.
Yeah.
We have an actual AI literacy librarian, which is fantastic.
And so certainly an interesting, I guess, trend to keep an eye on, if other libraries are going to start creating roles specific to this, or if it's going to become something that's, I guess, a shared responsibility area for every librarian

(13:36):
in a unit, for instance.

MARLEE GIVENS (13:37):
Yeah.
Go ahead, Fred.

FRED RASCOE (13:39):
Do you or the AI literacy librarian know, is this coming out of the fact that the library is recognizing the influence of AI, or have-- actually, have you had students come and say, we want some sort of guidance or workshop in AI?

SARAH MORRIS (13:58):
Yeah, I think a combination.
I definitely feel the library is recognizing the importance of this moment.
But our students-- and, again, I think this is trending in libraries in general-- people looking for guidance in this moment and not sure maybe how to proceed, if it's OK to use certain things for, say, a research project, for example.

(14:20):
So I think both faculty and students are asking a lot of questions right now.

MARLEE GIVENS (14:26):
Is it the library asserting, we have some knowledge and expertise in this area, or is it that folks on your campus are like, we should ask the library to help with this?

SARAH MORRIS (14:38):
I think that can get a little blurry sometimes.
I think it's probably a combination.
And I do feel libraries and librarians have things to contribute to the conversation, and the work we do around information literacy, and now increasingly AI literacy, is important.
But there's-- with the herd of elephants in the room, there's a lot of different considerations.

(15:00):
So for me, it's important to have a table where you have, say, a writing center present, faculty perspectives being represented, different disciplines, because everyone's going to have different concerns or takes or ideas that they can bring to this conversation.

CHARLIE BENNETT (15:13):
Do you all have an AI literacy workshop
or modules that you put out?
What are the thingsyou're creating--

SARAH MORRIS (15:20):
Yeah, we--
like many, we have areally great LibGuide
that has good resources somepeople have put together
and certainly workingon more workshops
around different topics suchas evaluating AI output,
prompting best practices aswell as some online resources
that people could takeat their own pace.

(15:42):
So I think trying avariety of approaches
right now just to see what mightappeal to different audiences.

CHARLIE BENNETT (15:48):
Have you seen any really successful or not so successful parts of that strategy?

SARAH MORRIS:
I think right now it's determining what's going to work and land well.
One concern I think I have going forward-- and this is probably something the whole profession is going to have to grapple with-- is librarians are always

(16:10):
decrying the one-shot instruction session, where you have that one time with your students to get stuff across.
Given how complex AI is-- and we were already having this issue-- if you're trying to address things like media literacy in 50 minutes, it can be really challenging.
So I think, for me at least, I'm really trying to think, is this a moment to lean into maybe more asynchronous

(16:32):
forms of instruction, other kinds of online things, workshops, other ways to connect with people aside from that very time-bound and limited one-shot session.

MARLEE GIVENS (16:44):
Yeah.

FRED RASCOE (16:46):
Is there a concern about other sorts
of impacts of AI.
I'm thinking about environmentallike students come in
and say like what about theenvironmental impacts of this.
Or maybe theinstructors say maybe
limit your use becauseof environmental impacts.

SARAH MORRIS (17:06):
Yeah, that's--
it's a conversation I haven't engaged in as much around that with, say, students and faculty.
But I've been in some professional development settings or done some classes with other librarians, and that's something that comes up pretty frequently-- that ethical consideration of, if you're encouraging people to use these things and play around with it, or somebody

(17:28):
said, I feel bad, if I had it do a goofy Shakespearean sonnet, and you're like, did I just kill a tree by doing that?
What have I done?
And so, yeah, I think that awareness is there, both for people teaching about this and, I imagine, certainly students.
At least anecdotally from other librarians, I've heard students have been bringing this up

(17:49):
in terms of environmental justice and concerns around these tools.

CHARLIE BENNETT (17:53):
Do you have a particular thing that you're really hoping people understand?
You personally, do you have-- what's stuck in your craw?
[LAUGHTER]

SARAH MORRIS (18:03):
So many things.
No, I know we-- one of the songs-- we talked about hallucinations earlier, and I feel there might be a better term for that, because it is such a human term to ascribe to artificial intelligence.
But the idea that these tools can get things really wrong sometimes but sound very confident while doing so--

CHARLIE BENNETT (18:23):
In the last 30 seconds of the segment, you want to workshop some other terms besides hallucinations?

SARAH MORRIS (18:28):
Yes, let's do it.

CHARLIE BENNETT (18:29):
What do you think-- what are you trying to get across, is that it's just a glitch or that it's a mistake or what?

SARAH MORRIS (18:34):
Glitch could be good.
Yeah, some kind of computer rev-up or--

CHARLIE BENNETT (18:37):
Glitch is kind of whimsical and fun.

SARAH MORRIS (18:39):
Exactly.
No, it's not whimsical.

CHARLIE BENNETT (18:40):
So I feel like maybe it needs to be a knife hit or something, something terrible.

SARAH MORRIS (18:45):
Something dramatic, yeah.

CHARLIE BENNETT (18:46):
A cut.
An AI cut.

SARAH MORRIS (18:49):
Yeah.

CHARLIE BENNETT (18:50):
OK.

SARAH MORRIS (18:50):
Something.
We'll get there.

CHARLIE BENNETT (18:52):
We will get there.
You are listening to Lost in the Stacks, and we'll talk more about AI literacy and what to call hallucinations on the left side of the hour.
[MUSIC PLAYING]

Would you be willing to do a show and station ID?

STEVE ALBINI (19:14):
Sure.
I don't know what that means, but sure.

Hi, this is Steve Albini.
I'm a recording engineer, and I'm in the band Shellac of North America.
And you are listening to Lost in the Stacks on WREK.


MARLEE GIVENS (19:37):
Our show today is called "Putting the A and the I in Information Literacy."
We opened the show with a clip from our guest Sarah Morris's last visit to Lost in the Stacks, and one of the things I love about what she said is that it helps me as a librarian understand that AI literacy has a lot in common with other kinds of information literacy.

(19:58):
Now I admit to feeling overwhelmed by AI's pervasive presence in the academy and in my daily life.
And one way that I deal with big feelings is to define them.
So I went looking for a definition of AI literacy.
There's more than one.
So here's a sample.

CHARLIE BENNETT (20:14):
McGill University librarians Amanda Wheatley and Sandy Hervieux created an AI application evaluation tool called the ROBOT test.
They say being AI literate does not mean you need to understand the advanced mechanics of AI.
It means that you are actively learning

(20:35):
about the technologies involved and that you critically approach any texts you read that concern AI, especially news articles.

FRED RASCOE (20:42):
The authors of the paper "AI Literacy-- Definition, Teaching, Evaluation, and Ethical Issues" in the 2021 proceedings of the Association for Information Science and Technology reviewed several articles about AI literacy and concluded that, quote, the most common approach to define AI literacy is to base it on different types of literacies,

(21:03):
which have recently been applied to define skill sets in varied disciplines.
In this review, most researchers advocated that instead of merely knowing how to use AI applications, learners should be inculcated with the underlying AI concepts for their future career as well as the ethical concerns of AI applications to become a responsible citizen.

ALEX MCGEE (21:25):
Georgia Tech professors Duri Long and Brian Magerko, in their 2020 paper "What Is AI Literacy-- Competencies and Design Considerations," define AI literacy as a set of competencies that enables individuals to critically evaluate AI technologies, communicate and collaborate effectively with AI, and use AI as a tool online, at home, and in the workplace.

(21:49):
More recently, University of New Mexico librarian Leo Lo defined AI literacy as, quote, the ability to understand, use, and think critically about AI technologies and their impact on society, ethics, and everyday life.
Its components include technical knowledge, ethical awareness, critical thinking, and practical skills.

(22:10):
He called on librarians, quote, to serve as educators, guiding patrons to critically evaluate and responsibly interact with AI-driven systems, fostering informed and ethical engagement.
That sounds a lot like information literacy to me.
File this set under BF463.U5J37.

(22:31):
[MUSIC PLAYING]

That was "Only Tongue Can Tell" by the Trashcan Sinatras, and before that, "So Much Strange to Give" by Free Cake for Every Creature.

CHARLIE BENNETT (22:46):
Did an AI create that set?

ALEX MCGEE (22:48):
There's a lot of cake in this document is what I'm seeing.
Songs about human uncertainty and handling information.
[MUSIC PLAYING]

CHARLIE BENNETT (22:59):
This is Lost in the Stacks, and our show today is called "Putting the A and the I in Information Literacy."
Our guest is the assistant director of academic engagement at the UGA Libraries, Sarah Morris.
So we've been talking about what you've been doing and what UGA's been doing.
Now we're going to need you to project into the future.

(23:19):
What role will libraries play in promoting AI literacy among patrons and everybody?

SARAH MORRIS (23:26):
Big question.
I'm not sure about my fortune-telling skills, but I'll give it a go.

CHARLIE BENNETT (23:30):
Just go for it.

SARAH MORRIS (23:33):
I think libraries are
going to have a major role toplay in that promotion of AI
literacy skills.
I think it ties in in alot of ways to the work
we're already doingaround information
literacy and theresearch process,
and so the topicswe're already set up
to talk about and workwith students on I

(23:55):
think connect reallystrongly to AI.
So I don't think we'llbe the only ones,
but we'll play apretty big role.

CHARLIE BENNETT (24:02):
The thing that I worry about with libraries and AI literacy is that-- going back to the talking about podcasting-- AI is so much more productive than-- and that's a value-neutral "productive" right there-- it produces so much more than, say, just searching databases.

(24:25):
So we can teach, here's how you can manage or navigate information.
But then when we grip on to AI, we have to say, now here's how to manage not just a fire hose coming at you, but now the fire hose is loose and you're holding about six feet back from the water.
So now deal with it, right, and don't get hit in the head.
I'll work on that analogy.

SARAH MORRIS (24:47):
The visual is really going there.

CHARLIE BENNETT (24:51):
So, yeah, I guess I'm afraid that this is where my essential pessimism about libraries and AI is bringing me, to a spot where I can't ask any good questions.
All I can do is complain.
So I'm looking at the rest of the show.

SARAH MORRIS (25:04):
I think he raised a good point, though, about the idea of, I guess, how "productive" it can seem-- I'm using air quotes here-- because I think the way it can package things-- we were talking during the break about cognitive offloading issues, where it's doing these things for you

(25:25):
and presenting this seemingly neat array of sentences or sources or whatnot.
But some of these things might not even be real, so the fact that you need to do that due diligence to check on everything there-- it's reminding me of some of the challenges with teaching fact-checking skills to people, because part of it is, it's not fun.

(25:45):
Unfortunately, you have to slow down and think about this, which doesn't sound great when you have these AI tools that magically seem able to do everything really fast.
But the reality might be you actually need to slow down and take a beat and double-check on these things and think critically about them.

CHARLIE BENNETT (26:01):
And the fact-checking stuff, it gets into almost a philosophical place, because when you're doing fact-checking, you have to talk about what's true and how things can be true.
Just like with AI, you have to talk about, well, what is this creation?
How did it come to be?
What have we done with it?

FRED RASCOE (26:18):
I think with AI literacy and the current capabilities of tools, the hallucination problem, I think, eventually will diminish.
It's a serious thing right now, but I think it's not built in that AI has to exist with hallucinations

(26:40):
and we have to deal with it.
There is progress-- so many people are putting so much money into this.
The hallucination problem might even go down to zero, but that's not to say that there's no problem with using AI, because sometimes that cognitive offloading is

(27:00):
a literacy problem in itself.

SARAH MORRIS (27:02):
Absolutely, yeah.
And to your point on the hallucinations going down, it's an interesting conundrum, because I think you could see them getting better and better over time, a lot of these different generative AI tools, but there was a New York Times article that came out earlier this month about how hallucinations were going up with certain new versions of tools like ChatGPT,

(27:24):
I think partly due to how they're being trained on themselves now.

MARLEE GIVENS (27:28):
Being trained on themselves, exactly.
Yeah, because I was going to say, that's-- if we're going to talk about pessimism, that's where my pessimism is.
It's in two areas.
One is that, yeah, they're just going to keep-- because we're going to start restricting the content that they can be used to train on.
So they're just going to train on themselves, and they'll start to accept their own hallucinations as fact.

CHARLIE BENNETT (27:45):
Oh my gosh, they're going to train on stuff that people don't care about, that people aren't protecting.

MARLEE GIVENS (27:49):
Right.

CHARLIE BENNETT (27:50):
Oh, no.

MARLEE GIVENS (27:50):
Yeah.
But my other--

CHARLIE BENNETT (27:51):
I have to leave.

MARLEE GIVENS (27:52):
My other pessimism is just, are we going to be able to keep up?
I just-- I think we've gotten-- the ship has sailed past the library in so many instances, with open access and scholarly communication and with data management and just all these things that--

FRED RASCOE (28:15):
Search engines.

MARLEE GIVENS (28:15):
Search engines, yeah.
And so I don't want it to be another running away from Wikipedia or running toward Second Life.
Yeah.
How are you feeling?

SARAH MORRIS (28:29):
Oh my gosh.
Yeah, I'm definitely-- I skew a bit.
I'm definitely skeptical around these tools, and I skew-- I think I skew a little pessimistic at times about it.
I guess my one kernel of optimism I can share is that-- I think you mentioned Second Life and Wikipedia-- that we as librarians, the ship might have sailed past us,

(28:49):
but we've still managed to weather a lot of these challenging technological moments, and that the work we do is really vital in this space.

MARLEE GIVENS (28:58):
I love ending on a warm, fuzzy note.

SARAH MORRIS (29:01):
We try.

ALEX MCGEE (29:02):
This is Lost in the Stacks, and you've been listening to our interview with Sarah Morris, who is assistant director of academic engagement at the University of Georgia Libraries.
Sarah, thank you so much for joining us.

SARAH MORRIS (29:12):
Thanks for having me here.

CHARLIE BENNETT (29:14):
File this set under P96.M4M63.
[MUSIC PLAYING]


(29:35):
That was "Easier Said Than Done" by the Essex and before that, "Before We Go Under" by the Magick Heads-- songs about going headfirst into something new while trying not to be overwhelmed by it.
[MUSIC PLAYING]


MARLEE GIVENS (29:56):
Today's show is called "Putting the A and the I in Information Literacy."
And before we roll the credits, I was just wondering, can you describe in one word or one sentence how you're feeling about AI on May 16, 2025, Fred?

FRED RASCOE (30:14):
I think maybe a reluctant resignation.
I feel like I'm talking about AI the way that I talked about Google back in 2002.
That's what I'll say.

SARAH MORRIS (30:28):
I-- wary would probably be the word.
I'm also hearing circus music play in my head when I think about AI.

FRED RASCOE (30:35):
I hear the elephants.

SARAH MORRIS (30:36):
I hear the stampeding elephants.

CHARLIE BENNETT: How about you, Alex? (30:39):
undefined

ALEX MCGEE (30:41):
We'll go with 'woof' and 'woooof' and majorly sus.
I have a lot of concerns about bias.

CHARLIE BENNETT (30:50):
Did you just say majorly sus?

ALEX MCGEE (30:52):
I did, for the young people listening.
Yeah.

MARLEE GIVENS (30:55):
All right.
I actually-- I feel the same as Fred.
I feel reluctantly resigned.
I feel like I need to at least stay close behind the train.
I'm never going to get out in front of it.
Charlie.

CHARLIE BENNETT (31:09):
I'm going to go with cuts like a knife.
That's my phrase.
How about you, Cody?

CODY TURNER (31:15):
The exact same way I feel about pickleball.
How long can I ignore it before my friends make me engage with it?

CHARLIE BENNETT (31:23):
I think that's the next Lost in the Stacks shirt: AI is like pickleball.
Ask me to learn more.
Roll those credits, Cody.


ALEX MCGEE (31:36):
Lost in the Stacks is a collaboration between WREK Atlanta and the Georgia Tech Library, written and produced by Alex McGee, Charlie Bennett, Fred Rascoe, and Marlee Givens.

CHARLIE BENNETT (31:45):
Legal counsel and a large bag of peanuts for the elephants were provided by the Burrus Intellectual Property Law Group in Atlanta, Georgia.

MARLEE GIVENS (31:53):
Special thanks to Sarah for being on the show, and thanks to librarians everywhere trying to address multiple AI elephants in multiple rooms.
And thanks as always to each and every one of you for listening.

CHARLIE BENNETT (32:04):
Our web page is library.gatech.edu/LostInTheStacks, where you'll find our most recent episode, a link to our podcast feed, and a web form if you want to get in touch with us.
And please don't ask ChatGPT to write that thing.

ALEX MCGEE (32:18):
Well, no.
On next week's show, we're talking about cultural competencies and job qualifications in the archives and the problems that can arise when those two things meet.

FRED RASCOE (32:28):
It is time for our last song today.
AI is a tool for new methods of work, but if we use it, we'll still need to incorporate fundamentals of information literacy.
So let's close with a song about sound methods by one of the early bands out of the Athens, Georgia, scene-- another nod to UGA there, Sarah.

(32:49):
This is the Method Actors from 1980 with "Do the Method" right here on Lost in the Stacks.
Have a great weekend, everyone.

[MUSIC PLAYING]