All Episodes

January 7, 2025 72 mins

Adam Sparks joins the show to convince every educator to ditch AI-detection tools and focus on what really matters: the writing process.  We discuss alternatives to AI-detection like Brisk, Draftback, revision history tracking, and Short Answer and share actionable strategies to teach writing effectively in the age of AI. From fostering transparency with students to leveraging powerful tools for formative assessment, this conversation is packed with insights to elevate your writing instruction. Plus: Instant Pear Decks, Canva Dream Lab, and new Google Forms Settings.

#EduDuctTape Episode 114

Mark as Played
Transcript

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
In today's episode of theeducational duct tape podcast.
Adam Sparks joins me to convince everyeducator to ditch AI detection tools
and focus on what really matters.
The writing process.
Now, if you're not a writing or Englishlanguage arts teacher, you may be tempted
to skip this episode, but don't, you'llhear things that apply to all classrooms.
Plus we discuss alternatives to AIdetection like brisk draft back revision,

(00:24):
history tracking, and short answer.
And we share actionable strategies toteach writing effectively in the age
of AI, from fostering transparency,with students to leveraging powerful
tools for formative assessment.
This conversation is packed with insightsto elevate your reading instruction.

(00:44):
If there's one sound that echoesthrough my house more than
anything else, it's the word, mom.
Our kids yell from across the house withthe most random questions and requests.
Where did you put my flannel PJ pants?
Or have you seen my tooth brash?
It's like the thing, mywife's a human help desk.
I keep telling her she needs adigital system where the kids

(01:07):
could submit requests and she couldfollow up when she's got time.
Well, my wife's still waiting forthat personal kid request system.
Today's sponsor visor.
Has you covered when it comes tomanaging it issues in schools?
Their help desk software, let students andstaff submit issues through a self service
portal emails, turn into tickets and youget all the details like serial numbers

(01:30):
and warranties, right inside the ticket.
It even automates tasks likemanaging loaner devices.
For my educational duct tape listeners.
There's a special deal at visor.cloud/jakethat's V I Z O r.cloud/jake.
Along with that special pricing, youcan get some awesome swag and a copy

(01:53):
of my book, educational duct tape.
With visor, managing it issues is easierthan finding a missing toothbrush.
Hey there duct tapers, whetheryou're a longtime listener or
tuning in for the very first time.

(02:14):
Welcome to the show.
I'm Jake, a personalized learningand ed tech specialist and former
middle school teacher from Ohio.
And by joining me here today,you are officially a duct taper.
You didn't have to pay to get thatlicensed or turn into any professional
development credits or hours, or go onlineand create an account with two factor

(02:36):
authentication or anything like that.
You just tune in and bam.
Doc TAVR.
That means you are on board withthe educational duct tape metaphor.
And that means that you see educationaltechnology, not as the end goal,
but as a powerful tool, like ducttape use to solve problems, achieve
goals and meet learning standards.

(02:58):
Okay.
Now that that's out of the way for.
Three, two.
One ha.
Happy new year.
It is 2025 and I cannot believe it.
I hope you had a happy new yearcelebration and a joyful holiday season.
Filled with laughter relaxation andmaybe even a little bit of time to

(03:18):
recharge, you sure do deserve it.
Speaking of recharging today's episode ispacked with so much amazingness that it'll
be like recharging your teaching battery.
I think you're goingto hear me geeking out.
During this episode, I have so much fun.
In this interview because itwas just fascinating and it was
just good education, nerd talk.
And my opinion, I really enjoyed it.

(03:39):
Um, in fact, Adam, Adam Sparks, today'sguest shared so many valuable insights.
Like it was so good, so much.
And he responded so thoughtfullyto so many questions that I asked.
I mean, so many, like I was sofascinated and interested, the
questions just kept popping up.
So I kept asking him so manyquestions over the conversation and
he responded so thoughtfully to them.

(04:00):
That I decided to split thisinterview into two episodes
because there's just so much.
Uh, goodness in it.
Uh, it's amazing.
I think trying to take it all inand process all of the educational
implications and goodness,and it would be like eating
back-to-back Thanksgiving dinners.
Not ideal.
Uh, no matter how much you lovestuffing and I do love stuffing,

(04:22):
so make sure you're subscribed.
So you don't miss out on the secondserving of stuffing coming next week.
I mean, The second half of today'sinterview with Adam Sparks.
So that is coming up next week.
Be subscribed so that you don't miss it.
Also, I'm excited to share that injust a few weeks, actually about one
week from the time you're hearingthis and right around the time

(04:42):
that episode with Adam comes out.
I will be in Orlando, Florida,sunny, warm, not snowing.
Like here in Ohio, Orlando, Florida.
Uh, as a featured speakerat F E T C, the future of
educational technology conference.
If you are attending, I'dlove for you to come say hi.

(05:02):
I would be thrilled to meet somefellow duct tapers in person.
And I'll probably have some stickers andbuttons and other swag available too.
So please do stop and say hi for there.
And today's guest Adam, asyou're gonna hear shortly.
Uh, is also going to be there.
So I hope you'll say hi to both of us.
Um, but if you're not making thetrip, but still want to connect, you
can always say hello on blue sky.

(05:24):
It has a blue sky in the name.
It's not quite the blue skies ofOrlando in January, but blue sky
is my new favorite social media.
And so I hope whetheryou're going to FETC or not.
You will say hi there, becauseI'd love to hear from you.

(05:53):
So a few nights ago, wedecided to order takeout.
This is almost always.
Uh, headache, inducing activity.
So with the wildly unpredictabletastes and befuddling stubbornness
of our three kids and two will lesserextent the preferences of my wife
and I finding one meal that works foreveryone is like finishing the, New

(06:15):
York times crossword on a Saturday.
That's the hardest day.
But this time it was like solving theSaturday, New York times crossword.
In pen.
Yeah, right.
In pen, not pencil, no racing.
That's because we had added a new wrinkle.
My dad.
His doctor recently told them to cut backon cholesterol and amongst other things.

(06:38):
So now I'm not just scrollingthrough options for my picky kids.
I'm filtering out anything thatmight spike his lipid levels.
I gotta be honest.
I didn't even know whatlipids were last week, but now
they're a part of my takeout.
Ordering considerations.
I should point out.
They're not really a part of my dad'stakeout, ordering considerations though.

(07:00):
He's like Uber.
As many of, you know, at somepoint we all end up doing a little
parenting of our own parents.
Anyhow, those cholesterolconsiderations got me thinking about
one of my very favorite episodes ofone of my very favorite podcasts.
And that podcast is revisionist history.
By Malcolm Gladwell.

(07:22):
And the episode that I'm referringto is called the basement tapes.
It ironically is also a story abouta man, his father and cholesterol.
So the father in the story, Dr.
Ivan France was thinkingabout lipid levels, but not to
inform his takeout, ordering.
You see, he was a cardiologist whospent his life doing research to

(07:46):
promote what he believed was a healthieralternative to butter and animal fats.
Poly unsaturated oils.
His goal.
Prevent heart disease by loweringcholesterol, ultimately saving lives.
At the time, the thinkingwas that replacing saturated
fats, like delicious butter.

(08:08):
With poly unsaturated oils like margarine.
Would lower cholesterol and inturn, prevent heart disease.
So France set out tostudy this hypothesis.
But here's the heartbreaking twist.
Decades later, his own researchrevealed that all these healthier
oils did reduce cholesterol.

(08:30):
They also came with unintended harms.
You see the poly unsaturated oils that Dr.
France integrated into his test subjects,diets were lower in cholesterol, but were
also higher in something called linoleic.
acid.
I hope I'm pronouncing that, right.
I'm really not sure it was in the podcast.
I guess I could go listen and see.

(08:50):
Anyhow, at first, this whole replacementthat lowered cholesterol seemed okay.
After all lowering cholesterolwas essentially their goal.
But it turned out that consuming too much.
Linoleic acid turned outto have its own risks.
You see, elevated levels havebeen linked to inflammation and
other health issues, including.

(09:13):
An increased risk of heart disease.
The very thing that Dr.
France was trying to prevent.
In other words, his well-intentionedstrategy for heart health.
Inadvertently created new problems.
Undermining has original goal.
And that kind of unintended harm oftenhappens when we focus too narrowly on

(09:34):
solving one part of a complex problem,and we lose sight of the bigger picture.
At this point, you're probably wonderingJake, why are you talking about takeout
and cholesterol and linoleic acid?
Well, this story of unintended harmsfrom a well-intentioned person.
Got me thinking about asimilarly well-intentioned fix.

(09:58):
In education.
AI detection tools.
You see much like poly unsaturated oils.
Those tools initially seemed like asimple solution to a complex problem.
But as with linoleic acid,the surface solution might
hide deeper unintended harms.

(10:20):
At first glance, AI detection toolsseem like a simple, straightforward
solution to the growing issues ofstudents using AI to generate their work.
This honest use of AI is a real concern.
And AI detection seems like an easy fix.
Plug in the assignment, runthe software and wallah.

(10:42):
You've got the cheaters.
Problem solved.
Right.
But just like Dr.
Francis polyunsaturated oils,these tools come with hidden costs.
And the more we rely on them, the morethese unintended harms reveal themselves.
Molly clutter.
And I discussed those unintended harms inan earlier educational duct tape episode.

(11:05):
And I also referencedsome powerful points.
The today's guest Adam Sparks madein a series of tweets that later
became a blog post, but I referencedthose in that conversation.
You see, first Adam pointed out thatthese tools are notoriously unreliable.
Studies show they have a falsepositive rate of over 20%.

(11:27):
That means for every five studentsflagged for using AI, one is
likely completely innocent.
Imagine being a student whospent hours writing an essay
only to be accused of cheating.
And while an algorithm may identifythe essay as AI generated, it's not the
algorithm delivering the accusation.

(11:47):
It's your teacher.
One of the most importantadults in your life.
So why are these tools so unreliable?
Well it's because they rely onpatterns and probabilities and
algorithms to detect AI generated text.
And that's not a foolproof process.
There is no like receipt in the writingthat says this was made with chat GPT.

(12:09):
They're just looking at the text.
The tool is and saying,what would AI right next?
And if that's what's written there,then they're like, I think this was AI.
When a human could havewritten those things.
Actually they often misinterpretdifferences in how students write.
Which means that non-native English,speakers and neurodivergent learners.
End up being at a higher riskof being falsely flagged.

(12:33):
This all means that instead offostering a safe, supportive learning
environment for our students,especially the ones who need it most.
These tools can createa culture of suspicion.
And distrust.
So that's our first two unintendedharms false accusations that undermine
fairness and damaged relationships thaterode trust, you know, trust is the

(12:55):
foundation of any effective classroom.
When a student feels like theirteacher doesn't trust them.
Or worse.
Actively assumes they're cheating.
That trust is broken and onceit's broken, it's hard to rebuild.
Finally an Adams post, hepointed out a third issue.
The opportunity cost schoolbudgets are tight and some of the

(13:15):
AI detection tools aren't cheap.
By investing in AI detection tools.
You're taking funds that could havebeen spent on something else, an
initiative or a tool that couldactually improve student learning.
And you're redirecting those fundstowards something that might harm it.
These AI detection tools.
So what are those unintended harms again?

(13:36):
Number one false accusationsand undermine fairness.
Number two, damaged relationshipsthat erode trust and number three
wasted resources that could bebetter spent on real learning.
In the end, AI detection toolsmight seem like a quick fix.
But there are a lot, like thosepolyunsaturated oils, they address

(13:57):
one issue on the surface whilepotentially causing harm beneath it.
And here's the kicker.
The goal isn't even to catch students.
Your goal is not to catch kids cheating.
Your goal is to guide them towardmeaningful learning and real growth.
And that is never assimple as pushing a button.

(14:18):
So as Carol Commodore wiselysaid what we know today.
Doesn't make yesterday wrong.
It makes tomorrow better.
So if you used an AI detectiontool yesterday, Uh, or last
week or last month or last year?
Let's not focus on the wrongness of it.
Let's learn to make tomorrow better.

(14:40):
You see, when you used those AI detectiontools, you were no more wrong than Dr.
France was for believing ina solution to heart disease.
He was doing the best he could withthe knowledge he had at the time.
Just like you were.
As you'll find if you listen to thatrevisionist history episode, which I
hope you do, because it's really good.
And I've linked it inthe show notes for you.

(15:01):
But you'll find that the keypart came later after Dr.
Francis passing.
His son, also a doctor.
Help the researchers use his father'swork to make tomorrow better.
He knew that upholdinghis father's legacy.
Wasn't about defending a flawedhypothesis about polyunsaturated oils
or outdated beliefs about margarine.

(15:25):
It was about being humble.
Humble enough to change.
And that's what science is andhis father was a scientist.
So of course that's what he believed.
So now it's our turn toembrace what we've learned.
Not cling to what we used to believe.
It is clear.
The AI detection tools.
Are not the answer.

(15:47):
So, how do we do better?
Well, I think to prevent cheating,we need to address its root causes.
First, we must shift our learnersaway from performance oriented.
Extrinsically motivated goals.
When students are focusedon completing a task.
Or we're getting a grade.

(16:08):
Or avoiding punishment.
Or pleasing an adult.
These are all performance orientedor extrinsically motivated things.
When they're focused on that.
The temptation of an easybutton, like cheating.
Becomes really appealing.
Because all they're worriedabout it's the end result and
cheating gets you there faster.
And more efficiently.
Second, we must help them work withintheir zone of proximal development.

(16:33):
Meaning if the work is too hard.
And they don't see a path to success.
Or if it's too easy andfeels meaningless, cheating.
Once again, becomes tempting.
Third, we must strive tohighlight the relevance and
the learning for our students.
If it feels like irrelevant,busy work that has no value

(16:53):
beyond grades or compliance, whywouldn't they look for shortcuts?
When students see how theirwork connects to their goals or
interests or the real world, thetemptation to cheat often fades.
Again, to prevent cheating.
We need to address its root causes.
When we focus on meaningful goals.

(17:13):
Appropriate challenges and relevance.
We create a learning environment wherestudents are far less tempted to cheat.
And today's episode Adam Sparks is goingto share some thoughtful relationship
centered strategies that I think willhelp create the learning environments.
We all strive for.
And closing a simple statement.

(17:35):
Our goal isn't to catch students.
It's to guide them towardbecoming self-directed learners.
Prepared for the future.
And that is never as simpleas pushing an easy button.

(17:56):
Well, Today's guest is Adam Sparks.
Adam taught for seven yearsbefore recently finishing his
master's in learning designand technology at Stanford.
While at Stanford, Adam designeda writing tool called ShortAnswer
that he now builds full time.
As a part of his work with ShortAnswer,Adam leads PD with schools across
the country on adjusting writinginstruction in the wake of AI.

(18:20):
You could reach Adam on emailat Adam at my Short Answer.
com on their website at my Short Answer.
com or on Twitter slash X at Mr.
Sparks tweets.
Or if you're like me and you'vetransitioned over to blue sky,
you can find them at Adam sparks.
Dot B S K Y dot social.

(18:40):
Those things are all in theshow notes, but not in the show.
And it's actually here inthe podcast is Adam himself.
What's up, Adam?
How's it going, Jake?
I'm excited to be here.
Thank you for having me.
Of
Yeah, I'm glad to have you.
So Adam and I have been talking aboutdoing this interview behind the scenes
for a little while, because I became awareof Adam's work a couple of months ago
and was really, really excited about it.

(19:01):
And I was like, Adam, I needto have you on the podcast.
And I am really excitedthat you said yes, Adam.
So thank you for that.
course.
I'm honored to be here.
Yeah.
So some of you might recognize the namebecause I've referenced him before, and
maybe you were part of, Adam becomingkind of a viral sensation on Twitter.
I mean, you were, you were pretty,you were pretty hip, right?

(19:21):
You were definitely trending.
I was the, I was the anti AI detection guyfor like a good two to three months there,
which was an interesting experience.
I'm glad you didn't changeyour Twitter handle over to
at the anti AI detection guy.
I guess no, not yet.
But you could have like a theme song.
That's kind of like the BillNye, the science guy song, but

(19:42):
it's the anti AI detection guy.
I think I'm more in the billthan, than whatever that would
be, but yeah, you know, maybe
So I guess some of you are like,wait, what Adam is, is famous on
Twitter and is the anti detection guy.
So Adam wrote, and we're going totalk about this in a little bit.
a series of tweets.
It was a thread of tweets.

(20:03):
That I, when I became aware of itwas when, Matt Miller from ditch,
that textbook shared it out.
I think that probably is whenit really spread and reached
kind of its critical mass.
But then it got to the point whereyou were talking to news outlets and
things like that and, edutopia, right.
About, about the, about the work that youshared in that, thread of tweets about
the problem with AI detection tools.

(20:25):
Yeah.
What was, what was the craziestpart of that experience, that
whole experience of that, of
just referenced it.
It was like having journalists reach outand be like, we want to talk to you about,
you know, the research on AI detection.
I'm just like, I'm just a guy onTwitter who, who read the paper.
You know what I mean?
So the, the, the genesis of itwas like, TurnItIn which is the
most popular AI detection tool.
put out a white paper of like, here's theresearch supporting our, our approach.

(20:49):
And my wife currently studieseducation data science at Stanford.
And so I hang around with people thatknow a lot about data science who
openly laugh at those types of tools.
So I'm like, Hmm, very interestedto read this paper, and dug into it
and just immediately found a lot of.
Not, not direct lying,but, but lying by omission,

(21:10):
which to me is lying, and sort of likeobscuring the research, to, to benefit,
you know, the tool that they're selling toschools and that rubbed me the wrong way.
So I
just put out this.
This Twitter thread of like, you know,here's the, here's what it actually says.
Here's the research they're citingthat actually doesn't support
the claims that they're making.
And, and no one read it.
Like I put it out.
I don't know.

(21:31):
I don't, I don't even rememberthe date, but put it out.
And then, yeah, like a month later,something, a couple of people that
were very influential, I think Matt
Miller then Holly Clark retweeted it.
And all of a sudden it exploded.
And yeah, edutopia is reaching outfor interviews and, TurnItIn, took
a keen interest as you might expect.
And I ended up talking to the head ofAI at TurnItIn and, and it turned into
this formalized blog post that I've

(21:52):
shared out with the world.
And it was a guest blog post withMatt Miller's, ditch that blog.
so yeah, it was a weird, surrealexperience to be caught up
in a viral Twitter thread.
Well, I was, it was so good.
And that what was fascinatingabout the Twitter thread.
And for those of you that are like,Tell me about, tell me about why, And
like, you're saying that like they'redoing the lying by omission kind of

(22:13):
thing in there, which I agree with you.
then they want to know more.
I'll put a link to the blog post, inthe show notes and, and also we'll,
we'll talk about it a little bitmore too in a, in a couple minutes.
But, what I think was fascinating aboutthe whole thing was exactly what you
said, where they were citing a study.
that wasn't super favorable to themand no, none of us read it, right?

(22:39):
Every, we all just trusted it.
And like, you're like, you'renot a data science science guy.
You, you were just a guy who read it.
Like you were a guy who thoughtdeeply about it, but just read it.
And like, like I, what I love about boththe blog posts and the Twitter thread.
Is you were just like, I felt like I wasalong for the ride with you as you were
like reading it and going like, wait,what didn't know it's like, like you're

(23:03):
like, you were like live streaming yourreaction, like, you know, those YouTube
videos where somebody watches somethingand they're reacting and on the spot.
Right.
Then that was your, that was your thread
of you reacting to this article.
Yeah, I mean, literally it was that itliterally was just like a real time sort
of reaction to like, my goodness, thisis, you know, this is just overt lying.

(23:23):
so, yeah, it was a surreal experience for
sure.
I'm happy to dive into any of itspecifically if you would like to.
But,
Yeah.
Yeah.
We'll get to it.
We'll get to them.
And we had, we had somefun stuff first, but we'll,
we'll definitely get to that in a minute.
Yeah.
Like I said, it'll be in the shownotes for everybody to check out.
It really fascinating stuffthat you, you, dug into there.
And I think you don't give yourselfenough credit in saying you're not a
data scientist because maybe that's good.

(23:45):
You were, you were reading it as aneducator, and a person who cares about.
education and about writing and aboutteachers and about schools and about
school funding and about a lot of things.
and so I think you were the perfectperson to be reading it because
of the way that you read it.
You know what I mean?
Like you took the time to read it,but also you had, a perspective that
brought about some really, valuabletakes on what you were reading there.

(24:08):
Well, I hope so.
And I, I hope it moves the needle in,in, in discouraging folks from using
those types of tools and instead sortof orienting themselves towards more
holistic approaches to adjusting writinginstruction, that need to be made, which
I'm excited to talk with you about today.
Yeah, for sure.
For sure.
So, before we get into that, Adam, I'mexcited that a few weeks after this
episode drops, you and I are going toget to meet in person at FETC, right?

(24:33):
Yes.
And a much warmer locationthan rural Nebraska, which is
where I'm currently located.
So
I'm excited for that.
and I'm in Ohio So yeah, we are bothexcited to be in Orlando for sure.
That, that is the one thing I'mmost excited to be at FETC for.
Um, yeah.
For the listeners who have neverheard of FETC, it's the future of
education technology conference.
I'd say it's my favoriteeducation conference.

(24:55):
Have you been to it before, Adam?
I praise.
No, I haven't.
I've heard really great things.
So this will be our first time going.
I'm excited.
I'm
Nice.
So they are, it's, it's likemini ISTE and ISTE is good.
but ISTE is, is ginormous and you feellike an ant in a big city or something.
I don't even know.
whereas FETC is just big enoughto have You could see anything and

(25:16):
hear about anything you want tohear about, but yet feel a little
bit more like a, like a community.
So I, I like SD too, butFETC is really great.
and, very carefully curated, by formereducators and led by Jen Womble,
who loves education and is veryinvolved in educational technology
and just has a great mind for this.
I was actually on, a webinarfor all the presenters, and,

(25:37):
and just like listening to her.
I was like, this is, this is a personwho gets teachers, gets the audience and
gets what good presenting is and she,Talked about pedagogy and andragogy,
you know, thinking about the adultlearners and like this person gets it.
So FETC is a really good conference.
I'll put some links for anybodyinterested in joining us in Orlando.
If you can convince your school to sendyou to Orlando, I've got a discount code.

(26:00):
I'll put in the show notes for everybody.
but are you guys, so will you guyshave a booth there and presentations,
what are you doing there?
Yeah, we'll have a booth.
And then we were encouraged by somefriends to do the, the pitch contest.
So we're going to be up on the stage.
just talking about Short Answerand trying to convince people to
join us and, and making some of theadjustments to writing instruction
that we're going to talk about today.
so I'm getting, I'm rapidlypreparing my, my little three

(26:21):
minutes feel, for, for Orlando.
So I'm excited for it, butI've heard so many people say
a lot of what you just said.
it's like, yeah.
Yes, Steve, but smaller, and a lotmore personalized and, and curated.
So I'm, I'm super excited for it.
Yeah, it's, it'll be a very good time.
I'm excited to meet you in person there.
I hope anybody that, is there willcome up and find me and say hi and come

(26:42):
find Adam and say hi and stop by ShortAnswers booth and go to Adam's, pitch.
I can't believe it'sonly three minutes, Adam.
We've already been talkingfor three pitches of time.
It's like a shark tank thing, I think.
So, we'll see, but, I'm excited for it.
Yeah, great.
Well, I hope I can make it to it.
I hope my schedule allows that.
so yeah, so, FETC, are you doing ISTE?

(27:02):
Are you doing any other big conferences in
You are, you know, this is the first time,you know, last year we're, we're basically
a bootstrapped organization, which isvery different from a lot of the ed techs.
And so I have to be a little bit moretargeted in our outreach, but, this year
we've kind of expanded the footprint.
So, so we're at GATC in Atlanta,a couple of weeks ago, which was
fantastic, really great conference.
we're going to be down at FETCafter that it's TCEA, which is

(27:24):
another great conference in Texas.
It's in Austin this year.
we're English specific, so we'realso going to go down to TCT ELA,
which is a much smaller conference.
But as you like English specific, andthen we'll be at ISTE for the first time.
And for those that don't know, ISTE iscombining with ASCD, which is a major.
Curriculum and instruction organization.
So very curious to see thishere, like how they handle that.

(27:46):
Cause that's going to be a bajillionpeople in downtown San Antonio.
So, but I've never been to ISTEand have always wanted to both as a
former educator, but now, especiallyas an ed tech creator, like.
Kind of a bucket list thing.
So I'm excited for it.
Yeah.
Good time.
So anybody, if you're at anyof those things, reach out to
Adam, make sure you say hi.
And if you're organizing a differentconference that he didn't just

(28:08):
name, maybe you should reachout to him about that as well.
so Adam, you mentioned as a former teacherbefore you were in the ed tech space,
what did you teach for those years?
So I taught English and social studies,but mostly social studies, for seven years
in the most varied context imaginable.
It's almost like hilariouslydifferent places.
So rural Nebraska, which iswhere I'm from, urban China for
a year, in the Southern part ofChina and then on the West side

(28:31):
of Chicago for a couple of years.
So
very different contexts.
Wow.
Which,
which probably really enablesyou to have a really cool like
background in the way you think aboutclassrooms and about teaching and
about learning and stuff like that.
Right?
definitely, um, definitely,
drove home the idea that like ineducation, everything works somewhere
and nothing works everywhere.

(28:51):
So you gotta cater it.
I'm stealling...
I'm stealing that quotefrom Dr Dylan William.
By the way, I want to give
credit where it's due.
Um, but it's very true education and whatworks in education is very contextual.
So now, as someone who'sbuilding a tool at scale,
it's helpful to have thatperspective, I think.
Wow.
Nothing works everywhere andeverything works somewhere.

(29:12):
Is that what, did I say that right?
That's
education research in a nutshell.
So
I'm stealing that too.
And if it came from Dylan William, youknow, it's, you know, it's good stuff.
yeah, exactly.
Yes, that's for sure.
All right.
So I'm already having fun chatting withyou, Adam, but we're going to play a
game cause we have to, it's, it's part ofthe educational duct tape rule book that
a game be played during the interview.
Like it's, it's right there.
It's clause five.

(29:33):
and so we're going to play a game.
Of two truths and one lie.
So you're going to readme three statements.
I'm going to do a horrible jobof guessing which one's the lie.
I'll get confused abouthow the game works.
Cause that's what normally happens.
I may or may not realize what the lie is.

(29:55):
And you'll have to tell me like, wait,Jake, you never figured out the lie.
The people who listen, no, this game goes.
but first what are yourthree statements, Adam?
Yeah, I've got them right here.
I'm very excited.
So
the first one, I'm ahuge college sports fan.
So graduated from Creighton university,small school in Omaha, but we just
upset the number one team in thecountry, by the way, in basketball.
So just a heads up there, go Jays.

(30:17):
but so diehard Creighton basketball fan,but controversially being from Nebraska,
also a diehard Iowa Hawkeyes football fan.
So that's, that's, uh,you know, Item number one.
Item number two.
Um, I once slow danced withChicago Cubs legend Ernie Banks

(30:37):
Okay.
in bullet point three.
I road tripped across all thelower 48 states in one summer.
Those are my three.
Okay.
think of the three mostabsurd ones I could think
of.
Those were, those were pretty absurd.
Well, the first one wasn't thatabsurd, but the other ones were absurd.
Um, the funny thing is I, when I thoughtabout Creighton, I was like, the only

(30:59):
thing I know about Creighton is theyare regularly in the NCAA tournament.
Um, as a, as a relativelylow seed, but, uh, but
whoa, whoa, whoa.
whoa.
We almost made the final four, like a
weren't they a low seed?
No, no, no.
We were like, Ooh, I don't even remember.
I think we were fouror five seed that year.
Okay.
All right.
Yeah.
But I was going to say the next part Iwas going to say was they always are,
they always play impressively in it.

(31:21):
I'm not a big college hoops fan, butI always do kind of, kind of follow
along with like the, once they get tothe final four and stuff like that.
Um, and Iowa Hawkeyes football,I did not see that coming.
Um, and then the ErnieBanks thing that got me what
position that Ernie Banks play.
He's an, Ooh,
he was
Is he a middle
I'm actually not a diehardCubs fan at all, but, uh, he,

(31:43):
I think he was an outfielder,
Okay.
I believe you.
statues outside the stadium.
He's a big deal in Chicago.
Yeah, I can, and you slow dancedwith, let me say, I'm, that's the lie.
No, that's real.
That's a real one.
I know.
That's like the most absurdthing I could think of.
That's very absurd.
So just, you know, for the listeners,the lie was the Iowa Hawkeyes thing.

(32:04):
I'm literally hiding myNebraska court Huskers
I was getting, I was wondering
about that one too, becauseI was like, wait a minute.
Like I can understand you'renot a Creighton football fan
or like big Creighton footballfan, but Iowa, not Nebraska.
never, never a Hawks fan.
No, I'm, I'm, I'm a Cornhuskers fan.
Like die hard when it
comes to football.
Um, but no, so Omaha, um, is homeof, uh, baseball legend, Bob Gibson.

(32:28):
And so every year he would host thisthing called the Bob Gibson all star
classic, and my mom would take me upand it was at somebody golf tournament.
But afterwards they put on like a banquetwhere they have like a dance and you
could pay for tables and sit next to like,you know, legendary baseball players.
And
so.
Um, my mom did it cause she, and she tookmy grandpa who's a huge baseball fan.
And, um, and yeah, we were at a tablewith Ernie Banks and, and, and I was

(32:51):
like, and I was in like kindergartenat this point, I was very young, but,
uh, went out and danced on his shoes.
It was like a surreal experience, um,and a really cool random claim to fame.
So yeah,
that one's real.
You have pictures of youdancing with Ernie Banks.
We have framed pictures in this
house.
I'm in my parents house right now.
So yeah, yeah, it was a big deal.

(33:11):
Wow, Ernie Banks and Danza shoes
Pretty cool
experience.
Well,
fan thing because I am aBuckeyes fan here in Ohio.
So we'll
no comment on that.
Yeah, right.
we're not feeling super happyabout our fandom right now.
after
a couple of weeks ago, there, thereis a certain university that's

(33:34):
roughly halfway between Ohio stateand where you're at in Nebraska
that we're not happy with right now.
So we won't discuss that
one at
I won't.
I won't touch on that.
I know it's a sore subject.
Thank you for that.
what was the other, whatwas the other thing?
What was the third thing?
I did a road trip acrossall the lower 48 states.
Actually, when I was still teachingand I video blogged it with my kids

(33:54):
and we did this like, like triviathing where I made videos and
they had to guess where I was at.
And
that was after my first yearof teaching at Louisville.
And I, you know, Basically, Idesperately wanted to be Anthony
Bourdain at that point in my life.
So this
is my attempt to like create alittle travel show with my kids.
It was awesome.
so basically just road tripped all summer
and visited all 48 States,
all the lower floors, you know?

(34:15):
And then were you like blocking iton the, like YouTube or something?
Is that where you put it?
And then, And
then your students were commentingand watching and stuff like that.
Yeah.
It was awesome.
Like did all sorts of cool stuff.
Like stopped in Montgomery, Alabamaand interviewed Martin Luther King's
barber and talked about the civilrights movement and then like went
What?
I know, I know it's crazy.
That's actually a pretty goodidea for a YouTube channel.

(34:36):
Like I think maybe like if Short Answerdoesn't work out, which I think it's
going to work out, but if Short Answerdoesn't work out, uh, I think maybe, I
think maybe the show should come back.
Well, that was part of theoriginal, I think, conceit.
I was a young person and, youknow, kind of secretly wanted
to be maybe YouTube famous too
Right.
and, and, and do some educational stuffwith my kids and have an excuse to travel.
So, no, that was a really cool experience.

(34:57):
That is really cool.
Yeah.
And so the other, the part of thereason your students were watching is
you were 23 or whatever and you werethe hip guy teacher that they had and
that's
at least I thought I was, I don't know if
I
actually was, but
Well, to them, you were at thattime, like, but like, if I 44 year
old Jake Miller did this, likenone of my former students are

(35:18):
tuning in to that optional content,
but my first year, they'd be like, Oh, Mr.
Miller, you're visitingsomewhere random in Ohio.
Like, yes, I will watch that eventhough I live in that state, but
that's because you're the firstyear teacher and you're cool then.
Yeah, we'll take it.
Okay.
Adam.
Let's talk about some educationaltechnology now, instead of

(35:39):
goofing around about this stuff, soin the educational duct tape podcast,
Adam, and for people who may be tuningin for the first time, we think about
educational technology as a toolto solve teacher problems or meet
teacher goals, sometimes educationaltechnology is not the only route.
Sometimes it's, just a strategy we use.
There's something we do.
It doesn't have to be something digital.
It could be something analog, butin general, we're thinking of these

(36:01):
practices and these tools as that.
Tools that help us do things.
so I always start with a teacher question.
And now I've got a big one thatis on a lot of people's minds.
And I know you've done alot of work in this space.
And so you're very well qualifiedto answer this question for us.
And that question is how shouldteachers teach and assess writing?

(36:23):
In the age of AI.
So now that our landscape has totallyshifted because of AI, I think
the classroom that's most impactedby it is the writing classroom.
They're all impacted by it.
We can't stick our heads in the sand.
We have to, we have to embrace it ina way, because it's part of our world.
So we have to admit it's there.
So what do we do whenwe're teaching writing?

(36:45):
How do we change how we teach?
How do we change how we assessand I guess my first question I'm
gonna lead with this one is can wejust keep doing it the same way?
And tell kids not to use AIand use AI detection tools

(37:06):
to see if they're using AI.
So that's my first questionis, can we, I'm really setting
this up as an easy one for you.
Can we just act like nothing's changedand just tell them not to use it?
No,
thoughts on that?
Short Answer.
No pun intended.
No.
Right.
Please elaborate.
yeah, no, we cannot.
And I mean, there's, there's,there's a larger reason for this

(37:27):
that has nothing to do with AI.
Like, if you want to stand back for asecond, even larger and just think, well,
why should we assess writing in a worldwhere machines can now write for kids?
yeah.
Our kids can't write already.
So if you look at NAEP scores, NAEPis like the national, you know, the
nation's score, what do they call it?
The nation's report card, so to speak.
75 percent of 8th And, Iwant to get this right.

(37:50):
I think 10th grade,don't quote me on that.
But at the high school level, 75 percentof kids are not proficient in writing.
That's based on 2022 NAEP scores.
So our kids already can't write.
So what we're doingalready is not working.
And so regardless of AI or not, weneed to make changes to how we approach
writing instruction and assessment.
But especially because ofAI, we need to rethink.
writing instruction and assessment.

(38:11):
And it's because we just mentionedlike, machines can now write for humans
for the first time in human history.
it's, it's a really important,practical challenge just as much as
in philosophical challenge for schools.
so so no, we can't just keep doingwhat we always have done and, and
sort of like turn to AI detectionbuttons as like the easy button of
like, well, I'll just keep doing whatI'm doing, tell kids not to use AI.

(38:34):
And then if I suspect them of usingit, I'll just run it through, you know,
pick your, pick your AI detection tool.
and then that'll be myway of dealing with it.
That's my policy.
Here we go.
that.
Which I've seen, unfortunately, alot of schools and teachers doing
like that's not going to cut it.
Unfortunately, it's going to takelarger, more fundamental changes.
So, I have a 5 point plan for you todayon on on, you know, sort of 1st steps.

(38:56):
We can take it and changesthat that need to happen.
I think to K 12 writinginstruction and assessment
Yeah, I'm eager to eager to hear that
I apologize for interrupting the episode,but I've got a quick confession to make.
I'm a bit of an out loudthinker, you know, the type.
I need to talk things throughto make sense of my own ideas.

(39:16):
My wife, well, let's just say she's avery patient listener, but I'm 99% sure.
She doesn't always enjoybeing my sounding board.
That's why I'm so excited aboutthis segment's sponsor swivel and
their new tool mirror talk.ai.
I'm thinking that mirror talk mightjust free my wife up from listening to

(39:37):
all of my processing and reflecting.
You see mirror talk, lets you oryour students reflect out loud.
Literally just talk it out.
And gives you instant AI powered insights.
On how you're thinkingand where you can improve.
It provides honest objectivefeedback that helps develop
reflective and metacognitive skills.

(39:59):
Mirror talk is perfect forstudents, teachers, or anyone
looking to think smarter and grow.
So if you're ready to stop burdeningyour loved ones or help your students
take their thinking to the nextlevel, head over to mirror talk.ai.
That's M I R R O R.
T a L k.ai and check it out.

(40:21):
It's like having a thought partnerwho never gets tired of listening.
Uh, speaking of nevergetting tired of listening.
I never get tired of listeningto Adam Sparks his wisdom.
So let's get back to that interview.
Yeah, I'm eager to eager to hearthat and eager to share that.
I think one thing I want to pointout, and I think you, you will, I
don't want to speak for you, butI'm confident, very confident that

(40:42):
you're gonna agree with me here.
I love teachers.
my, I, I don't know any profession thatworks harder, cares more and Deals with
issues of distrust and doubt and lack ofrespect as consistently as teachers do
So I understand it's a very hard job.

(41:03):
There's not a lot of time to do it.
A lot of educators putunpaid time into their work.
So when we see a new challenge.
It's really hard for themto say like, well, I'll just
change the way I do everything.
Um, because we understand like, likeyou're already working really hard
and you're already putting a lot oftime and making a change is hard.
It's hard to embrace it.
It changes the fabric ofwhat you know to be teaching.

(41:29):
And teaching writing and writing,especially and assessing, it
changes the fabric of all that.
And that's hard.
That's a tough pill to swallow.
It requires a lot of work.
And so when a teacher wants to usesomething that is an easy button, so to
speak, I understand where that comes fromand I understand why they want to do that.
I understand why it's so enticing.

(41:50):
The problem is, as you alludedto earlier, that Some ed tech
companies are very willing tosell their things as those tools.
because that's how I, society thatis a capitalist society works.
Like that's what we do.
Right.
and so I understand it all right.
And I agree with youthat we can't do that.

(42:12):
Can you.
Before we chart the path forward, I'mgoing to, I'm going to put the, I'm
going to put the link to your, your blogpost about AI detection tools in the, in
the show notes so everybody can see it.
Um, but will you give us like theone minute overview of, of why not?
Why not Adam?
but simply because these tools don't work.
So, if you talk to any data scienceperson, I talked to a lot of them

(42:34):
right now, because my wife's agrad student studying data science.
they laugh at these toolsthat they don't work.
So researchers have done independentthird party studies on these things.
The one that I cite inthe, blog post is from Dr.
Laura Weber-Wulff was at the universityof applied sciences in Berlin.
They looked at all the major.
popular AI detection tools.
And this is, this is the studythat was actually cited by

(42:55):
TurnItIn and their white paper.
and yes, TurnItIn as the best inthat study, but then you look at
it and you find out that it's wrongover 20 percent of the time still,
uh, in the text that it's labeling.
And a direct quote from thatstudy is, This is in the findings.
These tools should not beused in academic settings.
That's coming directly from thirdparty researchers who I trust over
a for profit company who's tryingto make money off of schools.

(43:18):
so that, you know, that, that's,that's the Weber-Wulff study.
I, I cite another study a lot, which isthe university of Maryland's computer
science department did a study wherebasically if you just change the words
around of AI outputs, it changes thestatistical Probability like it changes
the, uh, well, we don't need to getinto the nitty gritty of it, it's fine,
but it changes the efficacy of thesethings where it drops it to like by 90%.

(43:42):
Um, so, so if, if you don't trustthird party researchers, which I just
cited open AI wrote a blog post a whileback where they basically gave up on,
on developing AI detection software.
And they directly address educators in it.
And they say, we know this createschallenges where you're not going to know
what was AI generated and what's not.
But.
Just as much of a challenge isputting a tool into your hands that

(44:03):
doesn't work and saying that it does.
So
we're not going to develop this tool.
and interestingly, sincethen, open AI has come out.
Well, they haven't released it,but they have supposedly they
have a watermarking system.
That's like 99 percent effective,but they haven't released.
So we'll see.
Maybe, maybe down the roadthere will be AI detection
stuff.
But, so yeah,
Cause that's the thing I tell a lotof teachers is the AI detection tool

(44:24):
can't find some kind of, I've neversaid the word watermark, but there's
not like a piece of code in thetext that an AI detector goes like,
ah, here it is, it's a, it's AI.
There's, there's nothing inherentlyabout the text that tells somebody it
came from AI, unless something likethat watermarking feature becomes
available in the future.
But they're looking for patterns.

(44:44):
So go ahead.
yeah, no, it's looking for patterns.
You just, you just got what I wasgoing to say, which is large language
model powered AI tools like chat,GPT or clod or pick your poison.
They're just stringing together wordsbased on statistical probabilities.
So what these AI detection tools dois look for what you would expect.
Those statistical patterns that you wouldsee in text that's been strung together.
based on probabilities.

(45:05):
But again, the immediateproblem with that is change.
A few words of brown, you break upthose probabilities and all of a
sudden the efficacy drops by 90%.
So, and even when youdon't, you know, even when
you copy paste directly fromChatGPT 20 percent chance,
it's still going to be wrong.
So
the efficacy is just not high enoughto be used in an academic setting.
And it's certainly not high enoughto be giving a kid a zero or you

(45:26):
know, making it your policy foryour classroom, you know, at all.
So I would strongly encourage yourlisteners to not use these tools and
instead take more holistic approachesto changing how they approach, things
like, you know, academic integrity and,
and, and that sort of thing.
Yeah, I think the main the mainpart there is don't use them

(45:47):
because they don't actually work.
And that, that, that's,that should be enough.
But if it's, if it's not enough, you'vegot to think about what, what happens
if you still choose to use them.
I hear a lot of teachers say like, well,it's just be one tool in my toolbox, or
it could be the first thing that I useto look at, it could be a conversation
starter, and you mentioned some ofthese things in your article, but.

(46:11):
You've got to think about, okay, well,then, well, then what happens, right?
Then what happens when you talk to akid and make it either clear directly or
maybe kind of the kid can tell in yourtone that you're suggesting that maybe,
possibly they might have, you suspectthat they used AI and what does that do

(46:31):
with your relationship with that kid?
Whether or not they used it.
so there, there are someinterpersonal things that are,
Immensely probable, not probable,problematic and probable, I guess,
if we choose, if we chooseto use these things.
so not only are they not effective, butthey, they can cause a lot of problems.
And so, yeah, not a good

(46:51):
I just don't think therisk is worth the reward.
That's the
biggest explanation I hearis like, well, this is just a
conversation starter for me.
My response to that is you'restarting a conversation with,
with a baseless accusation, which
is a really troublesome way to, like,build a relationship with a kid, which is
arguably the most important factor inwhat you're going to be able to get
out of a student in your classroom.
So anything that gets in the way orpotentially gets in the way of building

(47:14):
a meaningful relationship with the kid.
Right.
I think we need to be skeptical of,
so I don't, yeah, I would
discourage against these tools
Yeah, I don't have a study to cite.
I don't know where it falls on onHattie's meta analysis, but but the
the trust between a teacher and a kid,especially between the kid and the
teacher is has got to be one of thebiggest influencers of student success

(47:37):
and doing these kinds of things showsa lack of trust from the teacher of
the student and then creates a lack oftrust from the student with the teacher.
And.
So you're, you're trying to use thesethings for good cause, which is to
improve student learning becauseyou don't want cheating to happen.
But instead what you're doing isa detriment to student learning.

(48:00):
So, yeah,
there are free alternativesthat work better.
So, I mean, I'm not saying don't tryto, you know, if you set the expectation
with your kids and in a writingassignment that they're not supposed
to use AI and you suspect that theydid, there are free tools that include
like a DraftBack Chrome extension or a.
revision history or a tool thatI've become, I really like is
BriskAI, which I know you've talkeda lot about on this podcast and

(48:21):
it's, it's an awesome tool, butthey have a revision history tool
that I think is super powerful.
If you suspect a student of usingAI, you can go back and and see
how many edits the kids made andover how long of a course of time.
And if there's any big copy pastingmoments that happened in the, in the,
in the version history of that document.
So there are tools that can do this.
I do think brisk has AI detection too.

(48:43):
That'll kick out like a percentage score.
I wouldn't recommend using that, but Ireally like their, their revision history
tooling.
So there are tools to helpyou monitor this stuff.
It's just, I wouldn't use, you know, theTurnItins in the, GPT zeros of the world.
I think there's better alternatives.
Yeah.
Oh, and I think what's goodabout those alternatives is
there, what they're looking atis the process, the learner went

(49:03):
through while they wrote, not theproduct that came out of the writing.
And that's really what we need to do.
I think that's going to lead intothe other stuff that I'm going to
let you share here in a second.
You're like, I know you're itchingto share that stuff, but, um,
that's, what we need to work onis the process, not the product.
Uh, these AI detectors arelooking at just the product.
Things like revision history, uh, whichwill show when the kid worked on the

(49:25):
document, how long, how many edits, whatthey typed, what day, and draft back.
That'll show it as a process.
and brisk that'll also do that.
Those tools let you see.
And then if you, like, as you said,if you see a big chunk of text come
in all of a sudden, then maybe theydid copy it over from chat GPT.
That's a more effective way to do that.
Then just throwing intothose different tools.

(49:45):
Um, and I, and you allude to this inthe article, such great metacognitive
things for writers to be doing.
Anyhow, looking back at what their writingprocess and writing journey was like, to
see ways they can be better or what kindsof writing they're good at, where their
skills are, what their strengths are.
So it's not just a tool that willhelp us notice when AI is there.

(50:06):
It's a tool that willhelp us grow good writers.
I think.
100%.
I was just going to say that,like, we don't need to just view
these tools as like punitive and
like to enforce our anti AI policies.
It's like there's tremendousformative assessment value here
in getting kids to be owners of their ownlearning and reflect on their process.
and yeah, like, like I could literallyimagine assignment where it's like, make

(50:27):
me a screencast where you walk me through,
um, you know, your process editingthis document explaining step
by step the changes that youmade to this document over time.
Like,
there's tremendous formativeassessment value on that.
So, so, yeah, there's, there'sa lot of learning potential that
can come out of this as well.
Yeah, I worked, back inthe day with my friend, Dr.

(50:47):
William Kist, who, at the time was withKent State University here in Ohio.
now he's, he's no longer withthe university, but he still does
work in the, English and languagearts space and writing and we did
work with a group of teachers.
It was called writing ourselves was theproject and we were, we were focused on
high school writing classes, primarilya little bit middle school too.
And the kids made digital portfoliosof their writing, not just of they're

(51:10):
finished products, but of the process.
So in the portfolio was, A brainstormingdocument was the first draft was
the feedback on the first draftwas the second draft because that's
what we, they really cared about.
They were saying what's importantin writing is the process,
not always just the product.
and in it we use draft back.
To record a video.
So we would take a screencast ofdraft back, feeding back the writing

(51:31):
process, I think is what we did.
it's been a long time ago and put thatin their digital portfolio, which I think
was a really, really cool way to do it.
And, and then it has this addedbenefit of helping us see this.
So I think that, I think that's,that's kind of a first step of, of
how we could teach and assess writingin the age of AI, which is to not
use AI detection tools instead to usesome of these alternatives that let
us look at what the process was like.

(51:52):
Not what the product is like.
what, what's your other, what'syour five point plan, Adam?
Yeah, I mean, that leads right intothe first point of the plan here,
which is like you need to haveclear classroom policies and clear
school policies around and reallyconversations up front with kids about
what are your expectations with AIuse, what does it mean to act with
academic integrity on a specific writingassignment that you're assigning?

(52:14):
and then being really intentionalto make sure that your kids
know those expectations andcan follow those expectations.
and so the best system thatI've seen for this ... there
are now various varieties of it.
I really like, and a lot of people don't.
So I'd encourage youto challenge me on it.
But like a simple red, yellow, greenlabeling system of if I label this
writing assignment red, you can't use AI.

(52:34):
If you do, you're violating myacademic integrity expectations
and you can be held accountable.
If I label it yellow, you can usesome AI with specific constraints.
And if you go beyond thoseconstraints again, you're
going to be held accountable.
And a green labeled assignment is notonly can use AI, The learning construct
of this assignment is I want you to learnhow to use AI effectively in your writing.
So that's what I'm measuring.
And so a part of the goal ofthis assignment is to teach

(52:56):
you how to use AI effectively.
So I want you to use it.
I like the simplicity of that system, but I think the core insight is really just thinking about writing assessment through this lens of tiered levels of AI influence. And so we, we actually did a webinar with Short Answer with Dr. Mike Perkins and Leon Furze, who are doing some research around this. They're calling it the AI Assessment Scale. And maybe this is something we can link in the show notes.

(53:17):
For sure, yeah.
Um, and so instead of a red, yellow, green, three-level system, they have a five-point system. I've seen other people that are doing five- to seven-point systems around, like, specific uses. It's almost like a rubric, and within each one we've got a description of what AI use looks like. I think this is another example of one of those moments where it's like, everything works somewhere, nothing works

(53:37):
everywhere. Teachers need to find what's going to work best for their context.
But, um, yeah. Just sort of practically accepting that if and when writing leaves the classroom, there's a pretty good chance, especially at the middle and high school level, that AI is probably going to have an influence on that writing once it leaves the classroom. And, and we just need to adjust accordingly and take a practical lens on this. And I think the assessment scale that Leon Furze and Dr. Perkins have

(54:00):
promoted, and that has now been interpreted in many different ways, I think, is maybe an effective first step. And then pairing that with the tooling that we just talked about, which is, like, revision history tracking, and being up front with kids about, hey, I am going to track your revision history here.
'Cause there's been some really meaningful conversations on social media I've seen recently on, like, you know, is it a little bit surveillance-y to not

(54:22):
let a kid have their, like, you know, private process of writing? Because I can think of a million different things that I might put into a blog post at first, you know, when I'm just kind of like word-vomiting words onto a page, that I wouldn't necessarily want seen by other people.
That's a good point. Yeah.
And, you know, so I want to plug real quick. Her name's Anna Mills. I follow her and she's a great follow on Bluesky and on Twitter, just

(54:43):
around this dialogue, this dialogue on writing instruction and assessment. I looked at her work a lot.
So, so, yeah, I think so. So point one of the plan is just: we got to have clear classroom policies, and oftentimes that means having conversations up front with kids about what effective use looks like. And there's a whole larger conversation there around academic integrity and, I

(55:04):
don't know if we want to go down that rabbit hole or not, but really, I think we need to also reframe in schools how we approach academic integrity. It can't just be a policy. It needs to be an expressed learning outcome that we're explicitly teaching to kids. Like, this is what it means to act with academic integrity in my ELA class or my social studies or math or whatever your, your content area

(55:24):
is, and explicitly teach it, rather than what we currently do, which is basically define it by what it's not and assume kids know what that means. Where it's like, academic integrity is not copy-pasting off of ChatGPT, and it's not, you know, stealing from your classmate whatever they wrote. It's like we tell kids what it's not, and we often don't actually teach them what it is. And I think that's a transition that needs to happen too.

(55:46):
But that's a whole larger conversation.
Yeah, you got me on that one. I never, never really thought about academic integrity that way. Um, I have a couple of follow-up points I want to make on that. I wholeheartedly agree with this plan of having clear classroom policies in place. I think something like that AI assessment scale, whether it's the red, yellow, green one or the one you shared about, I think,

(56:07):
number one, from a classroom culture point of view, you know, making things clear to kids is really effective, especially if we can involve them in those discussions and help them craft what that policy is. I think that might not be something that all teachers are comfortable with, and I'm okay if you're not. I think it'll increase buy-in if you talk to the kids about what things can we do, what things can we

(56:28):
not do, what should yellow look like? What should green look like? What should red look like? Even when it's green, what are the limitations we should have? Not only talking to kids about those things, which I think is, is almost a necessity. You have to talk to them about your policies, but I think I would also encourage you to have an open discussion, where it's co-created, right? It's both of you. It's you and the learners, not just you telling them what the policy is,

(56:51):
which is the bare minimum we need to do.
We also need to involve them in that.
I think you're making a really important point, which is, like, kids need to understand the why behind it: you know, why are you doing what you're doing? Because I can understand where a kid, especially a high school kid who may not want to do your writing assignment and who may question the value of it, asks, why should I have to do this if the machine can do it for me? You need to be explicit with that kid about

(57:12):
here's why, and be intentional about that. So, so, yeah, that dialogue is super important.
Yeah, yeah, for sure. And I think also it's that trust issue that we were alluding to earlier, right? The kid trusts you because you're being upfront and transparent with them. And it also creates that situation where the kid goes, like, oh, right now we're on red.

(57:32):
If that's the scale you ended up using, that traffic light scale, that's a bummer, I really wanted to use my AI tools. But the kid is hopefully like, yeah, but he lets me be on green when he thinks it's appropriate. Right. And so there's, there's an understanding there that goes with any rules, policies, expectations in a classroom where, if you're flexible, like if you're saying,

(57:52):
like, like, like, talking about group work, right: right now you get to choose who your partner is, but this other time I'm choosing for you, right? The kids are more likely in that kind of classroom, when you choose for them, to go like, yeah, but he let me choose myself yesterday, and today there's a reason and he was transparent about what the reason is. And so kids can then understand

(58:12):
and respect those decisions you're imposing on them, because they know that you're flexible and have their best interests in mind.
I think that's the other important point I want to make before we move on to part two: that we need to be clear that the goal of school is the learning, right? And so all of these things that we're doing are in the interest of them

(58:34):
learning to be really good writers. So every decision we make, when we're, when we put it on red, it's not because that will create a flaw in the assessment and then I won't have an accurate grade for you, because the grade is not the priority. The priority is the learning, right? And so we have the expectations and the policies and the procedures and things like that in the interest of learning, because that's what our goal is, is learning.

(58:58):
Exactly.
And it's funny you say that, because we did a webinar with Dr. Dylan Wiliam, who's an advisor to us and, for my money, like a godfather of modern K-12 assessment. And he talked about that, because I brought up this red, yellow, green system. And he was like, red, yellow, green, fine, use it. But what's really interesting about this is, like, just to what extent does AI serve the learning construct of whatever you're trying to teach?

(59:20):
And if it doesn't, why are you using it? Why are we even worrying about AI? Like, um, so, what I like about the system is it allows you to tailor your policy, kind of to your point. It's breathable. You can have conversations with kids. You can sort of pick and choose how you want to use AI specifically. It's not just, like, a blanket, like, absolutely no AI ever, because in some cases it might be useful and in some cases it may not;

(59:41):
it just depends on what the assessment construct of your assignment is. So I like it as a system for that reason.
Yeah, I agree wholeheartedly.
Okay.
So I, I'm guessing, I may or may not have had a peek at what you want to talk about today, and that the topic of Mr. Dylan Wiliam, who is one of my education heroes, may connect to your next topic here, because it may be about something that a lot of his writing is related to.

(01:00:03):
So go ahead.
Mm.
Yeah, it kind of is. It's, so, so, point one is, like, have clear classroom policies and conversations with kids. But then point two is, okay, if you are nervous about AI and you are still working on, like, the core skill of just kids putting words into sentences into paragraphs, and you don't want them to use AI because you're still developing those foundational skills,

(01:00:24):
then I think you're going to need to bring more writing back into the classroom. So, so point two is just more short, formative, targeted, in-class writing instruction that is, that is based around Dylan Wiliam's, you know, sort of, he has a five-point plan, actually, around formative assessment, that's actualizing those best practices in formative assessment and really getting kids to think critically

(01:00:46):
about, about writing instruction, about writing skills. And that's, I mean, this is not a vendor session here, so I'm not going to sit here and plug Short Answer too much, but that's specifically why we're building Short Answer. It is for that in-class, short-form, meaningful conversation and social interaction around writing instruction. So, you know, that's the tool we're building. I'm happy to talk more about it, but you don't have to use Short Answer to do this.

(01:01:09):
You can do really cool paper-and-pencil stuff too, with in-class, you know, short-form writing experiences as well, that I'm happy to talk about.
Yeah.
I want to jump in on that one.
And, and point out, like, there are certain times when... ed tech companies, which Short Answer is, I don't know if that feels crazy for you, to have an ed tech company,
like, right.
Um, there's, there's ones that I feel differently about, and there are certain

(01:01:32):
ed tech companies that I really root for, because they are coming from a place of trying to do something good for the classroom. There's plenty of them out there, but for example, in the most recent episode of the podcast, I talked to Dan Stitzel, and Dan was talking about when he saw Josh. I don't know how to pronounce Josh's last name. Uh, but the, the founder and originator of Gimkit, when he saw him speak

(01:01:54):
years ago, Josh as a high schooler.
Yeah.
Realized that he really liked gamified learning experiences and was really interested in coding, and in a high school class developed a tool that later led to him creating Gimkit, which is a gamified classroom thing. And so it's easy to cheer for Gimkit because it's this guy who

(01:02:16):
wanted, like, a certain thing to exist in the classroom, right? It wasn't, it wasn't a, like, a money grab. And so Short Answer, similarly: you have this thing that you're trying to work towards, which is finding a way to make this short, formative, targeted, in-class writing happen and be digestible and be fun for kids.

(01:02:36):
Um, and Dylan Wiliam, I know you've done a webinar with him too, correct? Is that right?
Yeah, we have.
Yeah.
And I'm happy to link it in the show notes.
Yeah, for sure.
Um, and so Dylan Wiliam talks a lot in his work about formative assessment and about the power of formative assessment. And I think that's a huge first step we can make in all classrooms in the age of AI,

(01:02:58):
and one we should have made before the age of AI, but also in writing classrooms: moving more of it to be formative and doing less summative. Because if we don't see the writing process and we don't see the little bits and pieces, it's easy to be tricked and to let AI sneak in. And if we do these short formative assessment things, number one, it's harder

(01:03:18):
to be tricked, and number two, it shows kids that our focus is on the process. Our focus is on them as writers. Our focus is on helping them grow. Our focus is not on catching them cheating. Our focus is not on grading, right? Our focus is on what's happening during the thing. So, so that's my roundabout and long way to say, please

(01:03:40):
do tell us about Short Answer.
I, I know that it feels awkward because you're, you, you don't wanna feel like you're selling here, but I think people need to hear about it. Everybody comes on this show and talks about tools that they love. Short Answer is a tool that I, I love. It's a tool that you love, you love it for extra reasons, but tell us about it. Go for it.
Yeah.
I mean, so, context: I was a teacher for seven years. It grew out of that experience.

(01:04:01):
It started as my master's capstone project at Stanford. We got some grant funding and launched a year ago. So it's very new. It's only been around for about a year. The fastest way to get it across is just: think, like, Kahoot for writing. Kids sign in with a code. You send out writing prompts or questions to students, and students respond. It's all constructed response. They can type their text or attach pictures.

(01:04:22):
They rate their confidence in their submission, which I think is an important point, and submit it. And then once the teacher's got all those responses, they push them back to the class. The really important part of this is, it's all based around peer-to-peer interaction and creating social experiences for kids. I think that's really important and I want to come back to it. But so, the teacher pushes responses back to the class, and then, in a variety

(01:04:43):
of gamified activities, the kids will see each other's responses...
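To make that flow concrete for readers skimming the transcript, here is a tiny illustrative model of the round trip Adam just described: a student joins with a code, submits a constructed response with a confidence rating, and the teacher pushes the responses back out for peer-facing activities. This is not Short Answer's actual code or API; every class, field, and value below is a hypothetical placeholder.

```python
# Illustrative-only model of the workflow described above (join code -> prompt ->
# constructed responses with a confidence rating -> teacher pushes responses back
# to the class for peer activities). Not Short Answer's real code or API.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Response:
    student: str
    text: str
    confidence: int           # e.g., 1 (not sure) to 5 (very sure)
    image_attached: bool = False

@dataclass
class WritingPrompt:
    join_code: str
    question: str
    responses: List[Response] = field(default_factory=list)
    shared_with_class: bool = False

    def submit(self, student: str, text: str, confidence: int) -> None:
        """A student submits a constructed response with a confidence rating."""
        self.responses.append(Response(student, text, confidence))

    def push_back_to_class(self) -> List[Response]:
        """Teacher releases the collected responses for peer-facing activities."""
        self.shared_with_class = True
        return self.responses

# Example round trip
prompt = WritingPrompt(join_code="ABC123", question="What makes this claim convincing?")
prompt.submit("Student A", "The author cites two primary sources...", confidence=4)
prompt.submit("Student B", "It repeats the claim but never supports it.", confidence=2)
for r in prompt.push_back_to_class():
    print(f"{r.student} (confidence {r.confidence}): {r.text}")
```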
Uh, I hate to leave you with a cliffhanger, folks, but this is a to-be-continued situation. I'm really sorry, but we will pick up where we left off with Adam in next week's episode. He's got more to tell us about the functionality of Short Answer, which I am super excited about,

(01:05:05):
as well as the rest of his tips for teaching writing in the age of AI. You are not going to want to miss the rest of the conversation. There is so much goodness in it. So make sure you're subscribed so that you don't miss it. Again, it's coming next week. So, whatever app you're in right now, Spotify, YouTube, uh, Apple Podcasts, whatever you're in, make sure you're

(01:05:27):
subscribed or following the show, so that it'll come right into your feed as soon as that episode drops next week and so that you don't miss it. Also, before we jump into the next part, before you go, I do want to point out: Short Answer has been a sponsor on the show before and will be a sponsor in the future. I intentionally didn't include them as a sponsor today because I thought it was kind of a conflict of interest for them to sponsor the episode that Adam was in.

(01:05:49):
Those are two completely different ventures, them sponsoring the show and Adam being on the show. He's on the show as an educator. And when I have a company like Short Answer sponsor the show, or like MirrorTalk from Swivl today, or Vizor today, these are companies that I believe in. So that's why Short Answer is also a sponsor. Um, certainly I'm not going to turn away a great sponsor that

(01:06:09):
helps fund the making of this show. Uh, they're a great tool. They're not a sponsor for today, but they'll be back as a sponsor in the future.
Uh, before you go, before we wrap up the show, let's take a quick look at some education news that has caught my attention. First up, an update from one of my very favorite tools, Pear Deck. So we know that many tools now let teachers generate activities

(01:06:31):
and lessons with AI, but those tools don't have the awesome lesson delivery features that Pear Deck does. And now their users have access to AI-powered Instant Pear Decks. As you'd guess, you just enter a topic, a standard, a link, or even a file, and bam, a full lesson is ready to go right inside Pear Deck. As

(01:06:53):
always, we know pedagogy and our content better than an AI tool does. So if you try it, make sure you proofread it. And then afterwards, please tell me, did it do a good job? Did it hit the, hit the standard well? Did it ask good questions? Was it easy to use? I want to hear about it. So reach out, preferably on Bluesky, or via email, or the SpeakPipe for

(01:07:14):
the show, or something like that, and let me know, did it work well? Um, if you'd like to see this in action before trying it out, I will put a link in the show notes to a video from my good friend Stacey Roshan, where she, uh, did a tutorial and demo of this feature and some other new Pear Deck features.
Next up, Canva added a new feature called Dream Lab that lets teachers generate

(01:07:36):
curriculum-aligned visuals in seconds. So detailed plant cells, historical timelines, whatever you need, just type it in and Dream Lab makes it for you. It could be a game changer for making abstract concepts more accessible and engaging. I know there were times in my classroom, whether it was when I was teaching math, or when I was teaching science, or when I was teaching STEM,

(01:07:57):
where I'd spend, like, my whole planning period just trying to make a visual for the next lesson. And it really was such a small facet of the lesson, but it was such a big thing in terms of my students' understanding, if that makes sense. Like, so important. It seemed, it seemed like such a small thing to spend a lot of time on, but so important that it deserved that much time. Now, with something like Dream Lab, we could potentially have it made automatically.

(01:08:18):
Now, I haven't tried it out in depth and tried a bunch of things with it. So if you do try out Dream Lab in Canva, again, reach out. Let me know what you think of it. Yeah, hit me up on Bluesky, uh, or the SpeakPipe for the show, or something like that, and tell me how it worked.
Next up, a Google-y update. They have added a new way to manage who can respond to Forms. For most of us, it's

(01:08:41):
just a new interface to have to learn, and the buttons that we're used to clicking are no longer in the same places. For those of us who create tutorials, uh, it means you have to re-record your Google Forms videos. Sorry about that. Uh, but it does have some benefits. I think this is actually a good change for some of us. It's just gonna be a pain, but for teachers in particular, if you're using this in your classroom, uh, I think there's going to be a benefit.

(01:09:03):
So previously, you could choose to let anyone who has the link respond to a Google Form, which was kind of the default, or you could restrict it to people in your domain. That was your only option. You could either say limit it to my domain, or you could say anybody who has the link can respond, right? And you just wouldn't give the link to everybody. But technically, if Joe Schmo in who-knows-where got the link, they could respond

(01:09:27):
to your form, or everybody in your domain could. Now the problem would be: what if you limited it to everybody in your domain, and it's something for your first period class, and you don't want your second period class to respond? Somehow they get the link and they respond, right? There's nothing stopping them from that. So now this new feature does stop that. Now you can set your form so that only specific students or

(01:09:50):
classes or groups can respond. So you could say just these users, or just this Google Classroom group, this roster, or just a group of email addresses can respond. So you could designate who the form works for. It won't work for anybody else. And what you could do then is, in that second period class later, when you're ready for them to be able to respond,

(01:10:10):
the link could be there and you just haven't opened it up. Kind of a nice way, if you're using Google Forms for assessments: you could post it in, you know, say, Google Classroom or whatever, and then make that change later. There's better ways to do it, but you could say, you know, the link's there, but it's not going to work until I put your email address in.
So those are three new updates. Um, AI-generated Instant Pear Decks, Canva Dream Lab, and new restrictions

(01:10:35):
on who can access a Google Form. So I'm curious, which one of those three updates are you most excited about? Again, let me know. On Bluesky, use the hashtag #EduDuctTape. Tell me all about it. If you want to let me know on a different, um, social media, that's totally fine too. I'm on all of them, but Bluesky is currently the one I'm most excited about. There's also a SpeakPipe for the show where you can record that feedback,

(01:10:57):
uh, and otherwise you're certainly welcome to reach out via email, too.
Well, just like how my wife eventually helps my kids find their flannel PJ pants or their toothbrush, it's time to find the end of this episode. She's still waiting for that magical kid request system, but luckily, today's sponsor Vizor makes managing IT issues in schools

(01:11:19):
easy with features like a self-service portal and automated tasks.
Students never need to yell for the tech person.
For special pricing and some awesome swag, head over to vizor.cloud/jake. That's V I Z O R, dot cloud, slash Jake. Trust me, Vizor makes managing Chromebooks look easier than my

(01:11:43):
wife makes managing our kids' stuff look. And thanks to Vizor, and thanks also to today's other sponsor, Swivl, and their new tool, MirrorTalk AI. Have a great day, everybody.