
July 29, 2025 48 mins

In schools with limited resources, large class sizes, and wide differences in student ability, individualized learning has become a necessity. Artificial intelligence offers powerful tools to help meet those needs, especially in underserved communities. But the way we introduce those tools matters.

This week, Matt Kirchner talks with Sam Whitaker, Director of Social Impact at StudyFetch, about how AI can support literacy, comprehension, and real learning outcomes when used with purpose. Sam shares his experience bringing AI education to a rural school in Uganda, where nearly every student had already used AI without formal guidance. The results of a two-hour project surprised everyone and revealed just how much potential exists when students are given the right tools.

The conversation covers AI as a literacy tool, how to design platforms that encourage learning rather than shortcutting, and why student-facing AI should preserve creativity, curiosity, and joy. Sam also explains how responsible use of AI can reduce educational inequality rather than reinforce it.

This is a hopeful, practical look at how education can evolve—if we build with intention.

Listen to learn:

  • Surprising lessons from working with students at a rural Ugandan school using artificial intelligence
  • What different MIT studies suggest about the impacts of AI use on memory and productivity
  • How AI can help U.S. literacy rates, and what far-reaching implications that will have
  • What China's AI education policy for six-year-olds might signal about the global race for responsible, guided AI use

3 Big Takeaways:

1. Responsible AI use must be taught early to prevent misuse and promote real learning. Sam compares AI to handing over a car without driver’s ed—powerful but dangerous without structure. When AI is used to do the thinking for students, it stifles creativity and long-term retention instead of developing it.

2. AI can help close educational gaps in schools that lack the resources for individualized learning. In many underserved districts, large class sizes make one-on-one instruction nearly impossible. AI tools can adapt to students’ needs in real time, offering personalized learning that would otherwise be out of reach.

3. AI can play a key role in addressing the U.S. literacy crisis. Sam points out that 70% of U.S. inmates read at a fourth-grade level or below, and 85% of juvenile offenders can’t read. Adaptive AI tools are now being developed to assess, support, and gradually improve literacy for students who have been left behind.


We want to hear from you! Send us a text.

Instagram - Facebook - YouTube - TikTok - Twitter - LinkedIn


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Matt Kirchner (00:00):
Welcome to another episode of The TechEd

(00:10):
Podcast. I am your host, Matt Kirchner. I was at a huge artificial intelligence event this week, actually, in the shadow of Lambeau Field in Green Bay, and one of the speakers said this. She said that artificial intelligence is the great democratizer in this economy and in this age. And you

(00:31):
know, as I thought about that, I couldn't agree with her more. The truth of the matter is that AI doesn't pick favorites. Artificial intelligence is available to so many people in so many different ways. And so I think we are entering into an era, maybe already in an era, where it doesn't matter your background, it doesn't matter your education, it doesn't matter your economic status. None of that stuff is important

(00:53):
in terms of your access and ability to invent and to innovate using artificial intelligence. Now, that's not going to happen on autopilot. There are some things that have to take place along the way, but I do believe that this is one of those technologies that, if we play our cards right, can be available to everybody. It can do an incredible job of knocking down disparity across our

(01:13):
economy here in the United States and, as we're going to talk about today, around the globe. We're talking about that with a guest that I'm really, really, really excited to welcome to the studio of the TechEd Podcast from Philly. Our guest is Sam Whitaker, the Director of Social Impact at StudyFetch. We're going to learn in a little bit about the work that he's doing. But I want to start, first of all, Sam, by welcoming you to

(01:35):
the TechEd Podcast. Thanks so much for being with us.

Sam Whitaker (01:38):
Thank you for having me. And just to confirm, it's home of the world champion Philadelphia Eagles. It's not just Philly anymore.

Matt Kirchner (01:46):
Yeah, well, you're saying that to the guy that was sitting at Lambeau Field yesterday. What we should really do, if we had more time, is compare the number of world championships, and then you know which of those two teams had the first three of them. But that's a topic for another day. We're going to talk about artificial intelligence instead, and I want to dive into that with you, Sam. We're going to talk about an aspect of artificial

(02:08):
intelligence that doesn't necessarily get quite as much news as I think it should, and that's the potential positive impact for people in underserved populations, anybody coming from an underserved community. Before we get into that, I'd love to hear a little bit more about your role as Director of Social Impact and why that work is so important to you.

Sam Whitaker (02:28):
It's funny, I was having this conversation yesterday. I don't really have, like, an origin story for this. You know, I went to great schools, I had great teachers, I had a very supportive family, things like that. It's just always been something I'm passionate about, and it's actually how I got started at StudyFetch. My friends started the company. I'd known them for years, and we were talking, and I'd done work at a school, a charter school here in Philly

(02:49):
called Esperanza Academy. And the first thing I thought was, hey guys, can you donate some licenses to the school? They said, absolutely. And that was the genesis of me becoming all-encompassed with StudyFetch and AI education.

Matt Kirchner (03:03):
Awesome. And my Spanish might be a little bit rusty, but I took a ton of it when I was going to school and spent some time in Spain. Esperanza Academy would be the Academy of Hope, if I'm not mistaken. And we're going to talk about hope today, talk about the hope that AI can provide to so many individuals across so many different socioeconomic backgrounds, different geographical backgrounds. Let's

(03:26):
start with this. How do you think AI platforms in general can start to address some of the educational disparity that we see, particularly in rural and inner-city communities? I live in Milwaukee. It is still one of the most economically and racially segregated cities in the world, sad to say. A lot of progress is being made, certainly a different world than it was in the 70s and 80s when I was growing up here, but so much

(03:48):
work still to be done. How does an AI platform help address some of those differences in educational opportunities for our young people and people of all ages?

Sam Whitaker (03:56):
Well, what you started out with, the democratization of opportunity, I agree with you 100%. But then you made one point that I think is very important, which is: if we do it right. That's the key. Students need to be taught, and need to learn, how to use AI appropriately and use it for good, for lack of a better phrase. And that needs to happen early, and it needs to happen soon. Because if that doesn't

(04:19):
happen, and if it's still the underserved populations that are just using unrestricted ChatGPT, while students in what we would consider more privileged settings are being taught to use it appropriately, then that gap will actually increase, as opposed to decrease. It's about creating an environment, an AI environment, where students can use AI specifically to learn. So

(04:40):
with products that are built for learning. ChatGPT is a productivity tool. That's what it's built for. That's what it's supposed to do: it's supposed to make you better at your job. It's supposed to allow you to accomplish more in less time. That's not necessarily what you want for students. Students need to learn. They need to learn how to learn. And with studies coming out, I don't know if you've seen the

(05:01):
recent MIT study comparing, real quick, ChatGPT, Google, and basically handwritten essay writers over four months. The results were pretty staggering. The ChatGPT group saw decreased alpha brain activity. Their work was very generic. They couldn't remember what they wrote, let alone recreate it. And probably the worst thing is, after four

(05:23):
months, those effects lingered, and the students had to kind of relearn how to be creative and generate content on their own. And we're so early into this. That's after four months. How long before those effects become permanent? We don't know yet.

Matt Kirchner (05:38):
Those are, again, really fascinating observations. I'll point to another study from MIT, and kind of juxtapose the two of them. Noy and Zhang did one probably two or three years ago, both students at MIT. They basically took college-educated professionals, divided them into two groups, gave one group ChatGPT, and let the other one do it the old-fashioned way. They found that the group using ChatGPT, number one,

(06:00):
finished their work 37% faster. Number two, the quality of their work improved. And number three, their job satisfaction in real time improved as well. So that kind of points to the benefits of ChatGPT. And certainly, who wouldn't want to be able to do their work faster, more accurately, and enjoy it more? Nothing wrong with any of those things. But at the same time, to your point, if we're not utilizing certain aspects of our

(06:22):
brains, and I'm not pretending to understand the science behind this, in a way that we learn, in a way that that learning becomes real to us, in a way that we're thinking through it, that we're thinking critically, thinking creatively, and training our brains how to think, as opposed to just relying on a pre-trained transformer to do that for us, then there's a huge loss in terms of retention, among other things, if that's the way

(06:44):
that we're leveraging artificial intelligence. Is that correct?

Sam Whitaker (06:48):
I agree 100%. I always make the analogy to learning to drive. So when students are learning to drive, they get a learner's permit, they take some classes, they drive with their parents for a while, they can't drive at night for a while. And eventually we've deemed that they've successfully completed that and they've mastered driving to an extent. Although, when I was 16, I definitely hadn't mastered driving.

Matt Kirchner (07:06):
I'm not sure I have yet, but I'm still working on that.

Sam Whitaker (07:10):
But we don't have that process right now. We're handing them a car. An automobile is a big and powerful and dangerous machine, and so is AI, and if we don't teach kids how to use it appropriately, it's very, very dangerous.

Matt Kirchner (07:23):
You know, a friend of mine years and years ago told me this lesson. He had two daughters that were learning to drive, and he said, you never want to take a 3,000- or 4,000-pound vehicle and move it in a direction without knowing what you're doing and looking exactly where you're going. And it's kind of the same thing here, where, you know, we think about AI as a powerful machine, and indeed it is. And anybody who's used ChatGPT or other GPTs certainly has the ability to

(07:46):
appreciate and recognize the power there. But in the same way that we get in big, big trouble taking a motor vehicle and running at 70 miles an hour down the interstate without some controls, without some guardrails, without some understanding of how it's operating and how to do it safely, we can get ourselves in big, big trouble. That's certainly true if we're driving here in the United States of America. It's certainly true if we are driving in other parts

(08:10):
of the globe. You had an opportunity to do just that. We can't drive from here to Africa; you have to take a plane or a boat to do that, and you actually did. You traveled to Uganda, as I understand it, and over there you were introducing AI to students at African Rural University and the Uganda Rural Development and Training Program girls' school. That just sounds fascinating to me. I mean, you start thinking about, you know, how we're using AI here in the West, here in the

(08:31):
United States. I would love to hear a little bit about what you saw in Africa. What familiarity do those students have with AI? Was it similar to what you would see here? What about their teachers? And how did that experience maybe differ in some ways from what we're seeing here in the United States as well?

Sam Whitaker (08:49):
So it's interesting that the segue there was from driving to Africa, because in a lot of Africa, driving is a big part of it. The school was actually seven hours away from the airport, a seven-hour drive, but only to go 180 kilometers. The roads are really more pothole than road. They actually have a phrase for it. They call it an African massage, because you're going like this the entire time, and

(09:11):
there's a lot of weaving, and lanes don't really exist. But all of that was completely worth it just to be there. So to answer your question, to start, familiarity with AI was one of the first questions I asked, to try to get a baseline. There were 30 girls in the class, and I asked, how many of you know what AI stands for? Everybody raised their

(09:32):
hand: artificial intelligence. How many of you are using AI? 28 out of the 30 students were already using AI. To give some reference again, seven hours from the airport, this is about as far afield as you can get where you still have internet access, and it's mostly ChatGPT and DeepSeek. And then I asked, how many of you have taken classes to learn how to use AI, or how did you learn how to

(09:56):
use AI? And the answer was always, well, I just started using it. That was a little bit scary, because, honestly, to your point, it's not very different from what's happening here.

Matt Kirchner (10:09):
Yeah, interesting. It's curious to me. You think about AI here in the United States, and certainly we're very familiar with DeepSeek, you know, one of the Chinese versions of generative artificial intelligence, really some innovative technology in terms of how they're doing searches, and doing it in a way that's more efficient and in some cases consumes less energy as well. We could probably do a whole episode just on the juxtaposition between ChatGPT

(10:30):
and DeepSeek. But interesting: we don't hear about DeepSeek much here in the United States yet. I know it's available, and people are talking about it, but it's certainly not as ubiquitous as some of the others. And you're saying it was as common as ChatGPT when you were in Uganda?

Sam Whitaker (10:45):
It sounded like probably about 50/50.

Matt Kirchner (10:49):
Yeah, fascinating. So we'll see where that technology takes us, and also how different cultures are going to influence how we're basically democratizing and spreading artificial intelligence around the globe, whether we're right next to an airport here in the United States or seven hours from one in Uganda. Why Uganda? I mean, of all the places you could pick to go and study artificial intelligence, why did you pick Uganda?

Sam Whitaker (11:10):
It wasn't so much picking as it was just kind of where it happened. I was at a conference in Singapore at the end of last year, and I met a teacher who had done some work at this school in Uganda, and we started talking. It was a long process. I mean, it was like six months to really get everybody on board. And then we did a lot of training with the teachers virtually, where we were

(11:32):
showing them the tools, and they were starting to use them a little bit. And then when we got there, we made sure that all the students were set up with our software, StudyFetch, and made sure that they were ready to go. And when we were there, I mean, there were still generator issues, and the internet went down a few times. But it was worth it. And monkeys were running through the campus the whole time, which was awesome. I

(11:54):
was saying when I landed, I was like, I gotta see a monkey before I go. And my friend, the teacher who came with me, she was like, don't worry, you'll see a monkey. And as soon as we pulled up... I mean, the campus is built into the forest, so there are 2,000 monkeys that live just off of it. They just kind of run back and forth through it the entire day. It was amazing.

Matt Kirchner (12:15):
That is so cool. Yeah, that is really cool, and certainly a lot different from what we see, of course, here in the United States. First of all, by coincidence, this morning, in our studio here at the TechEd Podcast, our internet went down. That almost never happens anymore, so it's kind of an anomaly here, but somebody had to go reset the router. We've never had somebody have to go fire up the generator just so we could get our internet working, and

(12:36):
certainly not having to dodge the monkeys while they were doing it. So that sounds like a really, really cool experience that you had. Come back and we'll do an episode on DeepSeek and monkeys, right? That would just be fascinating stuff. But for the time being, we're going to stick to this topic of the democratization of artificial intelligence. What role do you see AI playing? You know, I teed it up a little bit in the intro: this conference, this

(12:56):
meeting I was at yesterday, where somebody described it as the great democratizer, and especially in terms of high-quality learning. You talked about the importance of learning to use generative AI the right way, learning the right things. I mean, do you see this as really creating an environment where there's less disparity in terms of, you know, whether it's socioeconomic, geographical, political, whatever, access to education?

(13:19):
Do you see it having that impact?

Sam Whitaker (13:22):
100%, it can, if we do it right. Back to that. So going back to Uganda really quickly. We ended up having about two hours with our software where the students could really just sit down and learn. I found an NVIDIA video, about an hour long, on deep learning and neural networks, which is the basis for all GPU software design. And the students had never

(13:42):
heard of NVIDIA. They had never heard of deep learning or neural networks, anything like that. About 60% of their education is actually agricultural as well; they're learning to live their lives in their world when they graduate. This was nowhere on their radar. So they had about two hours with our software and this video, and they were tasked with just learning, and they were preparing to give

(14:04):
a presentation at the end based on what they learned. And when we got to it, I had no idea what to expect. I really didn't. I was just happy that we were there, and happy that they were introduced to the software and that they were learning. And the students, really all of them, even the professors, were kind of timid, a little nervous, when we first started. But then as you get into it, it's just a classroom, and the

(14:25):
students are getting more excited, and they're laughing and they're joking with each other. And when they gave these presentations, one girl wrote a poem about neural networks. Another wrote an entire short story on deep learning. Another one wrote a song. And then some did just great presentations. They're in no way experts, not at all, but this is advanced stuff, and they can hold a conversation about

(14:47):
it now, after two hours. That's what can happen with AI, but again, if we do it right. So if you look at democratization and kind of the haves and have-nots: China, for instance, is mandating AI education for students starting at six years old, starting this year. And I guarantee you they're not just giving them access to DeepSeek. They're teaching responsible AI use. They're using platforms to

(15:08):
teach students how to use AI appropriately so they can build the next generation of DeepSeek engineers. However, in other countries, in the US, for instance, we're being incredibly hesitant with how we roll out student-facing AI, and it's born of fear, and I get that completely. But the sad fact of the matter is that if we wait, we lose. We have to

(15:33):
act now, and that's where the democratization comes in. We don't want to get to a point where it's the haves and the have-nots. We want to create equality of opportunity across the globe, and the only way we can do that is if everyone is rooted in responsible AI skills.

Matt Kirchner (15:51):
Everyone is rooted in responsible AI skills, for sure. And I've got a bit of a mantra going through life, Sam, which is that I don't need to understand everything about something in order to be able to use it. In other words, I don't need to understand everything my laptop is doing in order to be able to use it. Same thing with my smartphone, for that matter; same thing with my smart car anymore. I can use the smart aspects of it without understanding why or how it's doing what it's doing. Autonomous

(16:14):
vehicles, same thing. I rode in one not too long ago in Phoenix, and you get in a car and there's no driver. I don't know how it does that. I kind of do, but I don't need to know that in order to get from point A to point B. And yet, you're talking about, and you did it in two hours, and it's not like you're creating AI scientists or PhDs in artificial intelligence, but you're talking about teaching deep learning, and you're talking about teaching

(16:35):
neural networks. Why is it important to teach some of the framework of artificial intelligence, what it does and how it does it, in order for a student to use it responsibly? Or, I guess, do we have to teach that first of all? And if so, what?

Sam Whitaker (16:48):
I think we do. I think a basic base of knowledge is good. To your point, you don't have to understand everything about neural networks in order to use ChatGPT; I agree with that. But a breadth of knowledge is just a good thing to have, and being able to have a conversation and understand where these things are coming from is incredibly useful. And it wasn't specifically because we

(17:09):
were doing AI that we were talking about AI. It just happened to be a good video that I thought would be good, on a topic I was fairly sure they would know nothing about. That's kind of a baseline.

Matt Kirchner (17:19):
Yeah. And it proved to you, then, that you could use this platform to teach students a topic that they know nothing about, enough so that in two hours they're writing poems and singing songs and doing presentations about that technology, which is really, really cool. I want to switch gears a little bit and talk about a related topic, for sure: this whole idea of AI literacy, the idea that students

(17:39):
are learning how to use AI tools. Really, really important. But you also talk about AI and literacy, putting that word "and" in the middle: using AI as a tool to help students who are behind in such subjects as reading and writing, and maybe, you know, written comprehension or verbal comprehension. Help us understand the landscape of literacy. Let's start here in the United States. I know we've been talking about Uganda, but

(18:01):
are students falling behind now in these areas, in terms of reading and writing and so on? And if so, do we see this problem worse in certain communities than others?

Sam Whitaker (18:10):
They're not falling behind. They are behind. I've been doing a lot of research on literacy recently, and when you really dive into the stats, it's horrifying. Some of the scariest stats are when you look at the prison population: 70% of inmates in the United States read at a fourth-grade level or below. 85% of juvenile offenders can't read. You want to talk

(18:32):
about making an impact on society. And then there's a multiplicative impact of education in detention centers in terms of recidivism. The more inmates are educated, the less likely they are to end up back in prison. It just shows so clearly how education is so important. And if you think

(18:52):
about traditional literacy, it is, I mean, it's the basis of being able to be functional in society. You need to be able to fill out forms, you need to be able to read signs, you need to be able to do things. And there are kind of levels of literacy. We don't really talk about grade levels anymore. We talk about functional literacy, and then meeting expectations, and then exceptional literacy.

(19:12):
So someone may be able to read a newspaper, but just because they can understand all the words, that doesn't mean that they understood the impact of what is being said. That's comprehension; there's a distinct difference there. And then you talk about functional literacy, things like banking, things like healthcare. And now we're introducing AI, and AI is going to be, it already is, and

(19:35):
it's going to be even more of a required skill for success in the world. So what we're trying to do, and also it's tangible: traditional literacy is something that's tangible in districts and with policymakers, whereas AI still isn't really quite there. So we're trying to solve, and help to solve, the traditional literacy problem here, starting

(19:58):
in the United States, while also teaching AI literacy at the same time, so tackling two of the many, many pillars that need to be hit.

Matt Kirchner (20:07):
Got it. Yeah, years and years ago, and I mean when I say years and years, years and years ago, when I was in Scouting, one of my friends was working on his Eagle project. Of course, to be an Eagle Scout, many people know, most people maybe know, you have to do a service project, right? You organize it, you plan it, you lead it, you do the post-mortem and talk about the impact of it, and so on. And one of my friends, his project was to paint a

(20:30):
literacy center in Milwaukee. It wasn't quite the central city, but it was the near north side of Milwaukee, we would call it. And we went in, and we spent a weekend just painting this literacy center and kind of beautifying it. And it was my first introduction, growing up in a, you know, middle-class suburb, to the idea that there were a lot of people in my community, or the greater community in Milwaukee, who didn't know how to read. And it was really fascinating

(20:51):
to me, and not in a good way. But this whole idea of, how do you function, how do you learn, how do you do a job, how do you gain knowledge, especially then, when tools like smartphones weren't around and other ways that we have to learn weren't readily available. If you wanted to learn, you went to a lecture, you watched a TV show, or you read a book. That was how you learned. And so that was a real eye-opener for me. And yet, even in this day and age, when it's, you know, 30 or 40

(21:14):
years later, there are still tons and tons of people who don't read at the level they could, or some people, you know, hardly read at all. And to your point, it has a huge impact on recidivism. We spend a lot of time in corrections, and the more educated somebody is, the more they can come out and have a skill and get a job that helps pay the bills, the less likely they are to re-offend.

(21:35):
And it's exponential. So, huge impact on the literacy side. Certainly, talking about artificial intelligence is kind of another chapter of that, right? In the same way that if somebody couldn't read, or can't read now, life is going to be really, really tough for them, if they don't have a command of artificial intelligence, probably the same thing is coming at them. So I get the two of those separately. Let's talk

(21:57):
about the two of those together. So artificial intelligence: is it a tool to help with reading, with comprehension, to help improve learning? Talk about that.

Sam Whitaker (22:06):
Without a doubt. And not just, when you're talking about demographics, for, you know, areas where literacy is particularly low; there are also cross-demographic communities. A friend of mine, she's actually the Secretary of Education of Oklahoma, and she likes to call them the jagged-edge pieces. When we're talking about the puzzle that makes up the United States of America, we're all jagged

(22:27):
edges in some way, and we're figuring it out together. Some students are more jagged-edged, and we call them neurodivergent, or we say they're on a spectrum of some kind. There are so many that are failed, across demographics, in education. And when we talk about how AI can affect that, it's about personalization. It's about meeting students where they are and also when they are.

(22:50):
So AI has the ability, for instance, with the literacy modules we're building now, we're building out ways to not only assess where a student is now, but as a student starts to read a little bit more and starts to get a little bit better, challenge them a little bit. And then if they're having a little bit of trouble, you back off a little bit, and then you challenge them again. And slowly that inches up. But that process is different for every student,

(23:12):
and in our education system we try to fit every student into this box. And if you're in this box and you do it well, you're smart. If you're not in that box, you're stupid. And even though we don't say that, that's the impression kids get so early on, and by second or third grade so many just give up, because they just think, I'm dumb, I'm never gonna do

(23:35):
this.
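
The assess, challenge, back off loop Sam describes above can be pictured with a small, hypothetical sketch. This is not StudyFetch's actual implementation; the thresholds, step size, and scoring scale are illustrative assumptions. The idea is simply that the difficulty of the next passage rises when a learner's comprehension scores stay high and eases off when they drop.

from dataclasses import dataclass

@dataclass
class LearnerState:
    level: float = 3.0   # current reading-level estimate (illustrative units)
    step: float = 0.25   # how far to move the level after each passage

def next_level(state: LearnerState, comprehension_score: float) -> LearnerState:
    # Adjust the target reading level from the last comprehension score (0.0 to 1.0).
    # High scores nudge the level up; low scores back it off; middling scores hold steady.
    if comprehension_score >= 0.85:      # reading comfortably: challenge a little more
        state.level += state.step
    elif comprehension_score < 0.60:     # struggling: ease off
        state.level = max(1.0, state.level - state.step)
    # Otherwise keep the level where it is and let the learner consolidate.
    return state

# Example session: comprehension scores from a sequence of short passages.
if __name__ == "__main__":
    learner = LearnerState()
    for score in [0.90, 0.92, 0.55, 0.70, 0.88]:
        learner = next_level(learner, score)
        print(f"score={score:.2f} -> next passage at level {learner.level:.2f}")

In a real system the scoring and leveling would come from assessment data rather than fixed thresholds, but the back-and-forth shape of the loop is the point: every student's path through it is different.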

Matt Kirchner (23:36):
Yeah, no, you're exactly right, and we talk about that a lot on this podcast. Our audience probably is sick of me telling them about my journey through grade school and middle school and high school, and how I just learned differently. I couldn't sit still. I did fine at school. I mean, I had to put the effort in and get through it, and I did. But there were so much better ways for me to learn than the traditional education model of sitting in a classroom and listening to somebody lecture. And look, there are a lot of

(23:57):
students that learn great that way. I know a lot of them in my own life that are really good at just listening, taking notes, and capturing all the knowledge. Great for them. Not so great for the other, whatever percentage of students it is, that don't exactly learn that way, and we try to fit everybody into this exact same model of education. I agree with you that artificial intelligence is the great opportunity creator in terms of changing the way that students learn. I agree that

(24:20):
we're going to get away from, if not already, grading people by what grade they're in. It's really: do you have the competency, do you have the ability, or do you not? And if you don't have it, you're not going to be punished for that. Let's figure out how we can instill that competency in you, if it's important to you and it's important to society. And if you do have it, there's no reason for you to sit next to the student that doesn't yet and have to wait for that student,

(24:41):
because it's holding other students back. And by the way, those same two students, if you're in a different class, may be in different positions, where the student that maybe struggled a little bit more in an English class goes into a technical education or STEM class and excels, while the student excelling in English maybe struggles a little bit more there, just as one example. So it's not saying good student, bad student. It's

(25:02):
saying we're all different. It's saying we all have different interests, we all have different ways of learning. And AI really creates an opportunity for us to customize and personalize learning in a way that we've never, ever done before. That's one of the great opportunities that comes from generative AI and AI in general. There are some risks that come along with it, and among educators, there are some that are

(25:22):
saying, look, you know, we need to embrace this, we need to find ways to use it. There are other educators that are saying, look at ChatGPT, it's just doing the work for the students. So how do we redesign the experience in the classroom? Right? Because our classrooms are set up for a sage-on-the-stage, one-size-fits-all, lockstep learning model, and now we're talking about having 30 students in a classroom, 30 different modes of

(25:43):
learning and ways of learning. How do we prepare the classroom for the new way of teaching and learning in the age of artificial intelligence?

Sam Whitaker (25:50):
More than anything, we need to get started, and we need to start now. We have to get student-facing AI into classrooms now. That's why we're starting where we are. I was talking about literacy being tangible: with so much fear around AI and so much hesitance, what we're doing is tackling something that is tangible, that is a crisis, and that people realize is a crisis. And to that end, we're actually going to be offering it, spinning it

(26:11):
off as a completely separate product, separate from our core StudyFetch, and we're going to be offering it as close to free as we possibly can. In fact, we're working with some companies and foundations to offer it for free. We're spinning it off because it's not a profit center. We believe it's a necessity in society. And to that end, we're going to be working with any competitors,

(26:31):
any experts, anyone out there who's listening now or who, you know, is reading the transcript. If you want to help and get involved, we'll bring you on. We don't care who you are, if you help us get to a solution. But that's where we have to start, with those steps. We have to start with getting it in front of students, and we have to be successful solving a tangible issue, and then it grows from

(26:54):
there. And we have to figure out different ways to evaluate students. It can't be the typical, you know, go home and write this essay. It's too easy to say, hey, ChatGPT, write something about Catcher in the Rye that sounds like me. It's done-in-class assessments, understanding and real retention, having discussions, almost kind of mirroring, like, a dissertation in college,

(27:15):
but at much younger levels.

Matt Kirchner (27:18):
You know, one of the most fascinating things I saw on TikTok was basically showing students how they can use software that's been created to help people with Lou Gehrig's disease, or ALS, speak. So it's basically language-generating software for people that can no longer communicate on their own, and they're using that software, combining it with generative AI like ChatGPT, having content written by

(27:42):
ChatGPT, filtering it through the language software that actually makes the words sound like the students themselves, right, in the way that they would speak. And then you think about a poor teacher that's trying to police the use of generative AI outside of the classroom. It's like, that's a fool's errand. The students are in some ways going to be so far ahead of teachers

(28:02):
that are trying to catch cheating that we're really going to have to flip the education model. And you know, I used to say, we used to say, that school was the place we went to learn, and home was the place we went to practice. You'd go to school to gain information, to sit in the class, to have the teacher download their wisdom to you. Now we go into this world of the future using artificial intelligence,

(28:24):
and we're not necessarily going to be learning at school and going home to do homework and practice. We're going to be maybe learning outside the classroom and then going back into the classroom to do the practicing and to do the work and to talk about the meaning of what we've read, or to apply what we've read, or to do, in this case, you know, some of the songs and the stories and the poems that came from this work

(28:45):
you did in Uganda. Do you see it the same way? I mean, are we unleashing creativity in ways that we haven't been able to in the past? And is that world of education going to flip in that way, that we're actually going to class to experience and to perform and to practice what we learn and to perfect what we learn in ways that we never have in the past?

Sam Whitaker (29:03):
So I always say with AI, and we kind of mentioned spectrums earlier, I think AI is absolutely not a spectrum. AI is going to be extremes. It's going to be great, or it's going to be awful. I don't really see how you have an in-between. Because if it's great, if we teach students appropriate use early on, and we teach them to leverage AI to expand and make

(29:25):
them more productive and make them more creative, then I can't even imagine the amazing impacts on society. But if we don't, you're talking about the death of creativity. You're talking about the death of critical thinking skills. You're talking about, I mean, you've seen the movie WALL-E, where all of the humans are just on these hovercraft wheelchairs, and they

(29:46):
just get scooted around, and they watch shows and they eat food, and that's it. Sounds incredibly dystopian, but it could happen. I mean, AI is going to be able to do everything for us. We may almost even have to choose to work. I mean, think about it: when we don't have to, when AI can take care of all the necessities, the only way we're going to choose to

(30:07):
work is if we're doing something we love. And that's another area. AI can start early and identify things kids love and things they're good at. Make it a part of their learning journey. Identify things kids are good at and that they enjoy early on, make that a part of the entire educational journey, and that can lead into apprenticeships. When did vo-tech

(30:28):
schools become such kind of a bad word in the United States? Somehow they got that stigma, that that was where the dumb kids went. Not every kid needs to go to a four-year college, necessarily. Some kind of apprenticeship path, where students, even students who are on, like, a PhD path for machine learning and are going to work for NVIDIA someday, that doesn't mean an apprenticeship along the way can't be super

(30:49):
valuable, and even more valuable than a classroom experience, perhaps.

Matt Kirchner (30:54):
Absolutely, depending on the student, without question. And we're a huge fan of our technical and community colleges across the United States of America. They are in vogue again. They never should have not been, because the truth of the matter is, there's huge pride and incredible opportunities for careers that can be born out of our technical colleges. But you and I agree 100%. You know, we start to think about, and this is really something that I've been locked

(31:16):
in on for a while, I don't know, and I don't mean to put you on the spot, but have you read Genesis yet, by Kissinger? Henry Kissinger's book, Genesis. He wrote it, and they published it posthumously. You'd love it. You've got to read this book. Yeah, we'll link it up in the show notes. I've talked about it probably five or six times on the podcast. There's a whole section, Sam, in that book about two things that you just mentioned. Number one, think about a world in which we don't

(31:40):
have to work. Think about a world in which your food is farmed for you, created for you, prepared for you. Your house is heated using advancements in energy that require very few resources. You don't have to drive anywhere. Basically, your lawn is mowed, your kids' diapers are changed for you, using robotics and automation, where,

(32:00):
literally, if you don't want to, you can just not work. You can just sit and do nothing, and it doesn't really cost you much, depending on how all that plays together. That's a really, really different world from the world in which we live. And they go into some of the concerns about, you know, we get a fair amount of purpose from the work that we do. Some people love their jobs, like me; some people are so-so on them; whatever. There's a purpose that comes from going to work every day

(32:22):
that goes away when we don't have to do that. They go into this whole idea of what happens if we let AI go unguarded, right? There are no guardrails, and we create these artificially intelligent technologies that start to go to war with each other, right? So I've got one AI and another AI, and one decides that the other one is moving in

(32:42):
on its space, and so it decides to attack it, and it decides to take over a weapon system, where it decides to destroy the other AI's data center. I mean, you used the word dystopian, and I don't mean to go too far down that dark path, but you and I agree. I mean, if we don't think about how artificial intelligence gets deployed, if we don't have some guardrails around the ethics and the uses, and training people on

(33:04):
how to use it and not to use it, things can end really badly. And on the other hand, it has a tremendous ability to solve problems in healthcare and improve our quality of life and give people opportunities that they've never had before. But it really is, I mean, the way that you put it, both in terms of the value of work and what AI is capable of doing in these two extremes, you and I are on the same page with regard to that.

(33:25):
And so I think it's going to be really interesting to see how it plays out, but also really, really important to make sure that we're utilizing AI and generating public policy and thinking about how we regulate and don't regulate it, because I think over-regulation of artificial intelligence can be a real problem as well. But reflecting on all that a little bit, are you seeing it the same way?

Sam Whitaker (33:46):
In many ways, yes. So I'll go back to Uganda, because that's kind of where we started. The founder of the school there, Dr. Musheshe, is one of those guys you meet where you're just like, wow, I'm really glad I got to spend some of this man's time with him. He's incredibly inspirational. You know, he's got a great backstory. He exposed some corruption early in his career, from people who were stealing

(34:08):
money from a nonprofit, and he exposed it. There was an incident with a grenade thrown into his house, and shrapnel, and then he ended up getting the grant, and he founded ARU and URDT. Just an amazing person. And we were having a conversation, and he was talking about a lot of these issues. He's not a religious man, but he was talking about another Genesis, right, God

(34:31):
creating man in His own image. And then he was talking about, you know, humanoid robots. And then we have AI that can reason and think, and that's going to go into these robots. And at what point are you blurring the lines between what's human and what isn't human, what has certain rights and what doesn't have certain rights? When you

(34:54):
talk about safeguards and guardrails, I agree with you completely. And the true dystopian stuff, the T-1000, you know, all that Skynet: if that's gonna happen, there's probably not a whole lot I can do about it. So I'm gonna focus on the stuff I can do something about. To me, the more insidious way AI takes over is us letting it happen, right?

(35:16):
Just becoming complacent and becoming lazier, lazy to the nth degree, where AI is doing everything and we're not doing anything. And that's, back to what we're talking about, where we have to step in, and we have to step in now.

Matt Kirchner (35:31):
So let's talk about that. You know, I think you actually dovetailed that right into the next topic that I wanted to cover. We've talked about some of the challenges with AI. I agree on the whole idea that once we're creating humanoid robots that have the ability to think and reason like humans, but maybe don't have the innate, I believe innate, ethical tendencies that I think are

(35:53):
endemic in most people, and they just go rogue, that's something to worry about. All right, so let's bring this closer to home for our teachers. We have a lot of educators that listen to this podcast, and we're proud to have them along every week. And I know the reason that a lot of them listen is because they want to figure out, okay, what am I supposed to do about this? So, you know, I get that we should be worried. I get that we have to do it the right way. You know, you talk about the

(36:15):
idea that the issue in education isn't the tool, it's not the AI itself; as you've already referenced, it's how we introduce it in the classroom. If I'm a teacher and I'm listening to this podcast, and I'm saying, okay, I get it, there's risk here, but what do you want me to do, specifically, if possible? What is it that we want them to do? What's the call to action for an instructor, a teacher, a

(36:36):
professor in education?

Sam Whitaker (36:38):
There's a phrase, and I think it was mainly associated with the early days of Facebook: move fast and break stuff. I hate that phrase. I really do. I think it just pushes thought away, it pushes preparation away, and it just gives people an excuse to throw something out there without really thinking about the ramifications. Having said that, we kind of have to do it a

(37:01):
little bit. We have to be as thoughtful as we can be, we have to be as purposeful as we can be, but we can't do that to the point where we're waiting too long for a perfect solution. It's another, I'll keep up with the phrases, a good plan today is better than a perfect plan tomorrow. We have to get some stuff into the hands of students and figure out if it works. And

(37:22):
so from a teacher's perspective, I understand the fear, I understand the hesitance, all of those things, but you're doing your students a disservice if you're not teaching them appropriate use of AI. And there has to be a constant feedback loop between teachers, between researchers, and between industry, and we have to figure out what's working, what

(37:44):
isn't working, fix it, try something else, come back, try it again. These traditional methods of research, where it's a year-long study, followed by six months of peer review, followed by finally publishing, and then maybe a few years from that you see the results: the MIT study I was talking about, for instance, the lead researcher specifically said in their release, I intentionally did not submit this for peer review.

(38:06):
I have now, but I wanted to publish it first, because I believe the findings are so important right now. Teachers have to be a part of that, and industry has to welcome teacher input and teachers being involved, teacher involvement across the board. And the ones who do, and I can say that we do, we answer every single

(38:27):
teacher who reaches out to us on social media. We answer anyone who emails us, and we take their thoughts, their concerns, and their suggestions under advisement, and we put them into the platform as quickly as we can, because teachers do know best.

Matt Kirchner (38:40):
Yeah, in many ways they do, for sure, and that's what drives continuous improvement: listening to the voice of the customer, listening to the voice of the user, building improvement into the model, moving quickly in many cases. You know, I think about it when you juxtapose education versus industry. I spent the first 25-plus years of my career outside of the world of education. The last 10 I've been

(39:01):
inside, or at least tangentially inside, education, spending much, much more time with educators than I ever have in the past. To be honest with you, in industry we didn't peer-review anything. I mean, you came up with a good idea, you didn't hand that over to a group of experts to tell you it was a good idea. You implemented it, and you saw if it worked. I used to say, especially in small companies,

(39:21):
what I loved about being in small to mid-sized businesses was that I could come up with a change on Monday, I could implement it on Wednesday, and I could see the results on Friday, and if I didn't like the results on Friday, Monday I could start that whole process over again, right? And you compare that to, in some cases, the several-years-long process in academia and in some areas of research that it takes to be able to advance the

(39:43):
technology. This is moving so fast. I mean, think about where we were, you know, with generative AI just a year or two ago, and where we are today. If you're waiting two years to study what GPT-3 looked like to figure out what we should be doing today, you're just never, ever, ever going to keep up. So I think it's really, really good advice for our higher educators, the idea of getting started now,

(40:04):
doing it in a responsible way, educating themselves, and for our K-8 and K-12 educators as well, super important. What about students? So if I'm a student listening to this podcast, how should my teachers be making sure that I don't get lost and lose my creativity and that whole idea of wonder and joy in education? What advice would you have for a teacher and for a student to make sure that we

(40:25):
don't lose what can be really, really special about the educational experience, which is really opening our eyes to the magic of what's out there and not getting too focused on somebody else's version of reality?

Sam Whitaker (40:36):
Man, it's tough for students, because I think back to when I was a student, you know, 10 years old, 12 years old, 14 years old, and you have adults who say things to you all the time, like, trust me, you'll regret that when you're older. Fine, let me just go back to watching TV, right? Yeah, you really should probably start working out now; your older self will thank you for that.

Matt Kirchner (40:57):
Oh, for me it was like, yeah, you think sixth grade is hard? Just wait till you get to seventh grade. And boy, you have no idea what's coming when you get to high school. And then you get there, and it's like, nah, this isn't so bad.

Sam Whitaker (41:07):
It's hard. Imagine me saying that to myself now, when it's the easiest thing in the world to use ChatGPT to do your work for you. I know a kid who graduated from a top-tier university, and when I say top tier, I mean, like, top 10. He just graduated this past May, got a job, and is going to be out in the workforce, or probably is

(41:28):
out in the workforce now. He did not complete an assignment himself for his last two years. ChatGPT did every single assignment. And what did he learn, right? What did he learn? But he got the diploma. It's really hard to say that to a student when it's so easy. And that's why, I mean, I say it absolutely, and if it hits anybody, if anybody hears it and says, okay, you're right, I really am going to do this, I'm going to

(41:49):
look for ways that I can learn appropriately. But there's a reason that we educate, and that we protect children as much as we can until they reach an age where we've deemed that they're old enough, just like the driver's ed analogy, right? I think for students: keep being curious, but also get out there. Show up other places. Don't use just the AI. Build your

(42:11):
interpersonal skills, because they're so important. But it's really incumbent on the rest of us to make sure that students have a safe way to learn.

Matt Kirchner (42:20):
And the truth is, for a student, you know, yeah, I was always a path-of-least-resistance guy myself, right? I mean, why do extra work? We bang it into our heads in manufacturing: why do more work than we should to get to the end, right? That's why we call it overproduction or overprocessing in manufacturing; if you're putting more effort in than you need to, that's just a form of waste. I get why a student would use generative AI

(42:43):
to complete their coursework, if that's what gets them to the diploma, and the diploma is their goal. There's really no differentiation in that, though, right? Any student that can speak the right prompt into ChatGPT can then complete that assignment. Where's the differentiation in being a student and being a human being? So I certainly understand both sides of it, right: taking the path of least resistance, because that's what I always did in education, but also recognizing that

(43:05):
you're probably missing an opportunity to learn when we do that. And then the greater message, I think, is to education in general, which is: let's create an environment where that isn't what it takes to get a diploma, and let's have people learning throughout their journey. And I know you learned a ton through education, but you've talked already about some of the things you believe about education, personalized learning, those kinds of things. Are there other things

(43:26):
you believe about education that would surprise other people, or that might be a little bit outside the mainstream?

Sam Whitaker (43:33):
So honestly, I think something that would be outside of the mainstream, especially based on a lot of the talk, if you read a lot of the headlines about AI and education: I don't think we're screwed. That's the bottom line. I think there's absolutely hope. I believe there's hope. I think we have a chance to turn around not only what we're doing in AI, but a system that's been failing students for 60-plus years. As

(43:53):
many times as we talk about the dangers of AI, I believe wholeheartedly that if people like you and people like me and many people that I know keep pushing forward, keep having these conversations, keep doing the right thing in our businesses, we're going to see a brighter future. We're going to see better education. We're

(44:15):
going to see more equality of opportunity. We're going to see equality of access and all these wonderful things that we want. I truly believe it's going to happen.

Matt Kirchner (44:24):
Yeah, I believe it, 100%. Everything you're saying is so aligned with my view on education. I'm always careful also, by the way, inasmuch as we have so many educators that listen to the podcast, when I criticize, and I won't speak for you, but you can tell me if you disagree, when you criticize the model of education that's been around for 60 years. In many cases, our model hasn't really changed much since shortly after World War Two. Believe it or

(44:46):
not, if you look at most American classrooms, they look about the same as they did 60 and 70 years ago. That's certainly not to criticize the great teachers that we have across the United States of America, who I know are going in for all the right reasons and imparting wisdom and preparing students for life and doing incredible work, and in many cases not getting anywhere near the gratitude that they deserve. So we're always careful to make sure that criticism of

(45:08):
the model of education is not necessarily criticism of the educator. In fact, a lot of times I think the model that we've created has held a lot of great educators back. So, huge optimism for the future, as you have for the future of education. I appreciate the way that you answered that question. One final question for Sam Whitaker. It's a question we love asking all our guests here on the TechEd Podcast. You

(45:29):
turned the clock back a moment ago, and I listened intently when you said 10, 12, or 14 years old, kind of thinking of yourself as a student. We're going to click the clock one year forward, to the age of 15. You're a sophomore in high school. That young Sam Whitaker has his entire life ahead of him. And if you could give that young man one piece of advice, what would it be?

Sam Whitaker (45:48):
Just show up. Go to conferences, join clubs, get out in the world. I promise you, you will never regret the reel you didn't watch or the meme you didn't share. So many good things happen when you just show up, when you get out there, when you have conversations, when you meet people. You never know where you're going to end up.

(46:09):
And while there's a concern about AI that people will stop showing up, the more we do, the more AI is just going to enhance things. It's going to enhance the rest of our lives. So for my 15-year-old self: I was generally okay about it. You know, I did a lot of things, but I wish I did more. I wish I did it all, right? I try to live that way now. So I talk to students all the time, and that's the one thing I say: just show up. Just

(46:32):
do it.

Matt Kirchner (46:33):
I've got a really good friend who talks about this; "we'll rest in the next life" is his line. I'm like, I love that one. He's like, let's just get it done now. Let's get out and do as much as we possibly can. The truth is, if we use AI right, if we leverage it, if we use it responsibly, that's going to give us more opportunities to interact with each other and to have those social opportunities, opportunities to learn, opportunities to engage in person, as opposed to just

(46:53):
flipping a thumb on the screen of our smartphones. Totally agree with you. Great, great advice: just show up. We're really happy, by the way, that Sam Whitaker showed up for this episode of The TechEd Podcast. One more example of that. We love meeting fascinating people, and he certainly meets that definition. So, Sam, can't thank you enough. Sam Whitaker, Director of Social Impact at StudyFetch, thanks for being with us on the TechEd Podcast. Thank you so much. It was a pleasure.

(47:16):
Terrific conversation with Sam Whitaker, talking about how artificial intelligence is transforming our entire economy, talking about how it's transforming different parts of the globe, and how areas like Uganda are doing things maybe in some ways the same and in other ways differently than we are here in the United States of America. The importance of literacy, the importance of guardrails, the importance of educators bringing AI confidently into the classroom,

(47:39):
but doing it in a way that is responsible, that prepares our students for the future, that prepares them to understand that, in the age of artificial intelligence, where technology can do so much for us, it's still so very important to show up. We talked about a lot of different studies, we talked about different articles, we talked about some books, we talked about some resources. We are going to link those all up in the show notes for our

(48:01):
audience. We do have the best show notes in the business. People tell us that all the time. So if you heard something you want to learn more about, rest assured, it's going to show up as a link in those show notes. You will find those at TechEdpodcast.com/whitaker. That is TechEdpodcast.com/w-h-i-t-a-k-e-r. When you're done with that, and right before you show

(48:23):
up doing something in person, check us out on social media. We are on Facebook, we are on Instagram, we are on TikTok, we are on LinkedIn. Wherever you go to consume your social media, you will find the TechEd Podcast. While you're there, reach out and say hello. We would love to hear from you. And in the same vein, we would love to see you again next week on the TechEd Podcast. Until then, I am your host, Matt Kirchner.

(48:43):
Thank you for being with us.
