Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
You are listening to
the Teach Middle East podcast
connecting, developing and empowering educators.
Speaker 2 (00:17):
Hey everyone, Lisa
Grace here with the Teach Middle
East podcast, and they say if you're great, you can come back again, and if you're rubbish... I'm joking. That is a joke. But it's the second time we're having Dan Fitzpatrick, the AI educator, on the podcast, talking all things AI. Loads to discuss.
Actually, stick a pin.
(00:37):
The last time we had Dan on the podcast was June 2023, and so many things have changed since then. I'm sure his work and his reach have expanded, because I think at that time, when I was speaking to him, it was the beginnings of things, and now things have grown and taken shape.
(01:00):
Welcome, Dan, to the podcast.
Speaker 1 (01:02):
Thanks for having me.
It's great to be back.
I can't believe it's been that long and, yes, as you alluded to, so much has happened in those two years in this crazy world of AI.
Speaker 2 (01:13):
Yeah, I think at that time people were just getting their heads around the fact that, OK, AI, especially generative AI, because AI has been around forever, but generative AI is here and it's not going anywhere and it's only going to improve. I think at that time we were kind of in the, what is it? Do we ban it, do we not?
(01:33):
How do we manage it? That kind of thing.
But let's kick off the podcast with a straight question, Dan: what have you seen change since the last time we spoke? Give me the skinny latte version.
Speaker 1 (01:49):
How long have you got? That's why you want the skinny version, I suppose. Going back to June 2023, my goodness, we were still in that phase where there weren't many school leaders or schools really looking at this in any depth. It was very, very early days.
(02:10):
I think my first book had been out two months by then, and it was very much, I think: how do we start to use this new kind of technology?
So I think one of the biggest shifts people have had to deal with is the fact that this works unlike any other technology we've been able to use in the past, simply because it speaks our language, unlike every other.
(02:30):
You know, literally, people have had to invent languages, and there'll be teachers listening to this who probably teach those languages so students can manipulate computer systems and so on. But even the language of knowing how to press all the buttons in the right order to get something out of a piece of software or out of your computer.
We now live in a world (and we're gradually still
(02:52):
phasing into that world) where the technology speaks our language, and so I think that's one of the big journeys I've seen over the last couple of years: people almost trying to get their heads around that, and I don't think we're anywhere near it. This week alone, I think I've been in four different schools where I've still had to get that message across.
And the great news, I think, and another thing I've seen
(03:13):
happen over the last couple of years, is that teachers are starting to realize that the skills they've already got are enough to do some really amazing things with this technology. Because, at its core, that's what teaching is: being able to say things in a way that elicits certain responses from students, and gets other people, whether students or, maybe if we're a line manager in a school,
(03:34):
colleagues, to reach a certain standard. And that's kind of the underlying skill of using something like an AI chatbot.
Now, obviously, over those two years as well, we've moved from just the basic use of an AI chatbot, like I like to say, text in, text out, and now we're starting to see so many more
(03:55):
different types of generative AI. So, from AI that creates music, AI that you can talk to just like we're talking, AI that creates videos, creates software... yeah, it's been like an explosion of lots of different types of generative AI, and then a race for those generative AIs
(04:15):
to get better and better and better.
Yeah, I often say... in fact, I remember somebody saying to me around the time we last spoke on the podcast. I think they came up to me after my presentation and said, oh, you're going to have to change this presentation quite regularly because things are moving so fast.
And it's interesting, because I do and I don't.
I feel like all that's happening is the knowledge of
(04:38):
the audience is just growing wider and wider and wider. So we still have teachers who've never used it before, and now we have teachers at the other extreme who are using it every day. So it's almost like the message that I try to get across now just has to reach a much wider audience, rather than just an audience who don't know much about it, which makes it
(05:02):
really difficult to try and aim it at every single skill level in the room. But it's a good challenge.
Speaker 2 (05:09):
Yeah, you know, as you were talking, I was reflecting on the fact that at the time when we spoke last, people were still worried about whether or not this thing would actually wipe us out as humans. Do you think that fear has changed at all, or has it grown since the AI has become better?
Speaker 1 (05:32):
Well, literally about an hour ago, I posted something on LinkedIn. I was just listening to a podcast that just came out with probably the world's leading AI safety expert, and he's still very much of the view that this is not going to end well. So I think the fear is still there. But in fact, something he said on that podcast was
(05:54):
that humans are really good, if they don't have control of something, at just kind of putting it to the back of their minds and living life normally anyway.
Now, that's probably not the most reassuring message your listeners want to hear, but I think what I take away from that is we're still in the starting stages, I think, of where this technology will go, even though
(06:16):
we're, let's say, three years in since ChatGPT was released, or just about three years. We're still in that kind of infancy period, and I often say to people, especially if you've ever had a toddler or if you've got toddlers at home: my toddlers, Jacob and Matilda, they blow my mind, they absolutely blow my mind. Sometimes they do something or say something and
(06:38):
I'm like, wow, we created that, that's amazing.
And then I leave the room and come back in and Jacob's drawn on the wall with a crayon or something, and I'm like, we've got some issues here, we've got some problems. And I feel like that's where AI is still at. We're still at that period where we're like, this blows my mind. But then I suddenly think, we've got concerns here, we've got
(06:59):
issues, we've got problems. And I think we're still in that kind of early infancy. Where it goes from here, I think, is very much up to us, and by us I mean society.
And some people, in that moment, kind of laugh and go, well, Dan, have you ever come across big tech? They're not exactly the safest, or the people you might want in
(07:20):
control of a computer that could cause probably the most damage. Absolutely, and it's hard to reconcile that really, but that's the way it is in the world right now. But I suppose the control we do have, in our own sphere of influence as teachers and educational leaders, is that we can decide what comes through the door of our schools. We can decide the skills that our students have.
(07:40):
We can decide the values we instill in our students, so that when they do come across maybe the less desirable side of this technology, they're in a good place to be able to make the right decisions.
Speaker 2 (07:55):
And as it relates to safety: I've been listening to several podcasts as well, and they're talking about safety and guardrails and how we in schools ensure that this is used safely, because I'm beyond the don't-use-it point now. How do we ensure it's used safely?
Speaker 1 (08:18):
Yeah, I suppose we've got two different dynamics going on here. One is using AI in our current structure of education, but what we're starting to see is other organizations going, right, well, how do we shift the orthodox practice of education
(08:42):
in terms of what we've always done?
So I suppose, looking at the first point, which is probably the most practical for where people are at right now, it goes back to those three things I said: we have to control the tools. We have to control, or at least have a major influence over, the skills our students are developing and the values that they are growing as well, because when they leave us as
(09:06):
well, at the end of the day, or before they come to school, we've got no control over what technology they're using. And just to give one example, they could very easily navigate their way to a website that will create images of anything they want. If you are confronted with a tool like that and your values
(09:26):
are not developed and your critical thinking skills are not present, it's a recipe for disaster, really, and we have to safeguard that child, but also the children who are around them, because they could use it in a naive way to harm them. Maybe not from a malicious point of view. I'm a big believer that kids don't act out
(09:46):
of a malicious mindset; there are other needs going on there. But you still need to safeguard them from that too, whether it's the person using it or the people who could be affected, and our teachers as well and those who work in education. So it's not an easy answer. It's a really, really messy answer, and I think what it does is it gets to the heart of, well, what's education for?
(10:09):
I was speaking at the Cottesmore AI conference in the south of England just a couple of days ago, and one of the questions I really tried to, in fact I got the audience to try and answer this question: what is education for? Let's just take kind of standard education systems: what is the purpose? And I got everyone to shout out one word that they could think
(10:32):
of, and we must have had about 50 different words come out, and I think that's the issue. I don't think it's a bad problem, but I think the problem is we can't really agree on what we're supposed to be doing here. And despite that, we still do an amazing job and teachers do a fantastic job. But what are we preparing our students for?
(10:52):
And I think the prevailing attitude, and I'm painting very broad strokes, but I think the attitude that normally wins out, especially in my context in the English education system at the moment, is that students learn as much content as they can by the time they end school in order to prove that they know that content and then get a grade to go to a university,
(11:14):
something like that. And I think, because of that, a lot of other things get pushed to the side. So we might get drop-down days for students, we might get tutor time for students, where they then have to cover digital literacy, media literacy, sex education, values education, whatever you might call that, whatever
(11:35):
that might be. And it's those peripheral things, probably what we associate with the soft skills, that are becoming more and more important, and that values-based education, I think, is going to be super important when it comes to being confronted with these tools.
Is that a watertight way to stop our students from being
(11:56):
harmed? Not at all. But I mean, we live in a world where our students are accessing social media. We know that it's very, very difficult to do that anyway, and that hasn't changed and, arguably, may be a bit more difficult now. It's not a positive message, but we can influence and we can
(12:17):
do it. We have to make a concerted effort, though, and that's probably going to mean that we end up having to adjust what we do in school to not just focus on what we've traditionally focused on, but to look at other areas as well.

Have you heard of the Alpha Schools in the United States?
I have, yeah.

I'm going to be cheeky and ask you what your thoughts
(12:39):
are on them.

As you probably know, I write for Forbes, and I tried to write an article on them, probably about a year and a half ago now, when they'd just started and only had one school, and I found it really difficult to find out what they were using in terms of artificial intelligence. So when I was trying to do my research, I was trying to figure out, well, what are they doing here? Because they claim, don't they, that within two hours a day
(13:07):
they can teach just as much to their students as a normal school does in a full day, and then they can spend the rest of the day focusing on the more social side of education and, I suppose, those things we were just talking about, those kind of peripheral, softer skills, which sounds amazing. I'd love to know what they're using for their AI. I'd love to know that. As far as I could get was, it wasn't exactly clear cut.
(13:27):
I think they used a mixture of different things, and I think they were probably still in an experimental phase; they probably still are. Some weren't as keen to talk about everything they were doing. I've seen some of the data, like the two-hour stat. I haven't interrogated that data, so I don't know how robust it is and how much it would stand up to interrogation, and
(13:49):
I also wonder about the sustainability of a model like that. It's a lot easier to start getting results very quickly, but how long can you sustain those results? Saying that, I applaud the innovation, and there was a school, I forget the name of it, in England about a year and a half ago that started doing something similar with just a
(14:12):
small cohort of students, and my thing is, if it fails, it fails.
But that's how you innovate. You have to try new things, and I know that's tricky in education because we're literally dealing with the lives of the students who are in that experiment at that moment in
(14:32):
time. So you've got to do it in a really, really safe way. If you're going to do innovation, you've got to do it in a way where you have a backup, where students aren't being failed if the system that you've innovated on does fail. But I applaud the risk-taking, because I'm a big believer that we need to find another way, especially if education is going
(14:53):
to be effective within an AI world, and every indicator tells us we're moving very, very fast towards a world of ubiquitous AI.
Speaker 2 (15:02):
Yeah, I looked into them because I was curious. I even wrote about it on LinkedIn, because I wondered, could the model work here, if it's so successful? Because it seems to be working somehow. I don't have all the data, and I don't know if what they've put in the public domain is actually verified data, but based on what they're saying, it is successful. I'm sure it won't be successful
(15:23):
for all students. But you know, the UAE is on the cusp of innovation and they like to really push the boundaries. So I thought, why not an Alpha Schools kind of model?
But I've been doing some reading recently, and I'm finding there's a pushback, especially among Silicon Valley
(15:47):
types, the ones who are making these things. They're putting their kids into really different, alternative types of schools where the technology takes a backseat. Why do you think they're doing that?
Speaker 1 (16:08):
Why do you think they're doing that? I think, for a number of reasons. I think we all know that constant technological use is not healthy, not even healthy for adults, never mind brains and characters that are still developing. So, coming back to your previous point as well, which I think fits this, I think what excites me is that we are getting different types of schools and we're getting more
(16:30):
and more choice, and I think what you said was so crucial: well, okay, Alpha Schools might pop up, or a forest school might pop up. It's not for everyone, but if it caters to a certain child or a certain family's values, then amazing, absolutely amazing. I'd rather have a hundred different types of schools than just one stale system that tries to cater to everyone.
(16:54):
There's that old adage: if you try to please everyone, you please no one, and I feel like the education system is the main example of that, really. So I feel like giving that choice is an exciting part of the next few years that I'm looking forward to, and I think it's long overdue. And I get quite excited about the non-tech
(17:14):
versions as well. I was in Bristol in England a few months ago and I was talking to a guy at a conference, and he's opening up schools, not just what we might call a forest school, but literally schools in forests, where the children kind of work and play in nature constantly and the school is a part of that nature.
(17:37):
And I think he's on to his fourth or fifth school now. So there's that extreme as well, which I don't blame parents and people for, because if their life is going to be surrounded by tech anyway, why not, especially at a young age, like elementary level, have those students have a
(17:57):
sanctuary from it, so that there is that balance?
And I think the key is balance. So I think schools probably should have a bit of balance there, like maybe some screen time in a meaningful way and then time to not have it. But you could also take a step back from there and go, well, what's the balance of the child's life completely, not
(18:18):
just in school? And if they're getting loads of tech outside of school, then maybe we do have primary or elementary education as a bit of a sanctuary. So the different choices really excite me, and I think it's clear when you go over to San Francisco. And I mean, it's not just the choice of the parents; there's literally state law there that is a bit more
(18:41):
advanced than the rest of the world on devices and devices being introduced to students, and so on.
I think, yeah, it could be a number of things. I think probably the most skeptical take, and probably the main reason, is that these people know, because they probably see the stats on this in their work, how harmful this technology
(19:02):
can be. But also, maybe it's that these parents and their families are surrounded by advanced tech all of the time, so that when they come home, they don't want their family to be in the sphere of advanced tech as well. They want to go, right, let's just have a break from it. So I think it's probably complex, which most things are,
(19:22):
multifaceted. But probably the most critical take on this I've seen is: they're giving it to our kids but not giving it to their own kids, and you could get very conspiratorial around that. But yeah, I think we know, though. We know, don't we? And it's not like Silicon Valley is keeping the data from us. We know that from our experience, especially from my
(19:43):
experience.
Yeah, I don't know what other parents feel, but... well, I've got two five-year-olds at the minute. They're five at the same time for about 10 days, because they're just under a year apart. But, especially in school holidays... we bought them an Amazon Fire tablet and put the apps on there that we wanted
(20:04):
them to have. It's like a pendulum in our house. So we let them use it for a little bit. Then we suddenly start realizing that they're taking themselves off to the kitchen table when they're tired to use it, using it as a bit of a crutch for when they're tired. So then we kind of wean them off it a bit. They get a bit frustrated with that. Then we don't have them out for months on end, and then all of a
(20:29):
sudden they'll find one of them in a cupboard somewhere and charge it again, and then we go on that cycle again. So we're, as a family, still trying to find that balance. But we know, because it's in headlines, because it's in data sets that are out there, that just going, here's a tablet, use it as much as you want, is probably the worst thing you could do for a child in this world right now. Saying that, it's a struggle to know what the balance is. So I suppose maybe Silicon Valley parents are doing this
(20:53):
because maybe they're just a bit more saturated with the reality than we are. I don't know.
Speaker 2 (20:58):
No, I mean, it's a good answer, but I also think: the advanced tech that they're surrounded by, okay, fine, they're developing it, but we're all surrounded by the same tech. If you look around your house... I just looked at my desk as we speak, I can't spin the camera around, but I've got an iPhone on a stand, I've got an iPad on that stand and I'm on a Mac talking
(21:19):
to you. I can't escape the thing. And so now I'm wondering, should schools become safety zones where students are protected from this tech, where this is the place where they are not bombarded by tech, where it's a tech-free zone,
(21:40):
for the most part? It can't be completely free, we get that, but should we be leaning the other way, because the tech is so pervasive everywhere else?
Speaker 1 (21:52):
Yeah, it's tricky, isn't it? Because I know sometimes when I'm using my iPhone, it's in a very meaningful way: it's for work, it's for contacting my family, and I see that as a positive. I don't see that as something that's having a negative effect on me. However, when I get into bed in the evening and I'm scrolling through video after video after video for 20 minutes, and then I
(22:13):
realize, what am I doing? I should be asleep. I've literally almost been in a trance for 20 minutes just flicking through videos. Then I'm like, that's completely the opposite. It's not meaningful, it's harming my sleep, it's harming the pattern of my day. I'm being used by the technology. I suppose that's the difference, isn't it? In the first example I'm using the technology;
(22:36):
in the second example the technology is using me, and I think that's probably quite a major distinction.
And, by the way, I'm not one of these people who says schools shouldn't ban phones and students should be allowed phones. But I think if you're going to ban phones because you're scared of the technology using the student, then we need to give
(22:59):
them something as a replacement, like a meaningful piece of technology that they can use, and not have all the time. Maybe that's a Chromebook, maybe it's an iPad, whatever, something that is controlled, that has the apps on it that they need to use for learning purposes, if and when the teacher decides. But yeah, it shouldn't be all the time. I don't think it should be. It's just not...
(23:20):
I don't know about you, but you probably spend a lot of time at your computer, and after an hour or so my head starts to just get a bit fuzzy, even if I'm using it in a meaningful way. I need to just close the lid, go make a coffee, go get some fresh air for a bit, and I think we need to be mindful of that
(23:41):
for our students as well. But yeah, like we said before, it's messy, we're in a messy situation, but we have to...
Speaker 2 (23:49):
We have to continue evaluating that, I think. Yeah, and I thought of it as well, because I think of these things a lot. One, because I'm an educator, and two, because I'm a parent of teenagers who don't have phones, who bug me: Mom, we're the only ones without a phone. I'm like, and you will be until you are 16, so really, get used to it.
(24:11):
They're 13, so they only have three years to go, so nothing too major. They've lived all their lives up till now without it. They've got a Nokia brick and they absolutely hate me for it, but I'm like, you will carry that brick, and it's only for calling Mum and Dad should there be an emergency.
Speaker 1 (24:29):
Yeah, I was reading an article recently about a guy who had three kids, two teenage daughters and a younger son, and when the teenage daughters left elementary school it was almost like a rite of passage that they would get a smartphone, so 11 years old. And then when it came to his son, he didn't seem
(24:51):
all that interested, so they decided, oh well, we'll just not do that. And he was saying the difference at like 12, 13 years old between his son and his daughters is like night and day, in terms of how they act, how they behave, where they play, how they play, social skills. Also, I think he said his son just feels a lot more like a
(25:15):
child, whereas his daughters grew up very quickly with the smartphones. It's just interesting to listen to you talk about it, because I'm going to go through that in a few years' time and I'm going to make those decisions.
Speaker 2 (25:28):
Yeah, I think it's individual to the child. I'm just scared to give them a smartphone that is going to completely suck their attention. So they do have iPads, but we monitor that very closely, and we hand it to them, and then time goes ding, ding, ding, time's up, they hand it back. We check the history, we put it away. Next, you know, they
(25:52):
know the times that they can use it. But a cell phone is with them all the time, and I'm like, how do you even control that?
Speaker 1 (25:59):
And what about their friends? Have their friends got smartphones? Yes. That's another layer of real complexity, because do you ever feel guilty that you're not letting them socialize in that digital way with their friends? Because I imagine that's quite a complex dynamic.
Speaker 2 (26:20):
It's hard. I don't feel guilt all the time, but there are some times when they have, like, stuff going on and they're like, oh, Mom, can I...?
Speaker 1 (26:29):
Yeah, that must be so tough. It's hard, even them just going into school the next day and not being part of a joke that was said on a social media platform the night before.
Speaker 2 (26:42):
For one son. My other son couldn't care less if the moon is blue or green. But for one of my twins, he does say, oh, they were standing around this meme and I didn't know what the meme meant, and blah, blah, blah. And I'm like, it's fine, it's a meme, it's going to die in five days, it doesn't matter. No one remembers it in a few days.
(27:02):
But let's talk about inequity, though, because it's one thing for middle-class parents to say, let's withhold this tech; they can have it at home, we're educated, we can guide them, da-da-da-da. What about the kids for whom school is the only place where they'll interface with this
(27:23):
level of tech, because there is deprivation at home? How do we manage that?
Speaker 1 (27:29):
Yeah, personally, I do think the school has to have a balance. I get the extremes, I get the forest school, but most schools probably need to just have that healthy balance where our students are able to use and are exposed to advanced technology that's going to
(27:51):
impact the rest of their lives, but then also are taught in a healthy way that this isn't everything. It's tricky, isn't it, because I think schooling hasn't necessarily done a good job for deprived families and deprived students in terms of tech anyway. So it's not like they come to school and get to do that, and in fact I think there's a general trend for more deprived
(28:15):
schools, or schools in deprived areas, I guess in the UK, to go very traditional and not use as much tech. So I think we've already got that inequality to a certain extent, which is probably being made worse by artificial intelligence, simply because if a student's got access and a student's got the skills, then just by default,
(28:38):
when the technology gets better, their capabilities are advancing, and obviously a student who doesn't have that is staying quite static in that field. So I think we've probably already got that.
I've been thinking about something a lot recently, and I was playing around with these ideas in a talk the other day. Sometimes I like to just scrap my talk and do new things.
(29:01):
When I was a teacher, I did a bit of stand-up comedy, and sometimes I like to do almost the equivalent of a new material night, where you just go, right, let's just try some ideas and see if they resonate, and so on. And I was talking about how in agrarian times, feudal times, the thing that made wealth was land: if
(29:23):
you were a landowner, you were among the people in society who were generating the wealth. And it led economists at the time to come up with the economic theory that, well, there are three ways to make wealth: there's land, there's capital and there's labor. And then we see, as we go into the industrial age, that land isn't as important, but capital and labor become really,
(29:46):
really important.
And around that time another economist said, well, actually there's a fourth factor here, and that is entrepreneurship. And I think what we've seen is we've entered the information age, kind of in the 80s, and now we're getting into probably the second stage of that, what people are calling the intelligence age. We're seeing the importance of land, and the importance of labor
(30:10):
and capital to a certain extent, decline, and actually the importance of entrepreneurial skills go through the roof, because anybody with a laptop can sit in a Starbucks, not own any land, and create a multi-million dollar company, and it's happening all the time. And some people are saying, especially people who work in
(30:31):
AI, that it's probably not going to be long until we have the first single-person business that becomes a unicorn, like a billion-dollar business, because of what these tools will allow us to do.
So I honestly think that the greatest thing we can do for equality right now is give our students not just
(30:54):
the skills but the awareness of what they could achieve with this technology. The fact that if you're from a council estate in Gateshead and you grew up in a family that has no money and you've never met anybody who's been an entrepreneur before, you don't just have to have an idea and then let it die.
(31:15):
Actually, there's now the capability, if you can get on a bit of tech, get on a laptop at school or whatever, to actually advance that and make something of it. And if that is now possible, which it pretty much is, then I think the greatest inequality factor here is that awareness
(31:35):
and those skills, or the confidence as well to go, right, I can create something and find my meaning in that creation, and actually maybe earn some money from that creation, start a business, whatever it might be. We're living in that entrepreneurial age, and I think we have been for a little while already.
Then how do we equip ourstudents to thrive in that world
and where, essentially, we'vegot an education system that was
formed mainly in in theindustrial side of things, where
our students, if you went to aestate school, you were kind of
largely trained to be part ofthat labor and to be able to use
(32:18):
the capital, so the machinery,the, the systems, whatever it
might be, and we still largely Ithink it's unfair because I'm
painting in very broad strokesand being general here but I
think largely we still kind oflive or we still kind of operate
in that system with aneducation.
But the world has moved on sincethen and you could in fact say
(32:41):
it's moved on twice since thenand now actually given our
students the hope, the values ofan entrepreneur, the.
And when I say entrepreneur Idon't just mean someone who
creates a business.
I think the true essence of anentrepreneur is someone who
creates a business.
I think the true essence of anentrepreneur is someone who
discovers a problem and can thenthink of a solution and bring
that solution to life.
(33:01):
So, kind of those three steps. And I think if we equip our students to be able to do that, then we are getting them ready for this world. If we don't, then we could have one school on one side of the street and another school on the other side of the street preparing their students completely differently for the world ahead. And I don't think we've ever had that
(33:22):
before. I think all schools, whether it be the elite private school or a comprehensive school on a council estate, have kind of still had the same main objective, right: we need to get you to pass these exams and go forward and get to the next stage. There might have been various levels of success there, but still the same objective. Whereas now I think we've reached a point where there are different objectives.
(33:43):
A school like Alpha School, a school like an enterprise school, has a very different objective to a school that's just getting students ready for an exam, and I think that would lead to huge inequalities.
Speaker 2 (33:56):
Yeah, I agree with you. I was thinking about the fact that a lot of students now are questioning whether or not they should even bother to go to university, whether they should just go ahead and pursue their entrepreneurial endeavors. And you can't blame them, because no one can seem to tell us what jobs we will have, what skills we will need, apart from
(34:18):
the human skills, which I don't call soft skills; I call them core skills. So, apart from those core skills, nobody else can seem to tell us what it is that we will be doing in the future. Who's going to get wiped out? Which jobs?
Speaker 1 (34:32):
I've listened to hundreds, and I do mean literally hundreds, of podcasts, and everybody says the same thing: we don't know.
Well, interestingly, and I hate taking an extreme attitude on this, but sometimes I'm just led there by the evidence, I think that we could be in a situation where very few jobs exist.
See, that's the problem.
Speaker 2 (34:51):
So what do we tell
students?
Speaker 1 (34:53):
Yeah, well, I think we need to prepare them not to go and take a job in somebody else's idea of what work should be, and that's the thing. And coming back to the entrepreneurial side: we're still going to need entrepreneurs in that world. We're still going to need to look for those problems and discover solutions, and I think that's still going to be necessary, that
(35:14):
mindset. So I think that's why, and I come back to the entrepreneurial side again, just preparing students to go and take a job that somebody's going to give them probably isn't the best we can do right now, because what happens when people stop giving those jobs? In that presentation I mentioned the other day,
(35:35):
I snipped a few headlines just from the last month, about five or six of them, put them on a slide, and they're from major, major newspapers and news agencies, about how students right now who've just graduated can't find jobs. The job market is drying up, and it's not solely down to AI, but AI is in the mix there, so it's already happening.
(35:57):
It's not like we're talking about something that's going to happen in 10 years; it's literally happening right now.
Speaker 2 (36:02):
My own children ask, what should we do? And I'm like, you know what, just do the best you can. Learn as much as you can, and we will see how things go as you move forward. I've got no advice for my own kids, let alone other people's children out there. All I can say to them is: listen, try to be the best
(36:23):
person you can, try to learn as much as you can, experiment, do different things, and hopefully the path will appear. I don't have anything else. That's all I have.
Speaker 1 (36:33):
I don't think the opposite has been true in the past, necessarily. I don't think we've gone, right, okay, you're 13, sit down, what do you want to do? You want to be an architect? Right, I'm going to really help you to... I'm sure there's the odd student like that. But I know from my personal experience, to a large extent I still don't know what I want to do, and I
(36:54):
don't think I ever have. I've always made sure I've been skilled at whatever I've done and always given myself those opportunities or those choices, but I've never really known what I want.
Speaker 2 (37:07):
Yeah, not in that way. I mean, you could have said lawyer or something, but now you're thinking maybe there will be no lawyers. I could have said accountant, you know, but maybe there will be no accountants. So it's just... but I heard Hinton, what's his name, David Hinton? Geoffrey, Geoffrey Hinton. Sorry, I said the wrong name.
(37:28):
I used to work with a David Hinton, by the way. Geoffrey Hinton. He said plumbing, and I was like, maybe... robots? I don't know.
Speaker 1 (37:38):
Yeah, he said that on The Diary of a CEO, didn't he? And that podcast I mentioned before, that I listened to this morning, was The Diary of a CEO, and he said something which made me think, ah. He said you can't even guarantee plumbing now, because of where we're moving. I don't know if you've seen the progress on humanoid robots? I've seen some, not a lot.
(37:59):
Well, the chances are, in the next three to five years, people are going to start having these at home, and if they've got the ability to learn anything in a split second because you ask them to, then if you've got your own humanoid robot at home and something happens with the plumbing, are you going to ring a plumber? Or are you just going to say to your humanoid robot, can you fix it for me? And it quickly downloads the manual and everything it needs
(38:20):
to know and goes and does it, because it has more dexterity than a human being, it has more strength than a human being. Now, people might be sat there listening to this thinking, this guy's gone mad, it sounds like some kind of sci-fi future. It's literally not. These things exist, and they're getting exponentially better all of the time, and this is going to be part of our reality.
(38:41):
I think we're going to have another ChatGPT moment when they become mainstream, where the world is kind of shocked, but these are going to be a reality.
I think it's interesting what you're saying, because in my last book, the one before the guide I've just released, Infinite Education, I wrote a chapter called Humans of the Gap, where I made the argument that we shouldn't be placing humans in
(39:01):
the gap of what AI can't do. Because if we do that, if we go, right, what can AI do? Okay, well, then we've got a big gap here, AI can't do this, this and this, right, that's our curriculum, now we're going to teach our students how to do those things. Then that gap is going to get smaller and smaller and smaller, to the point where it closes. And oh my
(39:22):
God, if we put the value or the dignity of our children on their work, and that's something we've done for hundreds of years now, work and dignity being kind of two sides of the same coin, then I think we're setting them up
(39:43):
not just for a future where they don't have a job; we're setting them up for a future where they're not going to feel like they've got any value in the world.
So I think we've got to change the script, and we've got to rethink where our value comes from, because it's not just the thing I do from nine to five every day. It's got to come from a deeper place, I think. I'm almost sounding spiritual here,
(40:03):
but it's more societal, really. And, by the way, this is nothing new. Throughout human history, we've created technology that has shifted how we function as a society. Again, in that book, Infinite Education, I do a bit of a historical timeline of this, going all the way back to agriculture and farming.
(40:23):
There was one point where families didn't mix, and if they mixed, it was to kill each other. We were small tribes living out wherever. When farming and agriculture were invented and started to grow, we literally had to learn how to get along with each other so we could take advantage of that technology, completely changing our skill set, completely changing our societal values,
(40:45):
creating society, really. And then obviously that leads to education and to healthcare, and then to politics and then to democracy. Now take that context and bring in the fact that some people are saying AI is going to be a larger shift than anything before. Then I think it's probably not outrageous to say we need to
(41:07):
expect another societal shift, something on the level of that, which is completely unfathomable, mainly because we don't know what it will be yet. But going back to that original point, one of the core things is that where our value comes from in society is going to have to change.
Speaker 2 (41:29):
Yeah, or we're just going to have to accept our doom. I know, I'm dying here. Listeners, I am being as positive as I can be, but after you've listened to one or two, three or four podcasts on this subject, you're like, what on earth is happening?
(41:49):
But let's take it back to education.
Speaker 1 (41:52):
I was in New York a couple of weeks ago and I had breakfast with somebody who works at DeepMind, and we were talking about AI and education, and I kind of said what I said to you before. I said, well, there's kind of two parts: there's the AI helping us with what we're already doing, and then there's the AI potentially disrupting education.
(42:13):
And she made a comment, she's a lovely person, a comment I've made many times myself: well, we're concerned about how we can help teachers and so on. And I said pretty much what you just said. I said, yeah, but DeepMind has got a bit of a history of ex-employees going on podcasts and saying this is going to be the end of the world.
(42:34):
So literally, people who've led in that team or led in that company have gone on to say, actually, this is a bit more concerning, which is why I'm kind of talking about this. I try to avoid these types of conversations, because I just don't think they're practical or help all that much with educators on the ground. But yeah, I just listened to that podcast this morning,
(42:54):
so it was fresh in my mind.
I think it's definitely something we need to do, and I think innovation has those two parts, doesn't it? Innovation should help right now with the practical, but it should also help us look to the horizon and go, right, well, if we look up, where are we headed? And I think sometimes it's good to just sit back and go, right, we know there are tools out there that are helping teachers save time
(43:15):
creating lesson plans, all that stuff that's been around for a few years. But actually we should probably, every now and then, have a conversation where we go, let's have a look towards the horizon and prepare ourselves.
Speaker 2 (43:28):
Listen, the direction of this podcast, I think, is necessary, because we could have sat here and I could have said, Dan, what tools have you discovered for teachers for AI? And I did think about doing that, you know. Yeah, we're going to talk about that in a minute. But I think,
(43:54):
when people plug this in, most people listen to us on their commute to school, or they play us... I've heard, I don't know, but I've heard that they play us in their classrooms when there are no students. I'm like, maybe when the students are there? No. But they need to... now, you know, we all need to start to think deeper about this, deeper than tools, because tools are good, they're cool, we love tools, but to what end?
(44:19):
And that's what I think this conversation unearthed. It's us going back and forth on how we are thinking about this, the thinking behind it. And no, we didn't have the answers, and we don't, and we probably never will, but it's good to think about it.
Speaker 1 (44:30):
Absolutely, yeah. And I think there aren't going to be objective answers; it's going to be different for every school, and I think it's going to be different for every family. That's another reason why I'm excited about the fact that there's more choice coming in terms of education. Because, you know what, you might get a family who are like, I've read
(44:50):
some of the research, I don't want to have any screen time, I'm going to send them to a forest school, whatever it might be. And then you might have another family who goes, no, I want them to use this all the time, I want them to be able to create with it, to be able to learn from it, this is what they'll be doing for the rest of their life. And you know what?
(45:16):
a slave of ai for the rest oflife, or whatever it might be.
I think humans find a way.
Yeah, people who've had theworst education ever go on to
become some of the biggestentrepreneurs.
Some people who've gone toethan and had the best education
in the world supposedlysometimes don't go very far,
it's like, and that tells meit's more about the character,
(45:37):
isn't it?
It's more about the values of aperson, it's more about their
ability to keep learning, andmaybe that's what we need to
focus on a bit more.
Speaker 2 (45:45):
Yeah, I agree with you. But educators know you as the AI educator, and now you have written a whole guide... yeah, yeah, I edited it... on some of the tools and some of the bigger things that are happening within this space. Tell me about it.
Speaker 1 (46:02):
So, if people are listening, I'm holding the book up now. It's called The Educator's 2026 AI Guide. Because things move on fast, and because there are some amazing teachers and schools around the world doing some incredible things, I wanted to try and capture the moment for the next academic year, so through to the end of
(46:23):
2026, and go, right, if you want to be out front with how some of the best schools are doing this right now, here's a bit of a manual and a bit of a companion for the next year. And it's split into three basic parts. The first part is the tools.
So I put surveys out, and if people follow me on LinkedIn
(46:43):
they might have seen, about six months ago, I put a lot of Google Forms out saying, right, tell me what your favorite tools are and why, for leaders, for teachers, engagement tools, all that kind of thing, and gathered it all back and looked at, well, what are the tools out there that are having the most impact for certain tasks or certain areas of the job, and for
(47:04):
students as well, and then kind of collated them. So we've got a chapter on top tools for teachers, tools for assessment and feedback, tools for engagement, tools for professional growth, tools for education leaders and so on, accessibility, and kind of just collated them into a bit of a directory, I
(47:26):
suppose, first and foremost.
Then the second part is: going around a lot of schools, I noticed there were seven kinds of trends in terms of AI adoption, in terms of what schools were doing. In terms of how they were auditing what AI they were using, how they were anchoring it in their practice, how a lot of them were acting very quickly, adapting to context, how they were
(47:47):
activating their teachers to use this, accelerating innovation and aiming high with innovation as well. So I reached out to some of the schools I'd worked with, but also some other schools, just to say, right, do you think you might exemplify any of these lessons? And then I got hundreds and hundreds of case studies coming in,
(48:08):
which was amazing, and was able to sift through and pick the ones I thought exemplified this the most at the moment and that I thought might help other schools.
So there are essentially seven case studies, and we've got case studies from around the world. We've got the United States, Ireland, England, the Far East, we've got one from Dubai and, yeah, another couple from
(48:31):
America. So this is not theoretical. It's, can I learn from someone who's actually implementing this right now? What are they actually doing? And I was really clear that we needed to see some data on impact in those case studies as well. So we've got some hard data from each of those schools.
And then the last bit is, I suppose, the bit that everyone's worrying about: those ethical concerns.
(48:53):
So there's a great chapter by R Kingsley on governance. There's a great chapter by Matthew Weems on data. Victoria Hedlund wrote a chapter on bias and how to manage that, and Dr Karen Boyd from the States wrote a chapter on the environmental impacts as well. So it's trying to give a snapshot of what's really
(49:14):
important to educators right now, over the next year: what tools could they possibly use, how could they integrate them into a meaningful strategy, and what concerns are there for them to look out for. So it's kind of the opposite; it's not the high-level kind of stuff we've just been talking about. It's very much on the ground, and my last book, Infinite
(49:35):
Education, was quite high level; it was the stuff that we're talking about now. So I really wanted to hit the ground and give teachers not just a book they could read on a weekend, but something they could carry around at school, that they could write in and make notes on. And I was really excited:
I was really excited.
We partnered with an AI company called Symbol, who made videos for every chapter. Using artificial intelligence, they've created a roughly 10-minute video for each chapter. So if you're not someone who likes to read all the time, if you're someone who hasn't got the time to sit down with a book, then there are some accompanying videos that give you an overview.
(50:17):
And I was really excited as well because we've got some amazing companies supporting it. So we've got Google, Grammarly, SchoolAI, Brisk, MagicSchool and Quizgecko, who supported us on that journey, and some of their innovators contributed as well, like little chapter introductions, which is great because I wanted to have those various points of view: educators, leaders, but also those who are in edtech companies creating these tools, and then we get the full kind of 360 of opinions on what's going on there.
So, yeah, to be honest, and I can say this without having a big head because I didn't write it, I wrote an introduction and an afterword, but as a
(51:01):
result I think I'm really proud of it. I think it's really good, and it's probably the best thing I've done so far, because it's just got that breadth of experience and voice in there from educators around the world. So, yeah, if anyone's looking for anything really practical, then...
Speaker 2 (51:17):
Can you hold that up again? If you are listening to this on audio, it's called The Educator's 2026 AI Guide and you can get it on Amazon.
Speaker 1 (51:27):
Yeah, and it became an Amazon bestseller a couple of nights ago, which I'm really proud of as well. So thanks for letting me plug it, Lisa. Appreciate that.
Speaker 2 (51:34):
No problem. So, final question, and it is my final question on the pod, and I have to end on a tool, just because I want to go check it myself: one tool that you have discovered recently that you are excited to share, that other people can check out.
(51:56):
Preferably something free, but it's okay if they have to pay.
Speaker 1 (52:01):
You know, I suppose I've got a geeky one, but no, I'll keep it to something for teachers. I honestly think that one of the best tools a teacher can use right now is NotebookLM. I really like it. Number one, because you upload the source material, and whatever happens within that platform is based on the source
(52:23):
material you've provided. So you are providing the foundational knowledge, and therefore you can be confident that whatever it does or whatever it creates for you is based on that robust research that, hopefully, you've done.
And secondly, I love the ability to interrogate source material, documents, policies. And one of the ways I use it
(52:44):
the most, because I imagine a lot of people listening to this have probably come across it before, and it's going to sound really strange, but I collect people's brains. All right, now let me explain. So if I'm an admirer of an author or somebody out there, and I'll give you an example, there's a guy called Daniel
(53:05):
Priestley, and if you listen to The Diary of a CEO, you probably... yeah, he's like this entrepreneurial person.
Speaker 2 (53:11):
Yeah.
Speaker 1 (53:12):
Yeah, I really like him, I like his advice, I like his books. So I decided, and he knows I've done this, I told him about it because he reviewed my last book, so I was chatting to him and I told him I'd done this, and he was pretty cool with it. So I essentially took everything I could find on him: YouTube transcripts of interviews he'd done, his books, which he puts out there for free as PDFs, articles, blogs,
(53:37):
everything I could find, and I put them in a NotebookLM notebook. And every time now I come across something from him, I pop it in there. And so now, if I've got a question, or if I'm thinking about something in my business, let's say, I jump in there and I ask it a question, and I know that the answer I'm getting is likely what he would probably say to me in person, because
(53:58):
it's come from him. And obviously this is not a replacement for him, but people, students, the people listening to this, teachers, leaders, can go and start building their own kind of library of expertise that they can tap into whenever they want. And it doesn't just have to be people; it could be
(54:19):
themes. If you're like me and you come across articles, podcasts, let's say, about leadership, and you think, I should listen to that, I should read that, and you bookmark it, the chances of ever coming back to that bookmark are very slim, let's be honest. Why not just create a notebook called leadership, or whatever the theme is, and every time you come across a bit of content, just add it. And
(54:40):
then, when you do get a spare 20 minutes in a week's time, you're not going through a list of them thinking, I'll give this a listen, I'll read this; you can just have a dialogue with it. And then you're bringing your concerns, your issues, the things you need help with in that moment, to the content, and the content is responding to you in a relevant way. And
(55:01):
yeah, I think it's a phenomenal tool.
It's like giving anybody a research assistant, essentially. If you think of a professor at a university, they literally pay to have research assistants do this work for them. That's another job gone, but it's good news for everyone who can't afford to employ someone to do that:
(55:22):
this technology can be that for them. And I think that goes for AI in general at the minute as well. I think a lot of people get worried about asking AI questions and so on. But think about it: a CEO, or the principal of your school, has a PA that writes letters for them, that writes emails for them, that does a lot for them.
(55:43):
Why are you denying yourself that just because you get a bit of a weird feeling about AI? You could be using AI as a personal assistant for you, not to do everything, not to trust fully, but to provide some first drafts. And once we kind of get into that mindset, I think we start to properly tap its
(56:06):
potential on a personal, practical level.

Yeah. Oh, wow. Thanks, Dan.

Thank you so much for having me back. I really appreciate it and, yeah, have a great year ahead.

Thank you, and the same to you.