Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Jocelyn (00:00):
Hello, hello, welcome to this episode of the Structured Literacy Podcast, recorded right here in Tasmania, the lands of the Palawa people.
I'm Jocelyn, and today we're talking about something that's on the minds of many leaders right now, which is choosing a program. If you know my work at all, or you've been listening to this podcast for a while, you know I'm not anti-program.
(00:21):
In fact, I make them.
And I have to be really honest, I have thought perhaps I shouldn't be; perhaps I should just focus on professional development work. But I know that in order to really support teams, we need to give them tools that get them maximum output, and our programs are written with cognitive load in mind from the very start.
(00:44):
So I've decided that I can't step out of curriculum, even though that was never my goal in moving into this work in the first place. But this podcast is not just about if you use our curriculum resources. The Structured Literacy Podcast is a podcast for everybody, whether you choose our tools or someone else's. So, with that in mind, let's continue.
(01:08):
Programs can be incredibly valuable tools that support instruction, provide consistency and help build teacher capability. But here's what I want you to understand: the process of selecting a program is just as important as the program itself.
(01:30):
Today, I want to share some thoughts on what leaders can watch out for when considering a program and how we can position ourselves as quality providers through thoughtful decision-making. And when I say we, I mean leaders, and I mean educators. We are all providers of quality instruction, that's what we should be, and we have to have the thought processes and the planning structures to make that a reality.
(01:51):
So what's our starting point? Before we dive into the specifics, I think we need to be really honest about where we're starting from. There are generally two scenarios when schools begin looking for a program. The first one is this: we don't really have a strong sense of what quality instruction looks like and we're getting a program to help us.
(02:12):
This one is actually a very common and completely understandable position. We know our students need better outcomes. We all know that what we're seeing isn't nearly good enough. We know something needs to change, but we're not entirely sure what excellent instruction looks like in practice.
(02:37):
The second scenario is: we do know what quality instruction looks like and we want something that aligns with our understanding and supports our goals. To be in this second place, we really have to have a strong understanding of the principles of instruction, of the general characteristics of instruction that works. It's not about saying I used to use XYZ program and I want something that looks like that.
(02:58):
It's a really tricky space to be sitting in and it's one that is challenging, and I've got some supporting suggestions for you today. Both of the scenarios are valid starting points, but they require different approaches to program selection. What we need to be aiming for is research-informed practice,
(03:24):
and you may have heard me speak about this view of research before. Research-informed practice sits at the intersection of three critical elements, and this version comes from here in Australia, from a social work perspective. We're looking for research findings first and foremost.
(03:44):
We're not looking for research that matches our ideology. Our question needs to be: what does the research say about the development of this skill? The problem with research is that it's very often conducted in small group settings that don't reflect the context of our schools. They also don't reflect our professional knowledge about how to engineer learning, because we are in the business of engineering learning for a large group of students with diverse needs.
(04:06):
Intervention-focused research often has three or four children sitting around a table with instruction being delivered by a skilled, knowledgeable researcher, and the same goes for models developed on intervention as a starting point.
(04:28):
So we need to look at, yes, what the research shows, but we also have to consider our professional knowledge about this engineering for a large group of students. The third part we're looking for here is the positive impact on the students.
(04:50):
When we adopt this research-informed model, we must evaluate the impact of our work on student outcomes, both academically and from a well-being perspective. Student outcomes become the evidence of success. Success isn't whether or not we've implemented a program with fidelity.
(05:14):
It's not what teachers feel good about, and it's also not what somebody tells us it should be because we've ticked the items on a checklist. Student learning is the only acceptable outcome from any effort. So while we're saying yes, we must respond to research, we must layer in our professional knowledge.
(05:36):
If our student outcomes are not there, something has gone awry and needs to be addressed. So remember, the goal isn't to buy a program, it's to improve student learning. Everything we evaluate should come back to that.
Here's something else we need to talk about: if we go all in, spending significant money and time on something that isn't
(05:59):
going to work for our context or for our students, we've potentially spent all our resources on an approach that isn't serving us. And the hard bit is in determining this question, asking the question well and answering it well: if this thing's not working, is it us or is it the
(06:20):
program?
It's really easy to blame the program when things don't go as planned, and that's a natural response when we're very first learning. It's also really easy to adopt a wait-and-see approach, the wait-to-fail approach. We wait to see if the student will catch up. We wait to see if the program will start working.
(06:42):
We wait for things to improve, and often this waiting is for two or three years. We say in three years, when we get our NAPLAN results back, we'll be able to determine whether it's worked. Now, all of this waiting comes at a significant cost to student learning if we're not hitting the mark on instruction.
(07:02):
So programs, and our approach to adopting and implementing one, should come with short-term feedback mechanisms so that you can identify whether implementation has been successful and is on track, because without these short-term feedback mechanisms, we're flying blind.
(07:22):
We're wasting whole terms and whole years on things that aren't hitting the nail on the head, and this is exactly why, with our programs, we now embed coaching support throughout the implementation process, but I'll tell you a bit more about that later.
We'll get better outcomes when we select a program to evaluate
(07:44):
against a set of fundamental criteria that acts as the minimum standard for our first steps. What we often do instead is we go to a Facebook group, or we go to the school down the road, or we go to some other social setting, and we ask what program works. This is often really unsuccessful, because the people
(08:06):
who are sharing their opinion are doing so because they like what they have found. That doesn't mean it's the most successful. It just means they like it and they feel good about it.
So we have to dig deeper.
Timothy Shanahan discusses three areas to focus on in school improvement. There's more than three, but these are the three that relate
(08:27):
directly to the classroom and that have the biggest impact on student outcomes. He talks about time on task, whether we're teaching the right content to the right students at the right time, and the
(10:06):
quality of the pedagogy. So if we're spending 45 minutes on the basics of a phonics lesson, we're not enabling enough time for the other elements of literacy, so time on task will be off. If we are not being data-informed about how we make decisions about what to teach, if we're treating every year one or year two student as if they're the same, then there's a really good chance that we will not be teaching the right content to the right students at the right time. And then there's the quality of the pedagogy, which comes down to human cognition and what we know from the cognitive sciences. So if the programs we're using do not support working memory, particularly for our strugglers, we're in trouble from day one. So before we spend time and money committing to a program, we need to know what we're looking for so that our work will be successful.
I've previously shared here on the podcast about the NICE framework, and so when we're making decisions, this little framework can help lead our thinking and guide discussions. The N in the NICE framework stands for need. Do we need this thing? How do we know? What in our data tells us that this is needed? We can't make good decisions based on gut feelings, or what seems like a good idea, or what reflects what everyone else is doing or what we think they're doing. We need evidence to know where the area of need is. Otherwise, we don't know what problem we are trying to solve. The I stands for impact.
(10:27):
What's the impact we expect to have? What will tell us that this is successful? Often, what that sounds like is: we are successful because lessons look like they're being taught the same way. We are successful because we've implemented the program.
(10:48):
We are successful because in our leadership walkthroughs we are seeing that the lesson steps are all present. Again, the goal's in the wrong spot. We need to know what we're looking for in student outcomes as well as all of these other things, because if we can't articulate what success looks like, we won't know if we've
(11:09):
achieved it.
The C stands for capability. Will this program or approach help build our capacity, or will it lock us into lessons and permanently remove our capacity to make decisions? Now, the goal should be to increase teacher expertise as
(11:31):
we support them and support the whole-school approach. We're not looking to replace teacher expertise. And the E stands for ease. Now, nothing is easy in school land. We know that. But in order to adopt a program or an approach, will we have to turn our school upside down, spend our whole resource and
(11:54):
professional learning budget on this one thing for one area of curriculum, or place ridiculously unnecessary pressure on our team?
If the answer is yes, then rethink what you're doing, because implementation needs to be sustainable. And I'm seeing approaches that look
(12:16):
like we want to tick the boxes on explicit teaching and low variance, so whole groups of schools are being required to change every single curriculum area to a scripted or highly guided program, and we're not just approaching a literacy goal here. We're changing everything that we're doing all at once.
(12:38):
So ease is not about being easy and being lazy and taking shortcuts. It's about responding to the cognitive load needs of our team and our school and our students.
Now I'm about to say something that may sound strange, and if you've been with me for a while, you know sometimes this happens. Most people think that the goal is to get a program, and I've
(13:04):
already said here that for program providers, whether they're free or paid, their goal is for as many schools as possible to be using their teaching resources. But the goal of getting a program is a very different thing
(13:26):
from the goal of students learning to read and write.
Now, they should be the same. You should be able to be on the same page as whoever's providing the resources to you. And I'm not questioning people's motivation. I'm not saying that, hey, people are selling things or giving away resources knowing that they are not going to
(13:48):
be useful. That's not what I'm saying. What I'm saying is that, if we really get to the heart of it, we have to know what we're talking about.
Focusing on getting a program means we get a quick win. We tick a box and move on. Focusing on students learning to read and write is a much
(14:09):
longer-term endeavour, and when we choose this goal, we're choosing the hard path, we're choosing the uphill climb. It also means we're choosing an approach that commits to the moral imperative of the
(14:32):
work. The goal isn't to buy a program. The goal is to improve student outcomes, not just a little bit, but so that every student is succeeding. Everything we evaluate should come back to that. So what should we be looking for? This is not an exhaustive list, really, but it will give you something to start with.
You're looking for short feedback loops that allow
(14:54):
teachers to notice when things aren't going to plan and adjust. If a program doesn't tell you whether students are learning until the end of a term or semester, it's far too late. The wheels have fallen off early in the term and we've just wasted seven or eight weeks doing teaching that isn't hitting the mark.
(15:15):
So we need short feedback loops. We also need ongoing support and professional development baked into the program. Two days, three days, however many days of training is not ongoing support and professional development. We can also think about it like this: does the program help the
(15:40):
teacher learn to make decisions, or does it simply provide scripts to follow?
Now, scripting. I have another podcast episode about this: can great teaching be scripted? Scripting absolutely can have a place when we are first new at something, but no program developer knows the responses of
(16:02):
your students in your classrooms at any given time. So there needs to be a partnership between the teacher, the school and the developer. We really have to have partnerships for success. So has the package been structured as a partnership for
(16:22):
success? Or is it a bit of a wham bam thank you ma'am approach, where you do the
(19:41):
training, you get your stuff, and then you're on your own? This is one of the most critical factors from a school improvement perspective. Someone who is invested in your success will be there for you when things don't go to plan, because I guarantee you, at some point, they won't.
And here's what I really, really need you to understand. If we do the training and have good systems in place, implementation is likely to go well initially. The real threat to student outcomes comes when something about our context doesn't reflect what the program developer had in mind and, let's face it, that's all the time. Because no program developer came and wrote you a bespoke program, the wheels fall off instruction when it's time to make decisions for our context and our students and when we don't have the support to do that effectively. And this is why ongoing implementation coaching is so critical. It is not just about the initial training, but about the sustained support when you're making those contextual decisions that determine whether the program will actually work for your students.
And why don't more providers do this? I'm going to tell you: because it's not as profitable as the model we currently have. We need programs that build teacher decision-making in. Does the program, and the support that comes after the training, focus on helping teachers learn to make decisions, or does it engineer that development out of the picture?
Supporting working memory is also a must. Has the program been written in a way that supports it? One of the principles is that we break things down into small chunks and we teach them one at a time, but we have to know what a small chunk is. What is working memory capacity? Well, I'm going to tell you, it's two to three, and not that long ago, John Sweller presented live and I got to hear him. He said, oh, three, I don't know. We're talking about kids, so we're going to go with two. So if the program or the lesson is asking students to focus on more than two things that they don't already have to automaticity, that's the whole box and dice for most of the kids, particularly for the strugglers. So we have to define what it means to support working memory and evaluate through that lens. This leads into appropriate cognitive load. Does the program intentionally limit how much new information is presented at once and give students the opportunity to practice and embed new learning in context, or does it try to cram in as much as possible in as short a period of time as it can?
Student engagement is another factor. Are students provided with many, in capital letters, opportunities to respond and engage academically, intellectually and sometimes emotionally? Or are the students taken out on a tour of content with the occasional request to join in? I do something, you do something. I do something, you do something. Thanks, Anita Archer.
(20:03):
With not just repeating or saying things with me, but doing heavy lifting and thinking in a way that doesn't overwhelm cognitive load. That's what you're looking for in student engagement.
We're also looking for differentiation mechanisms. Does the program provide the mechanisms you need to consider
(20:24):
the needs of the students in front of you, or does it treat every year one student or year four student in exactly the same way? There needs to be consideration of where students are up to and what their specific next steps are, particularly in the development of foundational skills. Bottom of the rope,
(20:48):
bottom of the model, these foundation skills are constrained and they need to be taught to mastery before we move on. So we have to think carefully about what it means to differentiate, but that's a whole 'nother podcast episode and I'm sure I've spoken about this before. Top of the rope, text-based units, one in all in, and we're providing adjustments so that everybody can access the same content, which comes directly from the curriculum.
(21:10):
But in the development of the early steps of foundational skills, we need to be targeted, and that's where you get the best outcomes. Is the program designed for struggling students? This is something that I consider to be critical because it speaks to my moral imperative as an educator. Have the materials been written with the struggling student in
(21:35):
mind? There are many ways to stretch students who need enrichment, and we should be focusing on them too, but it's actually really difficult to make content suitable for struggling students if it hasn't been written for them in the first place. You basically have to rewrite it.
(21:51):
A program of instruction, again, needs to maximize every instructional minute, be interactive and maximize student engagement. Now, the core here is optimizing intrinsic load. That's what we're looking for. And let's come back to that central question: will the
(22:11):
program improve student learning? And that's the lens through which we need to evaluate every single criterion I've just outlined. I wish we could believe the labels on things, but we have to be critical consumers. Whether the resource in front of us is free, low cost or
(22:32):
pricey, we can't believe every claim made by everyone, and we know that every man and their dog is just plonking evidence-based and science of learning on things. So we have to evaluate for ourselves the impact that these tools will have on our students.
Marketing materials will always present the best possible case.
(22:52):
No one is putting a testimonial on their website that says, well, yeah, it was okay for some students but didn't meet our needs for others. So testimonials are always delivered by the people who got a really great outcome. We have to dig deeper.
There is a difference between the price of a resource and the
(23:12):
cost of the resource. The price is the dollar figure that you pay for the physical elements and training, but the cost is measured in so much more than dollars. There's the cost in time, the cost in stress, the cost in lost opportunity for student outcomes, which is the biggest and most serious cost of any decision we make. And this brings us back to our central question again: will this
(23:37):
investment, both in dollars and in opportunity, lead to improved student learning?
Now, what do you do, though, when you're not sure? Well, it's easier when you have previous success in getting strong student outcomes for every student, but how do you make decisions when you aren't sure about the answers to the
(24:00):
questions that I've asked in this episode? How do we feel confident when we don't have experience in getting results for every student? Firstly, I'd suggest that you avoid asking for people's opinions on programs in Facebook groups or on any other social media platform, because you'll get lots of people's opinions, but not necessarily evidence-informed guidance.
(24:23):
Second, it's important to know exactly what you're looking for. In season three, episode 22 of this podcast, I shared a list of criteria you can use to evaluate the options in front of you, and you can also use it to audit your current practice.
(24:43):
So this will help you identify the strengths and areas of opportunity, because no program does everything and we have to know where our programs will hit the mark and where we're going to have to supplement. You can find this on our website at jocelynseamereducation.com, and just search for Choose a Phonics Program.
(25:04):
Leaders, the points for phonics programs also apply to other areas of instruction as well, so there's no harm in downloading the free tool and having a look and seeing whether you can use it for different purposes.
Third, make sure, when you are considering any program, you have the opportunity to evaluate a sample lesson or lessons and,
(25:27):
ideally, try them out. Don't commit to something that you haven't been able to see in action. You don't buy a new car without taking it for a test drive. I would suggest that the same needs to be available for our programs. Yes, you'll be a bit clunky in teaching it because it will be new to you, but you should be able to get a general sense of
(25:48):
how this thing rolls and the responses of the students, knowing that things always get better with time as we develop fluency with something.
Finally, the ideal situation is that you can begin with a simple start and then build up over time. This allows you to evaluate effectiveness before making
(26:09):
larger commitments. Choosing a program is not just about finding something that looks good or feels right because other people are using it. It's about finding something that will genuinely support your students' learning and your teachers' growth.
It's about ensuring that every dollar spent
(26:30):
moves you closer to your goal of student success, and this is why we're moving to an approach where implementation coaching is a required part of our programs, and we're beginning this with Spelling Success in Action for years three to six and beyond. Leaders will have live, personalized support as they collect data,
(26:53):
analyze that data and decide on a strong starting point. For now, schools will have access to very reasonably priced ongoing support from an experienced coach who knows how to get results.
We're going to build this up over time and we have some very exciting new developments coming, but that's where we're sitting, so we can ease people into the new way in which we will be
(27:16):
working. I said before that most providers don't go down this road because it's not the most profitable way. It's business. It's just easier to sell the program, sell the training and leave you to it, but we all know what leads to success.
We also know what research tells us about strong professional development that really builds teachers'
(27:38):
capability in meeting the needs of their students, and it's not short-term training, handing over decision-making to someone else or leaving schools to go it alone. In the previous episode of the podcast, I unpack what research has to say about the characteristics of high-quality professional development. So if you're curious about that, you can have a look and have a
(28:01):
listen.
When you are making decisions about your next instructional steps, make sure that you and your students are set up for success. You might choose our curriculum resources. You might choose another provider that's free or paid. Whatever you choose, make sure that it's right for your school and will lead to student success.
(28:22):
Remember, you're not looking for perfection. That doesn't exist. We're looking for a program that aligns with what we know about how students learn, supports our teachers in making great decisions, provides mechanisms for us to know whether it's working and includes the ongoing support to
(28:44):
help you navigate the complexities of your unique context, and does so in a way that doesn't take your entire budget.
The process of choosing a program is an opportunity to clarify our values, our understanding of quality instruction and our commitment to student outcomes.
(29:04):
When we approach it thoughtfully, with clear criteria and realistic expectations, we set ourselves and everyone around us up for success.
That's it from me for this episode of the Structured Literacy Podcast. Remember, you won't break the children. It's going to be okay, but you also have to make sure that you
(29:25):
don't break the grown-ups either, especially yourself. Thanks so much for listening. I know these decisions are hard. You've got this. Happy teaching, bye.