
April 22, 2025 • 48 mins

In this powerful episode, Angela Connell-Richards sits down with assessment expert Maciek Fibriek to unpack one of the most critical topics under the new 2025 Standards—assessment practices and audit readiness.

Together, they dive deep into:

✅ What’s actually changed under Standard 1.3
✅ Why fit-for-purpose assessment means contextualising for your learner cohort
✅ How to test, validate, and align your tools to ensure compliance
✅ The role of AI in developing assessments—myths, risks, and best practice
✅ Real-world audit insights and red flags to avoid
✅ Why consistent validation and a culture of compliance are key

Maciek shares practical strategies, client stories, and common compliance pitfalls, including what ASQA is really looking for when they review your assessments. From AI-generated tools to LMS adaptations, no stone is left unturned.

Whether you’re preparing for audit, re-registration, or just want peace of mind—this episode will guide you on how to create assessments that are valid, reliable, and aligned to industry outcomes.


 Join host Angela Connell-Richards as she opens each episode with a burst of insight and inspiration. Discover why compliance is your launchpad to success, not a limitation. 

Connect with fellow RTO professionals in our free Facebook groups: the RTO Community and RTO Job Board. Visit rtosuperhero.au/groups to join today. 

 Ready to elevate your RTO? Join our Superhero Membership community and gain access to expert resources, training, and personalised support to help you thrive. 

 Discover ComplyHub—Vivacity’s AI-powered compliance management system. Designed to streamline operations and give you peace of mind. Book your demo today. 

Wrap up with gratitude and guidance. Subscribe, leave a review, and join our community as we continue supporting your compliance journey in vocational education. 

Support the show

Thank you for tuning in to the RTO Superhero Podcast!

We’re excited to have you join us as we focus on the Revised Standards for RTOs in 2025. Together, we’ll explore key changes, compliance strategies, and actionable insights to help your RTO thrive under the new standards.

Stay connected with the RTO Community:

📌 Don’t forget to:
✔ Subscribe to the RTO Superhero Podcast so you never miss an episode!
✔ Share this episode with your RTO network—compliance is a team effort!

🎙 Listen now and get ahead of the compliance changes before it’s too late!

📢 Want even more compliance insights? Subscribe to our EduStream YouTube Channel for our FAQ series on the New Standards for RTOs 2025! 🎥

🔗 Subscribe now: EduStream by Vivacity Coaching

✉️ Email us at hello@vivacity.com.au
📞 Call us on 1300 729 455
🖥️ Visit us at vivacity.com.au


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Angela (00:00):
Yes, travelling around the country. Still, Maciek was just telling me he lost a wheel off his car, so he's had an interesting week.

Maciek (00:13):
It's been good.

Angela (00:14):
Yeah, yeah, okay.
So let's take a deep dive in. Well, first of all, our focus point is going to be around the revised standards: what's changed and what hasn't changed (and there is quite a bit that hasn't changed), but also what we're seeing at audit. So what's happening with assessment tools in audit, and

(00:39):
then also having a look at the importance of aligning assessment practices with industry outcomes, which is what the whole point of these new standards is about: how we align what we do with what industry needs. So let's dive in. The first one is 1.3. Standard 1.3 is that assessment practices must ensure validity,

(01:02):
reliability, fairness and flexibility. Let's have a look at what this means in practice. What's different from where we were before?

Maciek (01:14):
What are your thoughts on this, Maciek? Realistically, whilst the wording has changed, I really think the intent has not. That's right, it feels the same. Yeah, and look, the simple statement is 'fit for purpose and consistent with the training product'. It's really 1.8a.

(01:35):
Um, it comes to mind straight up. And you know, again, if our assessment practices are not aligned to what industry wants and we're assessing something completely different, um, then we have a problem, right, because students are going to be assessed on things that aren't aligned and therefore are going to go into an industry with

(01:57):
potentially a different set of skill sets or knowledge, or potentially not even be competent.

Angela (02:03):
Yeah, yeah. And I think in the past, from what I've seen, it's people buying assessment tools and then not contextualising them for their learner cohort, and that's where a lot of the gaps are. But also they're not collecting sufficient evidence to demonstrate that the student is competent.

Maciek (02:22):
Yeah, there's definitely two parts to that, and I think you're right. The one is not checking your tools beforehand. That, I think, is a really critical thing that a lot of people put down to not enough time, aka being lazy or not having the capacity, to check that those tools actually align

(02:47):
to the unit of competency and are structured in a way that makes them easy to mark, easy for students to follow, et cetera. But the second point that you made is really, really valid: we're in an amazing part of this journey at the moment where we do have, as you mentioned in the introduction, AI, ChatGPT and other tools, um, where we can

(03:10):
take the tools that we currently have and truly align them to each individual cohort, whether for a specific type of industry, um, or a particular type of cohort. We're still meeting the requirements of the unit, but making the tools, as the requirements of the revised standards say, fit for purpose.

Angela (03:32):
Yeah, yeah. Really identifying who your learner cohort are and how you're adjusting the training to meet their needs. And I think a lot of that falls under support services as well. Under the new standards, there's a lot about how you are supporting the students along the way and adjusting the training to meet their needs.

(03:53):
Yeah, look, a perfect example: one of my clients that I've worked with for many years.

Maciek (03:59):
They work across different industries, so everything from warehousing, truck driving, bus driving, et cetera. And we've actually written, for the same unit of competency, learning materials and assessment tools in the context of those learners. So it's the same unit, but bus drivers are learning about safe

(04:19):
handling of luggage into luggage bins, whereas truck drivers are learning about loading and unloading goods in a trucking context, and so on for the warehousing people. And so suddenly we've got this ability to focus on the student without spending too much time on saying, okay, it's going to take me a day or two to rewrite the assessment tools.

(04:41):
Well, if we've got the framework, it's only a small tweak, right?

Angela (04:51):
Yeah, and how awesome is that? And adjusting the case studies, and adjusting the content of how they're, you know, managing that as well. Yeah, absolutely. And I think the other one where that often comes up is first aid: how are they adjusting first aid for the different learner cohorts and meeting those different needs?

(05:15):
So that's where you go, okay, are we targeting people who work in childcare centres, or are we targeting people who are going to be first responders, maybe in a voluntary role, or are they going to be just general Joe Blow off the street who just wants to do first aid?

Maciek (05:38):
And so we had the same thing. So we actually wrote, again, a unit for first aid in the space of, like you just said, voluntary first aid as first responders. But then that same unit is also in the Coxswain course, and so they're on a boat. Two completely different environments, and they're still having to learn about first aid.

(05:58):
But, you know, doing CPR on a hard surface for two minutes on a rocking boat can potentially be a little bit more tricky than doing it in an emergency situation in a childcare centre. So, yes, you know, these are all things that we can do. Which begs the question, and we started talking about this

(06:19):
earlier, but that context, transferability of skill, which then leads to a whole different set of problems, because units define the outcome but they don't define the context, and that's where RTOs sort of make it fit for purpose. But then how do we look at transferability of skill and

(06:43):
also credit transfer, when someone's done it in a completely different environment and yet we're still forced to do credit transfer? So yeah, there's two sides to the discussion that need to be had, especially with the whole, um, sort of review that's happening at the moment about the structure of qualifications and training packages and so forth. It'll be quite interesting to see how that works.

Angela (07:06):
Yeah, yeah. So one of the main things that I've seen that is different in these standards is the testing of the tools, and I've had a lot of people asking me, so what does that mean, Angela? What does testing mean to you, Maciek?

Maciek (07:29):
I'm a believer that, you know, communication at its most basic is words, and words all have meaning. But whether we're calling it testing, or calling it what was once flagged as pre-validation, or, you know, ensuring that the tools are valid: at the end of the day, for me, testing is firstly making sure that the tools, again,

(07:49):
are fit for purpose. But also just doing a pre-use test, right: asking your team to read through and undertake the questions, so that in a simulated environment they can actually see, do these tools actually fit the intended use, right?

(08:11):
So doing testing, asking industry to potentially provide some guidance: is this what we're doing? I mean, a lot of that should have been done through that industry consultation process, and through you personally knowing industry, because you guys are the experts, so a lot of that should already come in from that perspective at the tool development stage.

(08:31):
But it's just about testing it, making sure that the tools are actually doing what they're designed to do.

Angela (08:36):
Yeah. Um, the way I've been explaining it to people, and what I recommend, is that your trainers should test the tool. They should complete the assessments so that they fully understand how the assessment is structured, because if you start with

(09:00):
the end in mind, and you know what the assessment's going to be, your delivery of training should lead to that.

Maciek (09:06):
Yeah. And I think all assessors should complete the assessment that they are going to be delivering. Yeah, there are two things that I'll say to that as well, though. One: if the person that writes the tools is testing the tools, then there could be some bias or some document blindness, as I call it. Yep.

(09:26):
So that's something also to consider from that perspective. But the other, and I had a second point that will come back to me in a second, but yeah, it's just making sure that the tools really, you know, are written in a way that students

(09:47):
can understand. Okay, because it's all about the student, right, and making sure that it's easy to follow.

Angela (09:55):
Yeah, and I'll go back to what you said: if the person who wrote the tool is completing the assessment, get someone else on the team to complete the assessment.

Maciek (10:06):
So testing to me is, yep. So I was just going to say, the other thing, which often a lot of people don't do, is make sure that your learning material aligns to what's also in your tools, right. So that if your assessment tools have been written or

(10:26):
purchased separately, make sure that the material actually covers what's in your assessment tools, especially at the lower AQF levels, so that you're not suddenly asking a question out of nowhere. So that if the students have gone on the journey of reading the material, been listening in class or online in a webinar, when they get to the assessment task it's not unfair, because

(10:46):
they've actually been taught what is being assessed.

Angela (10:49):
Yeah, and they should have a thorough understanding of the content by the time they get to assessment. So from my point of view, the best way to actually test the materials is to complete the assessments, and then from there you build out how you're going to deliver your training. And I think for

(11:10):
anybody who's hiring a new trainer, that should be the first thing they get the trainer to do: complete the assessments.

Maciek (11:17):
Yeah, and it shouldn't take them long if they're the industry expert, right?

Angela (11:23):
Yeah, that's correct. And as a new trainer, I think that's a brilliant way. Like, I know I've worked in RTOs as a trainer, and often I got the assessment tools just before I was walking into the classroom, and I'd never seen them before.

Maciek (11:41):
And it's a great way to pick up on spelling, grammar, flow, all those things. You know, if someone fresh is looking at a set of documents, it's likely that they'll pick up on those silly mistakes that other people haven't seen. Yeah.

Angela (11:59):
But I've also, as a trainer and assessor, developed assessment tools on the fly. I was in an RTO that didn't provide me with anything; well, they had the training materials but not the assessments. So I took the training materials and the units of competency and then developed assessment tools. Um, I don't know whether every trainer could possibly do that,

(12:21):
and this was before we had the diploma in, um, developing assessment tools. I'd come from a background, prior to getting into training, of writing policies and procedures, so it made sense to me, but I couldn't see every trainer being able to do that. So I think we are living in a beautiful world.

(12:44):
Yep, 100%. Yeah, okay. So, um, I haven't really seen anything else that's changed, and in our next episode we're going to be talking about RPL and credit transfer, which ties in really well with what we were just discussing, so we'll go through that then,

(13:07):
but when it comes to what makes an assessment valid, reliable and fair, what are your thoughts? How long have I got?

Maciek (13:23):
Exactly, right. Um, when we're looking at the current standards, so 1.3, which we've spoken about, and 1.4: again, that to me is no different to 1.8, um, one and two, right, as they are at the moment. So again, not really much has changed there. So when we talk about whether it's validity, reliability and

(13:45):
so forth: for me, if the unit of competency asks a student, from an outcomes perspective, you know, to be able to do all of these performances and have the underpinning knowledge or knowledge evidence, the tools become valid when they actually achieve that. There's a lot of discussion about assessment tools being the

(14:08):
minimum requirement, and therefore adding certain things on top that might be required for a specific industry. Again, not really the place to be talking about that, but from a validity perspective, as a minimum, the tool must address the minimum requirements of that unit of competency, right? I think we can all agree to that. Yeah. From a reliability

(14:29):
perspective, um, I think for me it's being able to use the tool multiple times, over and over again, with different people, et cetera, with enough instruction, enough understanding, to have consistent approaches. So that's from an assessor's perspective, but also from a student's perspective. You know, a lot of the time I see assessment tools where the

(14:52):
first, you know, even 16 pages of the assessment tool are repeated information about the unit, information about these things, the principles of assessment and rules of evidence, and I sort of go: by the time students get to the assessment task, if they've actually read those first 16 pages, they're

(15:15):
sort of lost. And so, for me, having tools that are easy to use, with clear instructions, concise instructions, but enough to allow for that reliability to happen (likewise from an assessor's perspective), is key.

Angela (15:30):
And written to the level of the student.

Maciek (15:34):
Yeah, correct. I've had many discussions with auditors, assessors, quality assessors in the past about reliability, where, you know, it's a balancing act between providing information at a depth where there's no professional judgment available, and providing enough information to ensure

(15:55):
consistency of use and performance from a student's perspective. And so I think there is a balance, where a suggested answer in context is provided, and a potential example or two of what is competent performance, or satisfactory performance to achieve competence, is important.

Angela (16:14):
So it's not up to individual interpretation; there is a framework that they need to work with.

Maciek (16:20):
Yeah, I agree with that: a framework, but also an instruction to the assessor that says, you know, based on the student's past and experience, the answer or actions may be slightly different, and that's okay, um, and that should be brought up in future validation sessions, or even at moderation if the RTO is doing that. So there's flexibility to use some professional judgment, but

(16:45):
then report and record that, so that it can then be updated in your tools later on if it's consistently being undertaken that way.

Angela (16:56):
I totally agree with that. The way I explain it: when you look at a unit of competency, you have the rules of evidence and principles of assessment, and you're basically overlaying the unit with these. So when you're validating the assessment tool, you're really asking the question of the tool: now, is the way this is written

(17:16):
fair and flexible and valid, and is it current? Is the way the student is undertaking the assessment, or the assessor is conducting the assessment, consistent? And I think the biggest thing is consistency, that consistency that you had discussed. And I think one of the big things that not a lot of RTOs do: they should be

(17:37):
validating assessment tools across a range of different students and different trainers and assessors, so that you can see that you've got consistency between different classes and different trainers and assessors.

Maciek (17:50):
And what a perfect segue to go to 1.5.

Angela (17:53):
Yes. RTOs must retain evidence that assessment decisions are made against the training product's requirements. So, on what I've seen ASQA looking at when it comes to evidence, I'd like to know your point of view on this. I get this question: how long are we supposed to keep the

(18:16):
completed assessment tools for? And in the new standards I've seen conflicting answers: six months, and then I've also seen, like, seven years, and it depends on if you're doing government funding.

Maciek (18:35):
Yeah, so my understanding is the current standards say six months, right, from the date that the completion or the competence has been recorded. So in the current standards it's always bizarre, because if someone's doing a one-year program, potentially you don't need to retain the record of evidence of that person's

(18:55):
competence by the time they're halfway through the course, right? But my understanding, from what I've read under the new standards, is that they actually fall into alignment more with the ESOS Act than the National Code, where you have to retain the evidence for a period of two years after the completion of the program. That's the way I've understood it.

Angela (19:18):
Yeah, yeah. So some of the things that I've seen with audits, and this is something that's come up recently a lot with assessment tools, is assessment tools written using ChatGPT. I've seen a lot of people going, oh well, we can write all

(19:39):
our assessment tools ourselves now, we don't need to purchase. But ASQA auditors are actually running the assessment tool through ChatGPT, or other AI software, to see if it's machine-written. And they also want to know: well and good, use ChatGPT,

(20:02):
but are you contextualising it? Are you just simply putting in the unit of competency and saying, write me an assessment tool based on this, or are you putting your knowledge and skills into it? What are your thoughts about using ChatGPT to create assessment tools?

Maciek (20:17):
Yeah, look, I've not heard of ASQA doing that. Um, if they are, well, we've only just heard that. Yeah, if they are, then again, they're using it as a checking mechanism, um, to effectively speed up the validation process. And if it speeds up the process of the audit and therefore

(20:38):
saves the RTO money, based on ASQA auditing fees, no problem. What I do know, talking about that process, is that there are AI validation tools out there at the moment. I've seen some of them, and they're not always perfect either, just like AI is not always perfect. From my perspective, I like to call it hybrid AI, right, where

(21:04):
we're incorporating our own personal experiences, and the years of experience that we've had in the industry, with AI, and developing a prompting sheet or something along those lines that gives quality output, that allows for a quality product to be produced.

(21:25):
I think if we're not using AI, and there's a lot of people out there that say, no, you can't use AI, et cetera, I think if you're not using AI, then you're very quickly going to start falling behind in industry. There's no reason why AI should not be used to develop learning

(21:49):
material, if it's done correctly. That's the critical thing here. You know, to use an inappropriate analogy, sorry: shit in, shit out, right. Um, if we really don't think about what inputs we're putting into any AI system, we're going to get rubbish out,

(22:12):
and so a lot of the people that I've spoken to that have tried AI, um, various tools, they don't choose to understand the system first, and they just get this rubbish, et cetera. I refer to AI as the smartest, dumbest intern that you will ever have. Yeah. And if you teach it and contextualise the, um, the system

(22:38):
to be the, um, I guess, persona that you want it to be, it can really do some amazing things. And we have produced countless, uh, learning materials and assessment tools for various providers, and we see some amazing results. Is it perfect all the time? No, right, but that's where the human part comes into it:

(22:59):
us as humans reviewing the material, doing a quality assurance process on it, which then goes back to what we were speaking about in 1.3, making sure that tools are fit for purpose, right. But the timeframe? Taking what used to take two to three weeks down to a matter of days. It's a no-brainer.

(23:21):
Now, on the note of ASQA and AI, though: I don't believe ASQA have a problem with RTOs using AI. What I think ASQA have a serious issue with is RTOs not having a process or a procedure in place to check that students aren't abusing AI for the purposes of determining or

(23:42):
filling out assessment tools, et cetera, with no checks and balances in place. I think that's where we have the biggest concern at the moment, and realistically most RTO providers should have that concern. Yeah. And look, I certainly have no problem with people developing assessment tools using AI and

(24:03):
ChatGPT.

Angela (24:04):
I use it every day, all day, and I keenly encourage people to use it, but it's making sure you are checking the data. As you said: shit in, shit out. You've got to make sure that you've got the right data going in there, and use your human brain to really review it and identify, well, what are the...

(24:25):
Is this going to be fit for purpose? Is it going to meet the rules of evidence and principles of assessment? And does it make sense when the student's reading it? And I think the most important part, because, like what

(24:45):
you said about ChatGPT, I see ChatGPT as a highly skilled research assistant. That's what they're good at, but they're not good at that contextualising, so that's where you need to get in and do that. Although my bots have learnt a lot about me and my organisation, I still have to review stuff; you can't

(25:05):
just take the first thing that comes out.

Maciek (25:09):
Look, I'm not going to give away too much that most people shouldn't already know. But when I use ChatGPT, whether it's for the development of learning materials or assessment tools, the persona that I ask it to take on is: I want you to take on the role of a

(25:29):
world-renowned VET slash RTO consultancy and instructional designer, but an instructional designer who has got over 10 years' experience in X industry, right. And so by taking on that persona of a consultant, an instructional designer and an industry expert, it suddenly

(25:51):
has a persona that goes, right, I know how I need to think. And then we start doing the other prompts, right, and once you've got that down pat, then the rest follows, right. But it's making sure that it's got the right framework at the beginning, because otherwise you do get rubbish.
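To make the persona-first approach Maciek describes concrete, here is a minimal sketch assuming the OpenAI Python client; the model name, unit code and persona wording are illustrative assumptions, not details from the episode.

```python
# A sketch of persona-first prompting, assuming the OpenAI Python client
# (pip install openai) and an OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

# The persona goes in first, as the system message, so later prompts are
# answered "in character" as a VET/RTO instructional designer.
persona = (
    "Take on the role of a world-renowned VET/RTO consultant and "
    "instructional designer with over 10 years' experience in the "
    "transport and logistics industry."
)

# A hypothetical task prompt; the unit code TLIxxxx is a placeholder.
task = (
    "Draft five knowledge questions for unit TLIxxxx on loading and "
    "unloading goods, written for AQF level 3 learners, each with a "
    "suggested answer and one example of satisfactory performance."
)

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative; use whatever model you have access to
    messages=[
        {"role": "system", "content": persona},
        {"role": "user", "content": task},
    ],
)
print(response.choices[0].message.content)
```

Setting the persona before any task prompts mirrors the 'right framework at the beginning' point above: the follow-up prompts then build on that frame.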

Angela (26:11):
Yeah, and I think we've still got to put in our skills, knowledge and experience in that industry sector. That's the crucial part. And I think one of the other things that is really good about ChatGPT is personalised learning, where we can contextualise the training and

(26:33):
assessment materials to the learner cohort by doing exactly that.

Maciek (26:37):
Creating a prompt of: this is our typical student, this is their background, basically your training and assessment strategy, AQF levels, language issues, uh, terminology, um, even ethnic backgrounds, and asking it to do better case studies that are appropriate. I mean, it's really, um, invaluable, what tools

(26:59):
we've got available to us. So, yeah. But the other thing that I would say is, when you're using it, don't just accept what it's saying. As, again, someone who uses it daily in our business, we see that even ChatGPT has bad days, right, and what you say to it one day, and then ask with the same instructions the next day, it's potentially

(27:21):
learned different things or it's been updated, and so you've got to always be checking and amending your prompts, potentially, to make sure that it's still coming out with good materials. Yeah. And one of the things I recommend, if you are using

(27:41):
ChatGPT, is you can now put projects together, so you could have a project for a certain industry sector, and that way it will only take the content from that prompt.
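As a rough sketch of the 'typical student' prompt being described here, the snippet below folds a cohort profile (the kind of detail that lives in a training and assessment strategy) into a reusable prompt; the profile values are hypothetical.

```python
# A hypothetical cohort profile; in practice these details would come from
# your training and assessment strategy, not be hard-coded like this.
cohort = {
    "industry": "bus driving",
    "aqf_level": 3,
    "language": "plain English, short sentences, key terms defined",
    "setting": "safe handling of passenger luggage in bus luggage bins",
}

# Build the contextualisation request from the profile, leaving the unit's
# assessment requirements untouched.
prompt = (
    f"Our typical student works in {cohort['industry']}. "
    f"Rewrite the case studies in the attached assessment task for "
    f"AQF level {cohort['aqf_level']} learners, using {cohort['language']}, "
    f"and set every scenario in the context of {cohort['setting']}. "
    "Do not change what the unit of competency requires to be assessed."
)
print(prompt)
```

Keeping the profile in one place also makes it easy to re-run the same request when, as Maciek notes, the model's behaviour drifts and prompts need checking and amending.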

Angela (27:53):
Yeah, awesome. I know these are areas that we're both passionate about, AI, and I really love training and assessment as well, and so we've sort of covered some of the things around the most common assessment compliance failures. What I see at audit, and then I'll ask you, Maciek, what you see at

(28:18):
audit, and I've touched on this already: the main things that assessment tools are non-compliant with is they haven't been contextualised for the learner cohort, they haven't been validated, they don't address the unit of competency requirements and the assessment conditions, or they've simply spewed out the performance criteria of a unit of competency

(28:41):
and turned it into a question-and-answer assessment tool based on the performance criteria. I think, uh, if you're going to be developing your own assessment tools, make sure that you are fully addressing all of those areas when you're looking at, um, the rules of evidence, principles of assessment and the unit of competency.

(29:02):
So what are your thoughts? What are the common areas, errors and failures?

Maciek (29:06):
In general, or using AI? In general? Everything. Look, I think in general, a lot of people, like we spoke about at the beginning, don't contextualise the materials. I've seen a lot of tools that also jump around a lot, or use a

(29:30):
lot of opinion or past experiences of the developer, where you sort of sit there and go, okay, this has nothing to do with the context that we're delivering in. So I think it's really important that when you purchase materials, you do the checks, again making sure it's

(29:51):
fit for purpose. It goes back to that simple statement. And so I see a lot of times people buy it, they implement it, but don't actually give any thought as to how that context works. They don't analyse the questions and the depth of the questions, so a lot of the time they don't take into account AQF level and dimensions of competency. So there's a lot of those issues.

(30:16):
When we start looking at developing assessment tools using AI, again, it's that prompting that needs to be there, but then also cross-checking, asking it to reverse-map or map the materials that are being produced, because, again, it doesn't always get it right. And so using the tools that we've

(30:37):
got, and understanding the use of those tools, is really important.
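One way to picture the reverse-mapping cross-check Maciek describes is a prompt along the lines of the sketch below; the wording and placeholders are assumptions about how such a check could be phrased, not a specific tool from the episode.

```python
# A sketch of a reverse-mapping check: ask the model to map each assessment
# question back to the unit's requirements and to flag anything uncovered.
# The two placeholder strings stand in for documents you would paste in.
unit_text = "...full text of the unit of competency..."
tool_text = "...full text of the draft assessment tool..."

mapping_prompt = (
    "Reverse-map the assessment tool against the unit of competency below. "
    "For each question, list the performance criteria, performance evidence "
    "and knowledge evidence it addresses. Then list every unit requirement "
    "that no question covers, so the gaps can be fixed before validation.\n\n"
    f"UNIT OF COMPETENCY:\n{unit_text}\n\n"
    f"ASSESSMENT TOOL:\n{tool_text}"
)
```

Because, as both hosts stress, the model doesn't always get it right, the output of a check like this is a starting point for human validation, not a replacement for it.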

Angela (30:44):
Yeah, I think you've wrapped that up very nicely there. I think the top three non-compliances I see consistently are trainers and assessors, assessment tools, and training and assessment strategies. Any one of them can trigger one of the others to be non-compliant. So if you're starting with a good assessment tool that you

(31:06):
are validating and you are testing, you're going to have much better success at audit, but also with student completions as well.

Maciek (31:15):
Yep, 100%.

Angela (31:17):
Yep, yeah, okay. So we sort of discussed what are some of the red flags that ASQA are looking at within assessments. Have you got an example of what people should be focused on when they're preparing for an audit with regards to assessment

(31:38):
tools?

Maciek (31:40):
Yeah, the biggest thing, obviously, when we're going into an audit, whether it's an initial application or monitoring, is making sure that the people that know the tools are there. When we're talking about initial applications, we're generally providing tools that are not completed, so the

(32:03):
raw or unanswered tools, because we don't have any students yet. So at initial application it's always advisable to have the person that's either developed them, or the industry expert, there to support that, because often a CEO may not be a trainer and assessor; there's no requirement for the CEO to be a trainer and assessor.

(32:24):
So, yeah, it's really about making sure that whoever is responsible for administering the assessments within the organisation understands them. I have been in audits in the past where the client has been unwrapping the textbooks out of plastic on the day of the audit, right. Oh dear. And so that's not a great look.

Angela (32:49):
It should be dog-eared, with Post-it notes all through it.

Maciek (32:53):
Yeah, look, 100%. You know, that's the whole purpose, right, that we know our product. When it's a monitoring audit or re-registration audit, where there are completed tools, again, you know, we should not be using the audit as the point in time where we're checking our systems. Yes, yes. We

(33:14):
often do, because we haven't had time, or we've been lazy, or there have been other priorities. Realistically, in a perfect world, we should be going into an audit confident that what we've submitted is compliant, right. If your validation approach has worked, if your quality assurance mechanisms have worked,

(33:35):
there shouldn't be a problem. But if you're going to an audit, check your materials before you submit them to the auditor.

Angela (33:42):
It's not hard.
Yes, yes, definitely.

Maciek (33:46):
We now have an opportunity where auditors are telling us. You know, four years ago, we got told... pre-COVID, five years ago now, we didn't know who was being audited, right, what student files. We were just told on the morning: these are the student files I want to look at.

(34:09):
We now know, so there should be no reason why you're not checking those files and making sure that they're compliant before you submit them.

Angela (34:11):
Yeah, yeah. And then it's nice and easy to follow through, to follow the documents. Yeah. Um, and I think the other thing is: don't leave getting yourself compliant to, oh, we've just submitted our re-reg. It should be a process, it should be ongoing. Whenever

(34:35):
we have someone come to us who's getting ready for re-reg, we want a minimum of six months to work with them to make sure that they're on track, and that's what people should be doing. At least six months before you submit, you should be preparing.

Maciek (34:58):
Well, look, I don't disagree, but I also think that we shouldn't be using re-reg as a point of checking.

Angela (35:08):
We should be using it annually, right?

Maciek (35:11):
Yes, it's a point where there's a reminder, but realistically, let's have a culture where we're focusing on our business practices and our quality assurance practices ongoing. Yes, we can use re-reg as a point to remind ourselves, but

(35:33):
let's try and have a culture where, when re-reg comes around, it's like, yep, we're good to go, we know we're confident. You know, we've had this mindset of focusing on compliance for six years, seven years, you know.

Angela (35:48):
So let's keep going. And what I've said many times before is: a culture of compliance. You've got to have a culture of compliance throughout the organisation. But time and time again I see people who come to us who go, oh, we've just had ASQA contact us and we've got an audit, and I'm like, well, you should have been prepared.

Maciek (36:07):
But that doesn't always happen. We're talking about a perfect world, but yeah.

Angela (36:14):
Yes.

Maciek (36:17):
Worst-case scenario, 100%. If you're coming up to re-reg, and look, some re-regs go through with no audit, some go through with an audit, and some go through with no audit but with an audit a year after. So, um, it really depends, I think, on ASQA's workload, risk ratings, risk factors, the number of students that you've put through and the type of scope that you've got.

Angela (36:36):
That will often determine whether you're going to get audited at re-reg or not. Yeah, and I don't see any consistency in how an audit is triggered. So, yes, you've got your re-reg, but you've also got addition to scope. Like, we have people who've submitted an addition to scope and then, oh, okay, now we're going to audit you, whereas

(36:57):
other times they put in an addition-to-scope submission and nothing, they don't have anything. And we're always preparing them for that worst-case scenario: if you do go to audit, you've got all your ducks in a row and you're ready to go. Yeah, all right. So thank you very much, Maciek. This has been awesome talking about assessment tools.

(37:19):
So my final recommendations are: conduct a full assessment review and really have a look at all of your assessments, and schedule validation as a regular thing that's in your diary. Like, I used to do it twice a year, and I used to bring all the trainers in and conduct it twice a year, making sure that your

(37:42):
trainers are meeting the requirements, with the background, skills and knowledge within the units, as well, and then making sure that you're testing. So my recommendation with the testing is: get your assessors to complete the assessment tools. What would be your biggest recommendation, Maciek?

Maciek (38:03):
Exactly what you said. But also, one thing I'm seeing a fair bit of, obviously post-COVID, is people have taken paper-based type tools that they purchased years ago, or even a few years ago, and are putting them into an LMS. And that's fine, we're moving into that digital world, great.

(38:25):
But they are often not reviewing the instructions that were provided in the paper-based version and are copying them directly into it. So I have seen LMSs that say, you know, using a blue pen, answer the questions. And it's in an LMS with auto-graded theory questions.

Angela (38:42):
Or, go to page...

Maciek (38:44):
Yeah. You know, whenever you're transferring anything, use common sense: take the time to read, or create new instructions, again using AI. It's not difficult to create new instructions that actually align to the assessment practices that you've got.

Angela (39:02):
Yeah, you could just put a prompt in: we want to take our paper-based assessment tools and put them into an LMS. Yeah, not hard at all.
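Angela's suggested prompt might look something like the sketch below; the sample paper-based instruction is invented for illustration.

```python
# A sketch of the paper-to-LMS instruction rewrite discussed above.
# The sample instruction is invented; you would load your real text instead.
paper_instructions = (
    "Using a blue or black pen, answer all questions in the spaces "
    "provided, then hand the completed workbook to your assessor."
)

lms_prompt = (
    "We are moving our paper-based assessment tools into an LMS with "
    "auto-graded theory questions. Rewrite the instructions below for "
    "online delivery: remove references to pens, pages and handwriting, "
    "and explain how to answer and submit each task in the LMS.\n\n"
    + paper_instructions
)
print(lms_prompt)
```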

Maciek (39:09):
That would be my final recommendation.

Angela (39:11):
Yeah, all right, thank you, Maciek. Once again, great to catch up with you. Until next time, we'll see you soon. Thank you.