
August 27, 2024 • 52 mins

This episode reveals laboratories' most common pitfalls with implementing AASHTO R 18, such as calibration record-keeping, training and competency evaluation, and internal audits. Learn how to sidestep these issues by implementing robust quality management practices. Listen in to transform your laboratory practices and get tips to move beyond standard compliance.

Read this episode's companion article or watch the video on YouTube.


Have questions, comments, or want to be a guest on an upcoming episode? Email podcast@aashtoresource.org.

Related information on this and other episodes can be found at aashtoresource.org.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Kim Swanson (00:03):
Welcome to AASHTO Resource Q&A.
We're taking time to discuss construction materials, testing and inspection with people in the know.
From exploring testing problems and solutions to laboratory best practices and quality management, we're covering topics important to you.

Brian Johnson (00:18):
Welcome to AASHTO Resource Q&A.
I'm Brian Johnson.

Kim Swanson (00:22):
And I'm Kim Swanson, and we have another one.
It's the first of this season, but it's another Common Findings episode.
Which one are we talking about?
What standard are we talking about today, Brian?

Brian Johnson (00:33):
We're talking about everyone's favorite standard, AASHTO R18, the basis and foundation of AASHTO accreditation.

Kim Swanson (00:44):
It is, and so that means that everyone in the AASHTO accreditation program is accredited to AASHTO R18.
So this is a very wide-reaching episode, I hope.

Brian Johnson (00:55):
It should be, and I did want to mention it, if you had not heard yet.
This is episode three of season five, and this is available not only in audio form but also video, so you can see us staring at a screen or a camera blankly while the other one

(01:20):
talks, which is really exciting. It's riveting.

Kim Swanson (01:23):
It's a riveting video.
So, yes, go check out our YouTube channel, and you can actually see this wonderful dynamic that Brian and I have.
We're matching today, totally unplanned.
So, yeah, it's a great time.
Go check out our YouTube video if you can.

Brian Johnson (01:42):
Yeah, and that's not all.
You can see what my basement office looks like in my house, and you can see the cell that we hold Kim in.

Kim Swanson (01:52):
It does very much look like I am in a prison cell.
It is just my living room or dining room. If I even know what room I'm in, it's my dining room.
It just happens to be a very blank wall behind me.
So very exciting, but it does look a lot like a prison cell, yeah.

Brian Johnson (02:13):
So that's what you can check out on our YouTube channel.
Yes, and I'm sure everybody is just frantically going.

Kim Swanson (02:19):
Yeah, I'm sure everyone just stopped listening to whatever they were doing and went right to YouTube.
So exciting things.

Brian Johnson (02:29):
Yeah, all right, let's get into it.

Kim Swanson (02:32):
Yes.

Brian Johnson (02:33):
Because we don't want to waste too much time.
We're going to talk about common R18 findings, and we're going to do this a lot like we've done the other common findings episodes, where I ask Kim questions that she couldn't possibly know the answer to, and then she guesses, and then we see where it goes from there.
So there will be some questions here, and you can kind of play

(02:54):
along at home and see if you do as well as Kim does, or maybe better.
I hope you do better than me.

Kim Swanson (03:02):
I hope everyone that is playing does better than I do, because I don't know anything.
So this is going to be very exciting for me.

Brian Johnson (03:10):
All right.
So first we're going to ease you into this, and I will not ask the first question, but I'm going to tell you about the most common categories of findings, and they are calibrations, standardizations, verifications, checks, maintenance records,

(03:32):
all of that stuff, which, for the purpose of being concise, I am just going to call calibrations, because most of our customers just call them that.
They're not all categorized as calibrations, but I don't want to have to say that long string of words every time I mention this.

Kim Swanson (03:49):
Okay, all right, it gets tedious.
Duly noted. You're going to refer to them as calibrations, but it's everything that you just said. Got it.

Brian Johnson (03:58):
Yeah, any equipment records.

Kim Swanson (04:00):
Okay.

Brian Johnson (04:01):
So this was a section of data that I pulled from our system over the course of, I think it was almost two years. It was like a two-year data set, and there were 3,717 findings

(04:23):
written on calibrations.
Next it was training and competency evaluations, and there were 2,837 findings written on that.
So that is, typically, the laboratories had not presented records on either training of staff or competency evaluations

(04:47):
of staff, and there could be a number of issues going on that we'll get into, but those two categories were far higher than any other categories, and then below that. Those two were somewhere around the 3,000 range.

(05:07):
And then the next one, the third one, goes all the way down to 371.
So that shows the discrepancy.
Yeah, wow. So, super common issues, and then not as common: organizational issues.
That could be the org chart wasn't updated, or they didn't have one; position

(05:28):
descriptions were not in conformance with R18; a bunch of other things like that.
Then we've got 306 findings on management reviews.

Kim Swanson (05:41):
Okay.

Brian Johnson (05:43):
I thought that would actually be a little higher.
People really struggle with that one.
And then 216 for internal audits.

Kim Swanson (05:51):
Okay.

Brian Johnson (05:52):
And I think that shows progress on the internal audit front, that there were not as many as management reviews, because a lot of times those two are neck and neck with each other.
So those are the most common categories for R18 findings in general.

Kim Swanson (06:09):
So, for some context, there are over 2,000 laboratories accredited for AASHTO R18.
So those numbers, like the 3,000 findings or nonconformities, is that spread out over the entire 2,000 laboratories that are accredited?

Brian Johnson (06:28):
Not exactly. The way it worked is I pulled data from two years of assessment reports from AASHTO Resource and CCRL. So what that means is you may have some laboratories with no findings

(07:16):
in a particular category, and you may have some with 16 or more findings.
So it's really very hard to get a sense of exactly what to expect if you were coming into the program based on these numbers, because it's really dependent on how the laboratory manages the quality program in their

(07:42):
facility.
So I said I would add more information about what type of calibration issues there were, and I do have more data on that.
And I think this is a little telling: of those 3,717 findings, 2,406 were just not presented.

(08:07):
So there was a record, straight up, not presented. So they were operating with something completely unknown. I wouldn't say "out."
So this is another misconception.
People say, well, if they weren't calibrated, they must have been out.
Not necessarily. It may have been fine, and they just didn't

(08:29):
know if it was or not.
They didn't have a record to show that, which is not something you want to have as a policy, that we're just not going to do it and assume that it's in.
Obviously, if you're accredited you can't operate that way, but there are lapses that happen occasionally, and

(08:52):
there are times when people can't find the record at the time, which isn't good either. But it's not as bad as not having it at all.
And then another big drop: we go from about 2,400 to 544 that were missing results.
So that could be.

(09:13):
Let's see. I'll use a thermometer, because that's a typical one.
Maybe they had a thermometer calibration record, but it didn't have the test points that were needed.

Kim Swanson (09:24):
Okay.

Brian Johnson (09:24):
So that would factor in on that count.
Other ones: intervals not correct, so they might have had a 12-month interval when there was supposed to be a six-month interval.
And then some just had details missing: equipment IDs of the calibration equipment, or missing the ID of the equipment

(09:45):
itself, or misidentifying it.
We've run into this issue occasionally where people will replace a piece of equipment and give it the same ID number as the one that isn't there anymore, which you can't really do. No, I don't think you can. That seems like it would be

(10:08):
very confusing. It would be very confusing, and it's hard to tell what you used if that's what you've done. But we've run into that more than a few times, where people really push back on that concept that each one has to have a unique identifier. But that's kind of the whole point: it has a unique identifier so you can

(10:31):
trace back to what went wrong, if something did go wrong, or which equipment was used.

Kim Swanson (10:36):
So, going back just a little bit, you mentioned the intervals not being correct. Now, can laboratories have different intervals than what's stated in R18 if they have the records showing that they don't need to, like they've proven that they can go longer, or something like that?
I believe I recall something like that. Am I wrong?

(10:57):
Is this relevant or no?

Brian Johnson (11:00):
It is.
You're absolutely correct.
It is possible to change the prescribed interval if you have data to support it, and this can go either way.
So let's say there's an interval that says I have to do something every three months, and I have years of data showing that this is just too frequent.

(11:22):
Like, I never see changes on this; I want to extend it to six months or a year. If you have the data to support it, yeah, you can do that.
And, conversely, let's say you've got something that you're calibrating every year, but every time, you find that it's out of spec from overuse or misuse. You need to

(11:47):
shorten that interval, and you don't need any special permission to do that.
You've already proven that you needed to, because you're not able to show that it's still in spec. Either it's drifted out, or some factor is playing into it that you have not resolved.
Or maybe you just do a ton of testing and the interval needs to be

(12:08):
shorter. So you can make those changes on your own.
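The decision logic Brian describes can be sketched in a few lines: shorten an interval as soon as a calibration comes back out of spec, and extend it only when a sustained run of in-spec results supports it. This is an illustrative sketch only; the function name and thresholds are made up and are not prescribed by R18 or R61.

```python
# Illustrative sketch only -- these thresholds are made up and are not
# prescribed by AASHTO R 18 or R 61.
def adjust_interval(interval_months, history, min_in_spec_cycles=4):
    """Suggest a new calibration interval from past calibration results.

    history: booleans, True = equipment found in spec, most recent last.
    """
    if not history:
        return interval_months          # no data, no change
    if not history[-1]:
        # Last calibration found the equipment out of spec: shorten.
        return max(1, interval_months // 2)
    if len(history) >= min_in_spec_cycles and all(history[-min_in_spec_cycles:]):
        # A sustained run of in-spec results may support extending.
        return interval_months * 2
    return interval_months

# Four clean three-month cycles may justify a six-month interval:
print(adjust_interval(3, [True, True, True, True]))   # 6
# An out-of-spec result halves a twelve-month interval:
print(adjust_interval(12, [True, False]))             # 6
```

Either way, the point is the same as in the conversation: the change is driven by documented results, not by convenience.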

Kim Swanson (12:13):
Now, do you have to have policies and procedures in your QMS describing how and why you change your intervals, documenting that process?

Brian Johnson (12:23):
Or no.
You don't necessarily have to have a procedure written for how you did that, but you will have to show the data and show how you did it.
There is an AASHTO standard that kind of takes you through that decision-making process, and you can follow that. I believe it's R61.

(12:46):
See, now I'm going to have to check that and make sure that's accurate as well.
Good news: it was R61.
Other items with calibration include the lack of uncertainty on calibration records for measurement standards.
There were 145 labs that did not have uncertainty, and 84 of

(13:06):
them also did not have the 17025-accredited calibration, which is probably why they didn't have uncertainty.
So those issues, people are still figuring out how to deal with that.
So, question for you.

Kim Swanson (13:25):
Yes, okay. This is my time to play. Okay, ready.

Brian Johnson (13:30):
What do laboratories typically struggle with when resolving calibration nonconformities?

Kim Swanson (13:38):
I would say with proof of, or evidence that, it was actually calibrated, like the corrective actions they took were actually taken.
No? Is that what you're asking?

Brian Johnson (13:53):
Well, that's a big part of it.
I think a lot of times what happens is that they resolve part of it, or they don't really know what to do and they haven't looked back at all of the requirements.
So these things are often layered. You've got your R18 requirements, and you've got your requirements in the test method or whatever other standard it's

(14:15):
used in, and they may not look at those two items together to see, oh, what test points do I need?
But, as it is with so many issues that occur, a lot of

(14:37):
times the root cause of the problem is not identified, and a lot of it is communication. There's a lack of communication between the laboratory and the agency they hired to perform

(15:11):
those calibrations, or vice versa. So that would be one of the things.

Kim Swanson (15:19):
While the provider that the laboratory chose may be 17025 accredited, they didn't ask for a 17025 calibration.
So they didn't get the information that they needed, even though the calibration agency is accredited for it.
They didn't provide that level of service because it was not

(15:43):
specified by the laboratory.
Is that what you're kind of talking about?
Brian Johnson (15:45):
That happens.
That is part of what I'm talking about.
The other thing is, okay, the calibration agency should ask the laboratory, well, what points do you need? What are you using your equipment for?
And then they can have that conversation about how to deliver the services that that laboratory needs.
Another issue that we've found can happen is the calibration

(16:09):
agency's technician might show up and perform the work, and they might find some problems and not communicate those to the laboratory, and then leave them with a bunch of records, some saying that things meet specifications, some that don't.
But if there's no communication about that, those can just go

(16:32):
right into a folder or into a file, or even harder, a file on somebody's computer or in an inbox, and never get looked at until it's too late.
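One simple guard against records quietly landing in a folder unread is to triage them as they come in, so out-of-spec or incomplete results get flagged immediately. A minimal sketch, assuming each record carries a pass/fail flag (the record format and field names here are hypothetical):

```python
# Hypothetical calibration records; the field names are illustrative only.
records = [
    {"equipment_id": "TH-01", "item": "thermometer", "in_spec": True},
    {"equipment_id": "SV-07", "item": "sieve", "in_spec": False},
    {"equipment_id": "BAL-2", "item": "balance", "in_spec": None},  # no result
]

def triage(records):
    """Split incoming records into in-spec, out-of-spec, and incomplete."""
    ok = [r for r in records if r["in_spec"] is True]
    out_of_spec = [r for r in records if r["in_spec"] is False]
    incomplete = [r for r in records if r["in_spec"] is None]
    return ok, out_of_spec, incomplete

ok, bad, missing = triage(records)
for r in bad:
    print(f"FOLLOW UP: {r['item']} {r['equipment_id']} did not meet specification")
for r in missing:
    print(f"CHECK RECORD: {r['item']} {r['equipment_id']} has no pass/fail result")
```

The mechanics matter less than the habit: every incoming record gets a read and a disposition before it is filed.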

Kim Swanson (16:43):
Yeah, so it's just assuming that, well, they didn't say anything, so it must all be within specification, which is not always the case. So the laboratory has to actually verify on the record what was actually said.
Brian Johnson (16:58):
That is correct.

Kim Swanson (17:00):
They can't assume no news is good news in this
instance.

Brian Johnson (17:03):
Definitely not, and it is a challenge.
I mean, I don't want to make it sound like it's a lack of effort. Sometimes it's just a lack of availability, because the contact person at the laboratory might just be running around all the time, yeah, and it's hard to find them. And the person's like, I have to leave these with somebody, or I

(17:24):
have to send this to them after the fact. And typically they'll send them the official one after the fact anyway, but they might not be able to easily have that conversation that they need to. I don't know; there are a lot of things that can go wrong.
That is a common issue.
Okay, let's get into training and competency evaluation.

(17:47):
So, all right, what do you think the main issue was with this? So we talked about there being a lot of findings related to this. It's number two, with a big gap between two and three, on training and competency evaluations. What is the main reason that this happened? Or, not the

(18:07):
main reason, but what was the main issue?

Kim Swanson (18:10):
I'm going to take the lead from the calibration one and say records were not presented.

Brian Johnson (18:17):
You nailed it. Yes, that is correct.
So score one for Kim Swanson today.
So most of them were just not presented, and then a big drop-off. And this one actually kind of surprised me: it was 2,482 instances of a record

(18:37):
not being presented, and only 355 times that a test method or practice was missing from the record.
So maybe they had a record for somebody, but it was missing one or two of the test methods that that technician performs normally. These are areas where a laboratory can struggle due to,

(19:08):
not necessarily a lack of communication in this one, but

(19:29):
just a lack of keeping up with what is going on at their laboratory. So they may have picked up a new technician, or they may have started doing new testing and forgot to document the training, and they may not have a great system in place for doing that.
You know, one thing that we've been working on at AASHTO Resource is an improved onboarding process, and I think that would mean putting together a checklist at your laboratory, and this is one of those things.
That's outside of R18.

(19:51):
It's outside of all of the standards to have something like this, but you really have to think about those non-standard things that are helping you operate your quality management system better, and implement those.
So put a checklist together: when I hire somebody, this is what I need to do. And one of those things should be making sure that they have training

(20:11):
records, making sure they've been trained on everything they're going to do for you.

Kim Swanson (20:17):
And a similar checklist for when you're adding a new test to the scope, right? Like, making sure, are the people trained? Do we have it documented?

Brian Johnson (20:25):
So, not only with new hires, but with new additions to your scope of testing. That's right. And the ones that have done this really well typically will maintain a matrix of all of their staff and all the tests that they perform, or all the activities that they carry out. That way they can easily keep track of who needs what, and

(20:49):
whether they have any gaps in what they are saying they offer, and whether they can actually deliver those services.

Kim Swanson (21:00):
Yeah, and I think your clarification on the activities they perform, and not just the tests, reminds me of, I think, one of your pet peeves. Maybe not; maybe I'm just projecting. But that's assuming that a technician knows how to calibrate or check equipment because they know how to run the test. Isn't that something... I don't know.

Brian Johnson (21:25):
I feel like we've had a conversation.

Kim Swanson (21:26):
I feel like we've had a conversation about that before.

Brian Johnson (21:28):
We have.
So that was one of the main issues I was trying to get added to R18: that there needed to be documented training for any technician who's being asked to also calibrate, or, you know, the other words I mentioned in the beginning, standardize or check

(21:50):
equipment that they are using. Because that is often overlooked, and often assumed by management to be something they will know how to do.
But it is a quite different process than performing the test with the equipment, and so there does need to be care taken to make sure that they know what they're doing before you

(22:11):
just say, here, do this.

Kim Swanson (22:14):
Yeah, but that is currently not in R18.
That's just kind of a best practice that laboratories should follow.

Brian Johnson (22:20):
That's correct.
But it's so critical, and it comes up so often, that to me it should be in R18.
So we will continue to work on that and see if we can get that added.
But for those of you who understand how the AASHTO standards are developed, you'll know why that can't just be done.

(22:43):
And for those of you who don't, it's because I am a staff member at AASHTO. I do not vote on what gets added to the standards.
So it is the Departments of Transportation materials engineers, or their designees, that vote on the additions, removals, or edits to the AASHTO standards.

(23:07):
So if the collective wisdom is to not include that, then it does not get included, even if I am trying to get a change made.

Kim Swanson (23:22):
Yes, you can only do so much, because these are consensus standards. So it is not just what Brian wants. That's right; it is the AASHTO standard, not the Brian standard. And same with the AASHTO accreditation program.

Brian Johnson (23:35):
We are not operating based on my whims.
There is a big process, and we've got tons of content on that, so I'm not going to get too into it.
All right, let's move on.
This is another issue that comes up on this: do you think that the problem lies with the implementation of

(24:09):
the requirement, or that there is a possible problem with the requirement itself?

Kim Swanson (24:16):
Oh, so, me knowing nothing, this is a great opinion question. I'm going to say, again, you said no right and wrong, but just from context clues, I would have a feeling that it's probably more in the implementation than the requirements. But it could be a good half and half.

(24:38):
It could be a good mix of both; I don't think it's just one or the other, most likely.
But if I had to lean, I would say it was implementation, and a lack of communication and a lack of clarity on how to implement those correctly.

Brian Johnson (24:51):
Yeah, I think you're probably right on that one. And I know you say "knowing nothing," but you've been around; you've absorbed a lot of information that you might not want, but you have, over the years. And I think a lot of it is implementation, based on the gross amount of findings that there are on this topic.

(25:15):
I say that because we're looking at these, and the outliers are very high, but they're not just a case of, oh, you got unlucky there and you got some nonconformities.

(25:35):
There are some problems with how difficult it is for people to keep up with this stuff.
So it does make me wonder if there are some things that can be done to make it a little easier to conform. And when I say easier, I don't mean, oh well, it doesn't matter, we should just punt on this concept

(25:58):
and people can just kind of do what they want. What I mean is, maybe some of the specificity of the requirements is too stringent, and maybe there are some unnecessary items in there in the details.
Because ultimately, if I felt like people were still doing a

(26:19):
great job and this was still coming up, I'd say, oh, it's definitely a problem with the requirement. I don't feel that way.
There are definitely some improvements that need to be made with the quality of testing, but I think that we do need to look at the standards anytime we see a large number of nonconformances, to see if there are some changes that could

(26:40):
be made. And I think that there are.
But it's going to take some risk, and anytime you loosen anything up, you're taking on some risk, right? Yeah.
So there will have to be some reflection, and some

(27:03):
consideration of what risks are worth taking on and whether they're worthwhile.
I'm not sure how much we'd want to loosen up, but I think that there are probably some changes that could be made.

Kim Swanson (27:12):
Yeah, and maybe it's not loosening up but streamlining it, making that process easier, so maybe they don't have to look at all of the other data points. You know, maybe it's just a consolidation of things in one area instead of loosening up.
Because I don't love that, as a user, as the public, using

(27:35):
the benefits of the testing, just driving my car down the road, or across a bridge, and not worrying about it collapsing. You know, I don't love the idea when you say loosening up. But I would agree that there are always likely opportunities to streamline the process and make it more clear and concise. But also, like you said before, are

(27:57):
the requirements making a difference? Does it matter? Is there data to support that it doesn't matter whether you check this X, Y, and Z, or something like that?

Brian Johnson (28:06):
So yeah, and I think it's a mixed bag.
There are some areas where I think there is very little evidence to support the need to do certain things, and there is plenty of evidence to support the need to do others.
So it is a mixed bag, and we just have to keep thinking more about it and not just assuming that everything that is

(28:29):
already the way it is is there because it needed to be.
That's all part of continual improvement of these processes, so we abide by that concept and we'll continue to work on that.
Okay, for organizational issues, we've got 146 findings on the org chart,

(28:53):
with 120 findings on position descriptions.

Kim Swanson (29:03):
So, with the org charts, why does it matter that they're accurate and up to date, and why should laboratories care about that?

Brian Johnson (29:14):
There are a couple reasons why.
Number one, they need to know what their organizational structure is at that laboratory, and it's good for everybody on their staff to know what that is.
Number two, it is a way for them to keep track of who is working

(29:37):
there and who is under each person on the org chart: who's supervising whom, and how all of the different departments relate to each other.
The third reason is that it also helps identify the

(30:00):
requirements for certification.
So if you've got a requirement for the technical director to have a PE license, then we would want to see if that is the case: is there a technical director identified on the org chart, and,

(30:23):
if so, does that technical director maintain a PE license?

Kim Swanson (30:29):
I was asking because part of my job sometimes is the paperwork, and, I mean, not this type of paperwork, but if I don't understand the reason why it's important that this is up to date, that's moving to the bottom of my to-do list and I'll get to it when I get to it.
So I think it's helpful for laboratories to know why it's

(30:51):
important that that's up to date, because it's not just a hoop you have to jump through.

Brian Johnson (30:56):
It's actually useful information, not only for accreditation, but for their day-to-day laboratory practices. It is, and I would say there are elements of it where they're just not good at the paperwork part.
Right? Like, sometimes people are terrible with the software

(31:16):
that you have to use to make an org chart, you know, as simple as drawing the lines connecting these different boxes of staff members. And, how do I organize it? Our organization is really complicated; how do we make it look like this makes sense?
One thing I don't love is that sometimes people will have one

(31:41):
and then they'll make us a special one, to make it easier, what they think is easier, to show conformance. And this is not allowed. Not that we would know; this is the hard part. Sometimes we wouldn't know if you're doing this. Yeah.

(32:04):
But what can happen is sometimes they will remove people from the org chart who aren't certified, so that we don't go asking about whether that person is certified or not. And you can't do that.
You are basically hiding someone, or not disclosing some information that you need to disclose.

(32:25):
There are other issues that occur too, like they don't know who is in charge of the technician. So you might have a complicated situation where, well, it depends on the project. So for certain projects, this engineer is in charge of what

(32:46):
they do, you know, the quality of work that they do, and on another project it might be somebody else, and you've got to figure out how you want to structure that.
Maybe you have a box of supervisors, and then they all go down to the technicians. That would be fine.
That's one thing I want to make sure people understand too: it's okay for us to ask you questions, like, okay, how

(33:12):
would this work? Or, can you explain this in more detail?
In some cases, you may have to add some kind of note to your org chart, or maybe you just have to have a document that supports it, to kind of include those little details. Or sometimes it's just going to be an explanation that you give us every time, and you might get tired of doing that.

(33:32):
But if you do get tired of doing that, you can just document it.
But where we've seen some real problems occur is where you've got a quality manager that's not really connected to the rest of the organization, and that is going to lead you down a path

(33:53):
that is not quality-oriented. And I would not like to see that. Ultimately, quality needs to be in line with the operations and the performance at that laboratory.
So you never want to have a situation where, oh, we just

(34:14):
have this quality person that kind of checks in here and there, gives some feedback, and then they're not really involved other than that. They need to be able to enact change.
Because if you are focused solely on productivity without quality, then you cannot maintain that quality mindset

(34:38):
that you need in order to continually improve and, ultimately, to maintain your accreditation.

Kim Swanson (34:44):
Yeah, that definitely makes sense.
Thank you for clarifying all that for me and our listeners, not just me.
I hope our listeners got something out of that, because my day-to-day is not going to change, but hopefully this prompts somebody in a laboratory to take meaningful action.

Brian Johnson (35:00):
Yeah, yeah, you'd hope so. But if you have any other questions about that, you can talk to any of your quality analysts, or me, or Amy, or even Kim.

Kim Swanson (35:21):
Apparently! I will say, you mentioned how people don't like connecting the boxes and lines together, and you can change the orientation to portrait instead of landscape in PowerPoint as well, so you can do it. I'm sure there are YouTube tutorials on that as well, if you have questions about it. But I find that the easiest way to make org charts, for me and my needs,

(35:44):
is in PowerPoint.

Brian Johnson (35:45):
All right.
Next, we're going to get into management reviews, which is a common issue, but not as common as I expected.
I think it's just because each lab only has one management review, right? So you're only going to have one or two findings on this one, whereas you could have a multitude on calibrations or training or competency files.
So anyway, 218 just didn't present management reviews,

(36:08):
which is not at all surprising to me, because people still struggle to figure out what the management review is.
And in light of that, now, Kim, you have experienced some management reviews. You've participated in them; you have read them. What do people get wrong about what a management review is?

Kim Swanson (36:28):
Well, I will say, from all of the webinars I've been a part of with Tracy Barnhart, our quality manager,

(36:50):
talking about this, one of the big takeaways for me is that a management review is an input into an internal audit. They both look at the big picture, but they're different, as in the management review is an input to an internal audit.
But I think you asked what the biggest misconceptions were

(37:11):
about it.
I will say the frequency that you have them. Frequency.

Brian Johnson (37:26):
Okay, that's part of it, and I wouldn't say the frequency necessarily, but the when. When do we do it in relation to the internal audit?

Kim Swanson (37:43):
oh, okay, so that that's so.

Brian Johnson (37:45):
So that is part of what people struggle with. Like, okay, we scheduled our internal audit and our management review on the same day. That's not going to work, because those are two totally separate things.

Kim Swanson (37:55):
Well, as one is the input to the other, you can't really have an internal audit before your management review, because that would be an incomplete internal audit, would it not?

Brian Johnson (38:04):
Maybe it's a sampling.

Kim Swanson (38:05):
I mean, you can talk about the last one, right?
Oh, yeah, I guess that's true.

Brian Johnson (38:10):
So they're both inputs to each other, in a way. When we're having a management review, we will talk about the internal audit results, the last internal audit results: how did we do on that, and how did we do on our improvement opportunities? You can talk

(38:32):
about all sorts of stuff related to the last internal audit. And during your internal audit, you're going to check to see if you conducted your management review last year, and did you resolve non-conformities from it. So you do talk about each in relation to the other. That part is kind of interesting.

(38:55):
But with the management review, I think a lot of times people just confuse it with the internal audit, and this is an area where I would say R18 is partially to blame, because the wording used to describe the management review makes it sound like it might be some kind of audit, and it really isn't.

(39:15):
Tracy, I think, aptly describes it as kind of the State of the Union address for your laboratory. So you take all this stuff, quality-related issues and otherwise, it could be productivity, it could be acquisitions, facilities, all kinds of different stuff, and you present it to top management, and it gives them an

(39:39):
opportunity to ask you questions, to give you the go-ahead on making some changes or spending some money or using some resources that you didn't know you had. It just kind of closes the loop in the organization between your laboratory quality and operations and top management.

(40:00):
If you're wondering why that matters: if you don't do that, you're going to have some serious issues keeping up with everything. We have seen that laboratories that don't do a good job with management reviews often can't keep up with their

(40:22):
expectations, because they're not allowed to spend money on anything. And it might not even be that they're not allowed to; it's that management was never informed. So it gets back to the communication that we talked about at the very beginning of this. If the top management who has to make the decision to allocate resources isn't informed that resources are needed, they're

(40:43):
never going to come to you and say, hey, I've got a great idea. Why don't you blow a bunch of money on this stuff I don't know anything about?

Kim Swanson (40:52):
Yeah, no, that won't happen.

Brian Johnson (40:54):
Instead of spending it where they do know it's needed, right? So it's really important that that communication loop is closed so that things can just function properly.

Kim Swanson (41:07):
Yeah, and I do know, and correct me if I'm wrong, because my brain might be misconstruing things, but we do have a policy for new laboratories gaining accreditation that we don't expect them to have a management review right away, or an internal audit right away. Am I correct on that? I feel like we had an episode on that, but can you explain that?

(41:29):
Thank you.

Brian Johnson (41:31):
Can you explain that more? I can. Yeah, we allow six months, kind of, of leeway for them to operate. So if they're a new laboratory and they're trying to get accredited, we don't expect them to have conducted a management review or an internal audit before they're even working, because they really haven't had anything to evaluate yet.

(41:51):
So we give them some time to do that, so they can get accredited, and then six months later we would expect them to send us their completed internal audit and to have conducted a management review. Now, they can do it anytime in that time period, but they have to at least show us that they know what it is and

(42:11):
that they've had a run-through at least once.

Kim Swanson (42:14):
Yeah, and that's definitely an instance of don't perform them on the same day, and don't wait till the last day. Maybe do the management review at three months and then, at five months, do the internal audit, or something like that, to make that work appropriately, because it will be a red flag if you said you did them on the same day.

Brian Johnson (42:33):
Yeah, for sure. And I think it's okay, you can do them however you want, because the real value is going to reveal itself later on anyway. You're going to see it in the subsequent ones, the next year, the next two years, the next three years and so on, and things should change over time. Don't just make this a static activity.

(42:56):
You know, think about how you're doing it, and the management review is a good time to think about how you're doing it and ask around. You know, is this working for us? Those kinds of questions should really be asked during those management reviews. What else can we do? Does this make sense? Are we doing what we need to do? Does this follow our vision and mission?

(43:17):
Those kinds of questions are great if you've got them.

Kim Swanson (43:22):
What's the next thing on the list? That was management reviews. Internal audits is the last one, right?

Brian Johnson (43:27):
Yeah, internal audits was the last one, and that's your basic expectation: people haven't been doing them, or they're missing some details. Now, for R18, this is an area that I think needs some improvement. It says that the entire QMS needs to be audited.

(43:50):
That wording is going to lead to some inconsistency in how that is carried out and how that is assessed. So one thing that I would like to do with an upcoming R18 ballot is to actually list out: you need to include this,

(44:12):
this, and this. And I think the list that we talked about today is a good start for what needs to be included in that. Actually, recently somebody reached out to me and said, hey, I was written up for this aspect of, you know, something was missing in the internal audit, and my immediate thought was, I don't know that I would have looked for that as an assessor. So I know that there are some improvements we can make in

(44:35):
the consistency of the way we're looking at this. That's something that will be in an upcoming R18 ballot. It was in a past one that didn't make it through, but it wasn't related to that exactly. One thing I did want to talk about with internal audits:
Some people who feel that they don't have time to perform an

(44:55):
internal audit will try to hire a consultant to do that, and they ask, is that okay? And it's like, well, probably not, but it depends. You know, I'm not going to say completely no on that, because I'll use Tracy for an example. Let's say she retired and she said, you know what, I'm going to be a consultant now.

(45:15):
And we say, oh boy, it would be really great, because she's such a great internal auditor and she knows our system in and out, if we could hire her as a consultant to carry out those internal audits. You know, can we do that? I would say absolutely, that would be a great option. But if we hired an external auditor that we see once a year

(45:37):
to perform an internal audit, I would say no, that's probably not appropriate, because that person is not going to have the knowledge required to carry out an effective internal audit. So yeah, like a lot of things, there's no black-and-white answer on that; it's just going to depend on the situation.

(45:57):
But I would say, if you're not sure, don't do it.

Kim Swanson (46:00):
Channeling my inner Tracy Barnhart: nobody knows your business better than you. So that's the value of internal audits. And to challenge you, Brian, if you hired, you know, Tracy as a consultant, I think that's only going to work for a couple of years, because our processes will change. So that's not a long-term solution; that's a

(46:21):
temporary band-aid on a situation, because in five years she's not going to be as familiar as she is, you know, right after.

Brian Johnson (46:28):
So yeah, and she's not going to want to do it anymore.

Kim Swanson (46:31):
No, I would.
I mean, I know that for sure.

Brian Johnson (46:34):
Yeah, but in the made-up scenario of that, right. So, like I said, it's not a static thing. You've got to constantly look at it and think about quality improvements. Other topics that have come up: not enrolled in proficiency samples; not reporting test method

(46:55):
reports in accordance with the standards. Sometimes people don't have the up-to-date standards. That happens. Actually, that one I thought would have been a lot higher number, but it was only 44 instances. That's not bad at all. Yeah, not bad. I think the subscription services have helped a lot in

(47:17):
that regard, in that more people are going to them, so that if they need something they can easily pull it up.

Kim Swanson (47:23):
And I will say, AASHTO standards are only available digitally now, so there are no hard copies. Just an FYI, in case people didn't realize that.

Brian Johnson (47:34):
Right. Yeah, that is good information, because some people still call it "the books," right? And then, this may surprise people. In case you're wondering how many times falsified records showed up: only 38. I would say "only 38"? It should be zero, right? But it's not. You know, human nature is what it is, and there were 38

(47:54):
instances of falsified records that were documented in those findings over the course of two years.

Kim Swanson (48:00):
And I will say, you said that's two years, but that's, you know, over 1,500 laboratories, I'm going to assume, and under 2,000.

Brian Johnson (48:07):
Yeah, probably.

Kim Swanson (48:08):
So I mean, percentage-wise, that's not horrible. But obviously, in the continual improvement area, we would want integrity and trust and all of that; we would like that to be zero. But you're right, human nature. There will always be shady people doing shady things.

Brian Johnson (48:25):
There will. And I will say, we only write up falsification if there's really no question about it. So there were other times where there might be some wiggly wording used by the assessors, like "did not appear" or something, which was kind of not conclusive

(48:45):
about the falsification. So there were probably more cases of it than just 38. But that's another thing to watch out for on those internal audits. Kind of pay attention, like, hmm, why does this look like this? Why are these numbers so similar all the time? Kind of keep an eye on that.

(49:06):
And, as Kim said, you know your business better than anybody. You know your personnel better than anybody. You are more than likely going to be able to figure out what's going on a lot better than an external auditor would.

Kim Swanson (49:18):
Yeah, that was Tracy. I mean, I was channeling my inner Tracy. I'm not taking credit; that is all on our quality manager, Tracy Barnhart. But that does bring up that your internal audits prepare you for the external audits, for the assessment. So there shouldn't be any surprises, really, in the assessment, because you are doing the internal audit.

(49:39):
You should be aware of those issues ahead of time and be trying to fix them, or be in the process of doing corrective actions for them, based on your internal audit.

Brian Johnson (49:50):
That's right. Yeah, if you do a good internal audit, you should really not have any problem. And definitely don't be afraid of the external audit. They're not going to find anything that you haven't already found, if you're doing a good job.

Kim Swanson (50:02):
The internal audits for laboratories don't go over the specific test procedures, right? They don't do a demonstration of that, or do they? I can't recall.

Brian Johnson (50:11):
They don't. But what they would do instead of that is look at the competency evaluations and the training records, and if they look at their training program, they should be made aware if there is a need for additional training before the actual

(50:31):
external assessment. But yeah, things can come up. And I think a lot of times with the test demonstration, people are running the wrong version of a standard. Maybe they have an old version of a standard that they've been trained on, and, you know, sometimes it's just hard to break old habits. Or maybe they're using a state method or a Corps of Engineers standard or something else.

(50:52):
So those kinds of things can happen, and those are relatively minor issues. But it is possible to have a relatively clean report if you're keeping up with everything. We've seen it time and time again.

Kim Swanson (51:04):
Anything else that we're going to talk about today? Because this is turning out to be a very long episode, I think.

Brian Johnson (51:10):
Yeah, sorry about going so long on some of those things, but through the magic of editing I'm sure you will be able to cut out some of the waste.

Kim Swanson (51:18):
We'll see.
We'll see how that turns out.

Brian Johnson (51:22):
Thanks for hanging in there. I'm sure it's still going to be long, even if you cut it, but thank you for listening, and for checking us out on YouTube if you did that, and hopefully we'll see you on the next one.

Kim Swanson (51:33):
Thanks for listening to AASHTO Resource Q&A. If you'd like to be a guest or just submit a question, send us an email at podcast@aashtoresource.org, or call Brian at 240-436-4820. For other news and content, visit aashtoresource.org.