
July 1, 2022 72 mins

A risk-based approach to Computerized System Validation has been around for over 20 years, so what's the hype about CSA? Join us in this episode as we are joined by our amazing guest, Dr. Bob McDowall, to explore this question.

Bob McDowall is an analytical chemist with 50 years of experience including 15 years working in the pharmaceutical industry and 29 years as a consultant to the industry and suppliers to the industry. He has been involved with the validation of computerized systems for over 35 years and is the author of the second edition of a book on the validation of chromatography data systems published in 2017 and a book on data integrity for regulated laboratories in 2019.

Bob is the writer of the Questions of Quality (LC-GC Europe) and Focus on Quality (Spectroscopy) columns. One such column, from September 2021, was entitled Does CSA Mean Complete Stupidity Assured? It is available online at:
 https://www.spectroscopyonline.com/view/does-csa-mean-complete-stupidity-assured-

*Disclaimer: Podcast guest participated in the podcast as an individual subject matter expert and contributor. The views and opinions they share are not necessarily shared by their employer. Nor should any reference to specific products or services be interpreted as commercial endorsements by their current employer.

This is a production of ProcellaRX


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Dori Gonzalez-Acevedo (00:10):
podcasting from Alexandria, Virginia, just a few miles from Washington, DC, where we all hope doing what is right the first time is everyone's top priority. This is Software Quality Today, presented by ProcellaRX, a podcast about the trends and challenges of software quality testing and computerized system validation, and the people who are leading the way. Hear interviews with special guests and news from customers and vendors. I'm your

(00:31):
host, Dori Gonzalez-Acevedo, and welcome to today's episode. Welcome to another episode of Software Quality Today. I'm your host, Dori Gonzalez-Acevedo. Today I'll be bringing you a lively conversation that I had with Bob McDowall. Bob is an analytical chemist with over 50

(00:52):
years of experience in pharmaceuticals as well as consulting. He has been involved with the validation of computerized systems for over 35 of those years. He is the author of a book on validation of chromatography data systems, published in 2017, and a book entitled Data Integrity and Data Governance: Practical Implementation for Regulated

(01:12):
Laboratories, published in 2019. Bob is the writer and editor of Questions of Quality in LC-GC Europe and Focus on Quality in Spectroscopy. One of the columns that he recently wrote, in September 2021, was entitled Does CSA Mean Complete Stupidity Assured? I'll have the link in

(01:33):
the show notes available to you. So without further ado, please welcome Bob McDowall. Well, welcome, Bob. It's a pleasure to have you join Software Quality Today. I'm really excited to be able to have this chat with you.

(01:54):
Well,

Bob McDowall (01:54):
Thanks very much, Dori, for the invitation. It's a nice opportunity to have a discussion about computer system validation and computer software assurance.

Dori Gonzalez-Acevedo (02:10):
Yeah. So I hope our listeners will really take to some of the topics we're going to talk about today. I think you and I can both heat this up a little bit. Before we get into it, because it's a very, very rich conversation for both of us, I'd love for you to give some frame of reference of who you are in the

(02:32):
industry and how you came to be in the place that you are today.

Bob McDowall (02:37):
Okay. Essentially, I started life as a forensic toxicologist. I analyzed dead bodies for a living, and I have a PhD in death, which makes me the least likely person you want to sit next to over a meal. I then saw the light, and the additional funding available in the

(02:59):
pharmaceutical industry, where I worked for 15 years: 10 of which at SmithKline when it was French, and five years at Wellcome. And then for the past 29 years, I've been a self-employed consultant. But really, I'm a frustrated academic, but

(03:19):
academia pays peanuts, and my wife would prefer a better standard of living. But why did I come here? I was in the wrong place at the wrong time. If I go back to the 1980s, I went into my boss's office, and he said, you've got a few

(03:41):
minutes, you've got a bit of spare time. And I made the mistake of saying yes. So I ended up with a laboratory information management system project to run, which I enjoyed, and I edited the first book on the things that we did, and more

(04:01):
importantly, should have done but didn't. And I started getting into computer validation when the software services manager came up with a great big grin on his face and said, well, Bob, isn't it nice to know that if I screw up your LIMS project, you go to jail? And I thought, hang on a sec, this isn't right. And

(04:26):
then I found out it was. So in October 1986, I attended a three-day training course in Amsterdam on computer validation, and I've been involved in computer validation ever since. It varies from the sublime to the ridiculous:

(04:50):
taking screenshots of just about everything under the sun, to trying to be a lot slicker, learning from your last project and trying to improve on that. So there's a number of things there. That's how I'm sitting here today.

Dori Gonzalez-Acevedo (05:13):
Great, thanks. You know, being in this industry for such a long period of time, we've seen a lot of guidances and documents from a variety of agencies. And like you said, you're a frustrated academic, but you are a prolific author, and you write, and you're also part

(05:35):
of ISPE. So can you tell us a little bit about all of your involvement over the years?

Bob McDowall (05:41):
Okay, well, I suppose since I did my PhD studies, I've been writing articles. I was allowed, both at SmithKline and Wellcome, to keep publishing and presenting. And as well as the LIMS book, I've written two books on validation of

(06:04):
chromatography data systems, and also another one on data integrity and data governance for the regulated lab. With ISPE, I guess I joined probably in the mid-1990s, if only to save money on buying the

(06:26):
publications. But then I've been involved with a good practice guide for IT infrastructure, the second edition of the lab guide, and then over the past few years input and review into the records and data integrity guidance, plus also Data Integrity by

(06:50):
Design and Data Integrity Key Concepts. So that's my input from there.

Dori Gonzalez-Acevedo (06:58):
Yeah. And so, you know, ISPE for me is always the go-to, and one of the things that has been so helpful over the years is to be able to go back to a standard, and to industry colleagues that are kind of dissecting it and putting out in layman's terms some things that can be done in

(07:20):
practice. And so I appreciate you and others that have done that for everybody.

Bob McDowall (07:25):
I think I'd rather correct you on that, if I may. It's a guide, it's not a standard; even the editor of the GAMP guide says it's subject to further interpretation. And on occasions, I will do that. For example, I modified a GAMP

(07:49):
category 4 lifecycle in the GAMP 5 guidance, and I condensed it down for some of the laboratory systems that I work with, in terms of validation, because you can streamline things a little bit more. So it's a guidance

(08:12):
rather than a standard. It's a great guide. Having said that, there's a lot of people involved, and they're all volunteers, and they also get involved with regulators as well.

Dori Gonzalez-Acevedo (08:28):
Right, yeah. Thanks for clarifying that point. And it really is one of the hearts of, I think, the nuanced conversation between CSA and CSV, which we'll talk about, right? They're all guidances. Even the FDA puts out guidances, right? And it's how we interpret them and what we do with them as an industry that really matters.

(08:51):
Right?

Bob McDowall (08:53):
Yeah. I think the one thing that is interesting is you compare the regulations on one hand and the guidances on the other, and both contain the word should. However, in the more recent FDA guidances, it defines

(09:14):
should as something that is recommended; you can deviate, but you need a good justification. And of course, on every single page, it states that it contains non-binding recommendations. So it's one of those things.

Dori Gonzalez-Acevedo (09:33):
Yeah, great. So I want to talk today about the article that you published, I guess in late 2021. Or is it September 2021? Yeah. So the title, which I loved and I thought was very provocative and needed to be talked about, was Does CSA Mean

(09:54):
Complete Stupidity Assured? So tell us a little bit about how you came up with that title, and what was your

Bob McDowall (10:05):
I think the genesis was, if you go back in time to the FDA's Case for Quality, they've been talking about CSV being a bottleneck. Now, again, most of my experience is in pharmaceuticals, and a little in

(10:28):
medical devices. So I have been involved in 21 CFR 820 and ISO 13485, and also IEC 62304 work in the past, but it is not the main part, so I would preface my comments with that. The work that was

(10:50):
coming out of that seemed to suggest we use critical thinking, we use trusted suppliers, and everything else. And if you look in the publications from the FDA, in terms of what guidances they're going to be issuing, it's come up, I guess, over the last three or four years, that

(11:10):
there's going to be a guidance on computer software assurance, which is touted to be a complete replacement for computerized system validation. Which, okay, we move on, and things evolve, and I don't have a problem with that. But then you start seeing a lot of publications from people, presentations. And for a

(11:33):
regulated industry, it says essentially, don't wait for the guidance from the FDA, just do it. And that concerns me a lot. We are a regulated industry, heavily regulated. And because

(11:54):
of that, it is really a situation where we want the FDA guidance, even if it's a draft guidance. And of course, some of the draft guidances are updated probably a little slower than glaciers recede. Having said that, they can get their

(12:18):
act together. I mean, the Part 11 scope and application guidance was draft and final within a year, actually less than that, probably eight months. But for the most part, where's the guidance? Because that is the definitive issue. That's the definitive baseline from which we work. And you get

(12:39):
people talking about this, publishing it, which is fair enough. But when they say ignore things, ignore the guidance, don't wait for it, just do it, I have a major concern, because it's being filtered by those people. You cannot see the definitive guidance from which you can draw your own

(13:04):
conclusions. So it comes from the FDA, through a person, through either presentation or publication, and is filtered down to you. Now, I'm not saying that the people are malicious or anything like that, or have ulterior motives. But based on my experience, I want to read that

(13:27):
guidance, and I want to interpret it myself. So I became increasingly concerned because the FDA, in my view, have actually abrogated their responsibility of getting the guidance out. I wanted to write about

(13:50):
this, because from where I'm coming from, actually, I don't think you need CSA. And I have been writing two columns, both for analytical magazines. One is the Questions of Quality column, which I've been writing for 29 years now in LC-GC Europe, and my Focus on Quality

(14:14):
column in Spectroscopy magazine, which has been going now for 22 years. And I tend to want to write to get people to think, whether you agree with me or not. I'm not

(14:36):
worried. I don't want you sitting on the fence saying, well, possibly. You're either going to agree with me or totally disagree, right? And so it's written in that sort of provocative style. It's also written, I would hope, in a fairly blunt way. Well, it's very sarcastic.

Dori Gonzalez-Acevedo (15:00):
It's extremely factual. You lay out the timeline of guidances that have been previously written, what they state, all of those things. What I was very impressed with in the article in general was how factual it was and how balanced it was on both

(15:20):
sides of the argument, to then want to, you know, argue.

Bob McDowall (15:25):
I think, because we are a regulated industry. And let me be totally honest here. Having written books and educated people on data integrity, I've got to be very honest here: for 15 years I worked in the industry, from 1978 to 1993. I

(15:53):
worked under GLP, under GMP, and I never once read a single regulation; it was always interpreted for me. But being a consultant, you suddenly realize, hang on a sec, I've worked in two companies where there were two totally different interpretations of the same regulation. Therefore, I have to go back to basics. And what we

(16:16):
find even now is, people do not read the regulation. And therefore, if I write something based on my opinion, you've got to be able to derive that opinion from the regulation or the guidance, and then say, this is my opinion. You can't just

(16:40):
come out and say, this is my opinion. And that's where I'm concerned, where there's no guidance from the FDA on CSA. Now, to come back to your original question about how I got the title. Well, let's go back to the 17th

(17:05):
century, where the father of chemistry, Robert Boyle, wrote The Sceptical Chymist. Well, I'm probably the 21st century's cynical chemist. So I wanted something that would attract people's attention and get

(17:26):
people to say, hang on, what's this idiot talking about, and to go down into more detail. I gather it's created some interesting feedback; most of it I've received directly has been very positive. I'm

(17:53):
fairly certain there are a few images of me floating around various organizations, with large leaders in them.

Dori Gonzalez-Acevedo (18:05):
I wouldn't necessarily say that,
but I, you know, being

Bob McDowall (18:08):
got a fairly good impression by

Dori Gonzalez-Acevedo (18:11):
being a consultant as long as I have as well, right? Every single client that I deal with has a very different view or lens through which they interpret, right? So to your point earlier, now you see as a consultant, rather than being in a company. And of course, there's those that

(18:32):
jump from company to company, and they take their philosophies and ways of being with them. But for the most part, consultants get to see a very, very wide spectrum of how to interpret, and what companies think about the regulations, right? And one of the things I'm often asked is, given that I see all of that, what do I know that works better or not, right?

(18:56):
How does one apply these principles in a way that is most cost-effective, most, you know, balanced risk approach, all of these sorts of things? And to get to some of the heart of the things that you talked about, like the least burdensome approach: that has been a thing forever. And now, suddenly, it's popped up on the CSA kind of radar, but that has always been the intent

(19:18):
of the regulation from day one. So why, why did

Bob McDowall (19:23):
you go back to 820? It's in 820, and in the preamble to 820, as well as in the General Principles of Software Validation, which is 20 years old. Yeah. And I guess, where I'm coming from, the article was trying to put in fairly, what I was hoping, clear

(19:52):
terms why we don't need it, because we already have the regulation and the guidance there. And that's the important thing, if you've got it. And this is, I think, part of the problem. If we go back

(20:18):
to 1993, when I first started consulting, I went to a company that shall remain nameless, to protect the guilty. But I was presented with two filing cabinets that were six foot, two meters high, two meters wide. And there's two of

(20:41):
them, full of screenshots. Now, I'm thinking, why? Because there's some requirement for witnessing and everything else. If we go back into the 80s, this is what was started. Yet, if you look in the regulations, the only

(21:02):
requirement in GMP is for copying the master batch record, on the basis that you don't want to make five tons of drain cleaner, you want a fine pharmaceutical product. So it's risk management where it's needed. And I think there's also a reluctance to take risk, especially with QA, and also

(21:32):
with some management. Now, let me give you an example where I was working under 820 and ISO 13485. We were implementing a learning management system that was ultimately going to have a quarter of a million users on it. And I went through

(21:56):
the regulations, and I said, here's the rationale: you don't need to electronically sign your training records, you just need attribution of action. Right? So we do the nine-week validation, and we come up to the last week.

(22:17):
And on Monday of the last week, the guy I'm working with gets an email from the vice president: what's this, you're not putting in electronic signatures? We sign our training records now; we're going to do it in the new system. Now, this is stupidity on stilts. Management not understanding the regulations, and being stuck

(22:42):
in what I would technically call the Middle Ages. And so what happens here is that we now have four days, or three and a half days, to understand how electronic records and electronic signatures work, write the test scripts to demonstrate that it

(23:04):
works, update the URS, the traceability matrix, the risk assessments, all the other things, implement it, test it, and write the validation plan. We just about made it.

Dori Gonzalez-Acevedo (23:22):
Yeah, I still see this activity happening today. I also see auditors insisting that software vendors have training systems, quote unquote, validated, and Part 11 records. It's really absurd, exactly to your point. But how do we change this, if that's

(23:46):
what's out there? And that's been there for decades now, right? I think part of this CSA movement was a hope to spark some sort of conversation about this practice, right?

Bob McDowall (24:01):
I think, actually, this is where ISPE and the GAMP Forum come in, because they have published a document, a guidance on enabling innovation. And one section in there was agile development. And they go to town in that section to say, look, you don't

(24:24):
need electronic signatures, you need attribution of action, all this stuff. And if you've got development using Jira or DevOps or an equivalent sort of application, it's so easy to audit, it's a doddle. And if you put the

(24:48):
gates in where you can't, youcan't test until you've done a
peer review and mocked up allthe comments. It's beauty Cool.
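The gating idea Bob describes, where work cannot move to testing until peer review is complete and every review comment has been dealt with, can be sketched in a few lines. This is a hypothetical illustration; the field names are invented, not a real Jira or DevOps schema.

```python
# Sketch of a workflow gate: a development item may only enter testing
# once its peer review is done and every review comment is resolved.
# The dictionary fields here are hypothetical, not a real Jira/DevOps API.

def can_enter_testing(item):
    """Return True only when the quality gate is satisfied."""
    return item["peer_review_done"] and all(
        comment["resolved"] for comment in item["review_comments"]
    )

item = {
    "peer_review_done": True,
    "review_comments": [{"resolved": True}, {"resolved": False}],
}
print(can_enter_testing(item))  # False: one review comment is still open
```

Because the tool enforces the gate, its records by themselves demonstrate that nothing reached testing without review, which is the attribution of action Bob mentions.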

Dori Gonzalez-Acevedo (25:00):
Yeah, I laugh. So I drive a Tesla, right? And I get software updates over Wi-Fi, right? I'm fairly certain they don't use Part 11 signatures in software development, and yet I entrust my Tesla to drive me and my children, you know, to and from where they need to be. So I feel like this risk conversation also

(25:21):
comes full circle back around as well, right? It's like not being willing to have a nuanced, hard conversation about the risks associated with certain systems. Certain systems are extremely risky, right? There are the ones that are absolutely directly impacting product quality, safety, patient safety.

(25:41):
Yeah. But then there's others that are not

Bob McDowall (25:45):
at all. Yes, I can only agree. But even those that directly impact product quality, because of the nature of the software, you can take a simpler view. And this was something where we developed an approach going back 13 or 14 years now, where

(26:11):
we condensed the whole of the validation down into a single document.

Dori Gonzalez-Acevedo (26:19):
Okay, I've done the same thing too with my customers, because it makes sense, right? Why have all these five, six, seven different documents when you can just summarize them in one little short thing?

Bob McDowall (26:29):
Yeah. And if you go back and look at the regulation, it says intended use. So you take, from 211.63, intended use, and you define just intended-use requirements.

Dori Gonzalez-Acevedo (26:45):
I think folks struggle with that, though. I think that understanding what intended use is, and what that means in terms of a predicate rule, is still very, very hard to define for some organizations.

Bob McDowall (27:00):
The easiest way to get around that is to actually
draw the process out

Dori Gonzalez-Acevedo (27:05):
A process map now, what a novel idea. Yeah.

Bob McDowall (27:09):
Well, actually, the best piece of consulting I did: I met the guy in April that I worked with at a company, and we did a two-day process mapping. And then, a month

(27:30):
later, another two-day redesign. And the redesigned process, to use electronic signatures, has been running now for 18 years unchanged. It has gone through several upgrades in the software. But it was just for one site; it is now six business units globally, the same

(27:57):
process. And I'm thinking, wow, best piece of work I've done. And we published it in 2005. If we did it again, the mapping would still be the same, but the level of testing would be a lot more reduced now, because we'd be

(28:19):
leveraging from a trusted software supplier. Right?

Dori Gonzalez-Acevedo (28:27):
Because we weren't, right. So I want to hold on to that trust, because you raised it a couple of times in your article, right? There needs to be a level of trust with your software vendors. There needs to be a level of trust with your testers, you highlight, as well. And I wonder if you can talk a little bit more about that,

(28:49):
because I wonder if that's part of the QA conundrum, in terms of when to release. Some of their angst is around this trust issue.

Bob McDowall (29:06):
Okay. Well, let's look at it from this angle: you're buying a system, and you've got to assess the supplier. So what do people normally do? They will send out a supplier questionnaire, and the supplier

(29:30):
will fill it out. You could file it. I would always verify one or two key elements there, especially if it was GAMP category 3 software. But that assessment is only part of the job, in my view. As you get into a category 4 system, I

(29:57):
would do a one-day assessment of software development. And you can do this remotely, or you can do it face to face. If anyone who's listening has a software supplier in Hawaii, I can do a very good deal for you. Okay, I suspect Dori, living a bit closer, will do a better

(30:18):
deal. But that's beside the point. The issue is, how do they develop software? Do they use a waterfall model? Do they use requirements specs, all this sort of thing? I am also a trained auditor, and I spend my life, when I'm

(30:43):
looking at software development, trying to tell people to do less rather than more. Because what you find is that they have software development based on Jira or DevOps or an equivalent piece of software. And then you get the pharmaceutical auditor

(31:06):
in: where's my requirements spec? Where's this? Where's that? And these guys are working away in the engine room; that's perfectly adequate. And then they're putting in all these bells and whistles that basically say, we want this, we want that, which are totally unnecessary. And that's where I

(31:28):
think auditors, or some auditors I should say, because obviously we have to exclude us two from that, bring in a degree of separation. You need to be able to be the people that say, I don't care what you call it: do

(31:49):
you do it? And the key thing that an auditor has to ask, whatever they're auditing, is: are these guys in control? Can they demonstrate that they're in control, and do we have traceability from where they start, be it a requirements

(32:13):
spec, a marketing spec, a user story, an epic, whatever you want to call it? And can I go through the whole of the lifecycle, and have they done sufficient? If they've done that, and you don't have to take a lot of time, you can do it remotely, you can do it on site.

(32:36):
But if you have a report, this is where you need to start to differentiate. You may have a category 4 application, but many of the functions within that are actually category 3; you only parameterize them.

Dori Gonzalez-Acevedo (32:57):
Yeah, there's a great distinction, when I was rereading your article, in how you talk about that, because the majority of tools that are purchased on the market today are made for general consumption, right? They're made for a variety, a host of industries across the board; Jira, you've mentioned, you

(33:18):
know, all of these things. They're not made for life sciences specifically, right? So let's be real. But what they do do well is they're highly configurable, and they're configurable in ways in which a life science company can use them. And back to your process mapping sort of conversation: the company needs to understand what they're going to use that for,

(33:40):
right? And then figure out what is important about that. It's not on the software vendor to say, oh, well, by the way, if you guys only want to use this for GxP systems, then this is what you should do. These are big multinational companies here. It's up to the company,

(34:00):
and what you want to use it for really matters, right? And if you want to go through this extra due diligence of writing formal URSs and FRSs and all this stuff, for functions that already exist, that have been tested a gazillion times by people that know their stuff inside and out, and if you think you can do better than the people that actually developed

(34:23):
and were paid to do that testing, I don't know. Yeah, kind of arrogant.

Bob McDowall (34:31):
We still need a user requirements spec, or some form of spec at the front end, which defines the intended use. Now, in terms of managing risk and intended use, let me go to a lab example of a chromatography data system. You

(34:52):
look in the CDS, and you have somewhere between 10 and 15 different calibration models you could use. Which ones do you actually use? You document those in your URS. And

(35:17):
then, as part of your assumptions, exclusions and limitations, you document: I'm not using these, and they're excluded from the validation. And I use assumptions, exclusions and limitations a lot, because when I first started computer validation, there was no GAMP guide. And the only thing that

(35:39):
we could find were the Institute of Electrical and Electronics Engineers software engineering standards. Now, if you're building a nuclear power plant, or you're putting in, you know, a microprocessor chip factory, you're going to be up at the top end. But we want something that is fit for

(36:02):
purpose. So you take these standards and documentation, and you adapt them for what you want. But in standard 829, which is software test documentation, there's a test plan, and I still use that to this day. And in section 6.3

(36:22):
of that test plan is a section called assumptions, exclusions, limitations, or better known in the trade as alibis, excuses and lies. But I won't say that publicly. Now, what you're able to do with these assumptions, exclusions and limitations: you cannot test everything. And you're sitting

(36:45):
your approach on top of what the supplier has done. So if you've done your assessments of the software development process, all the functions that are essentially GAMP category 3 you're not going to test; you may have to use them indirectly when you validate your workflow. But that's the

(37:09):
way you're going to be trying to construct things. So it's a matter of what you have the supplier test; you won't be able to do everything, because you've only got a day, and it's an audit. But you have those functions, and you also

(37:31):
have your assumptions, exclusions and limitations. Do you test all the different combinations of, say, user roles and access privileges? Because that will keep you tied up for a few weeks. These are the sorts of things you can

(37:51):
try and reduce. And it's trying to keep a level head and see what you can do. And the other thing, and I've been caught up in this, is don't customize: change the process to match the system

(38:17):
rather than change the system to match your crappy process.
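Bob's earlier point about role and access-privilege combinations can be made concrete with a small counting sketch. The roles, privileges, and the one-positive-plus-one-negative selection rule are hypothetical illustrations of a risk-based reduction, not a prescribed method.

```python
# Sketch: reducing role/privilege test combinations with a risk-based rule.
# Roles, privileges, and the selection rule are hypothetical examples.
from itertools import product

roles = ["analyst", "reviewer", "admin", "read_only"]
privileges = ["create", "modify", "approve", "delete", "export"]

# Exhaustive testing: every role paired with every privilege.
exhaustive = list(product(roles, privileges))

# Risk-based reduction: per role, one positive case (its highest-risk
# allowed privilege) and one negative case (a privilege it must not have).
allowed = {
    "analyst": ["create", "modify"],
    "reviewer": ["approve"],
    "admin": ["create", "modify", "approve", "delete", "export"],
    "read_only": [],
}
reduced = []
for role in roles:
    if allowed[role]:
        reduced.append((role, allowed[role][-1], "expect allowed"))
    denied = [p for p in privileges if p not in allowed[role]]
    if denied:
        reduced.append((role, denied[0], "expect denied"))

print(len(exhaustive), "exhaustive cases vs", len(reduced), "reduced cases")
# 20 exhaustive cases vs 6 reduced cases
```

The documented rule, recorded under assumptions, exclusions and limitations, is what turns the untested combinations from an omission into a justified decision.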

Dori Gonzalez-Acevedo (38:22):
Say a little bit more about that, because that's one thing that, over the last several years, I've tried very hard to persuade people not to have happen. But there's always those stragglers that say, oh, but we have to have it our way, right?

Bob McDowall (38:37):
All right. There's a Scottish comedian called Billy Connolly. (I know Billy, yep.) Yes. And he has a class of people called stupid but saveable. Okay, so you have to persuade those people to not do custom software development, because I've done it, and I know

(39:02):
what the pain is. And if you do it once, you don't do it again. So let me go back to last century, when I had what my wife would call a proper job, and we were implementing the first LIMS I was involved with at SmithKline. We had to take what essentially was a sample-driven

(39:24):
LIMS and make it into a protocol-driven LIMS, and we were quoted 14 weeks of custom software. It took 151 weeks. We then had to validate it. And then, of course, you find the problem: I know what it says in the specification, but what

(39:46):
we really want is this. And of course, because it was a fixed-price contract and the supplier had lost a lot of money, you're going to pay for this through the nose. And it's not just going to be peanuts; it's going to be every single time you want to update the system, because it's mostly hard-coded. So my

(40:06):
second system was: yes, we'll have a protocol driver, it will be part of the standard system, and I will not accept custom code. So that's it. My view is always: look at your process, and

(40:26):
where at all possible, change the process to match the standard system, because it'll be a lot easier: you won't have custom code to upgrade and link with any changes, and you can upgrade a lot quicker and a lot easier. Rather than leaving it to when you get the dear esteemed customer letter that

(40:50):
says you've got nine months left before your system's dead. And that's really the way I would like to go: change the way that you work. And I've got examples where you look at a process, and you've got two or

(41:10):
three ways to run through. And that's not counting the undocumented processes that people sneak out in confidence when you're doing the mapping. And then, of course, if you automate that, it's going to take a lot of time to automate. And it's going to take even more time to validate. And that's the big problem.

Dori Gonzalez-Acevedo (41:34):
Yeah, I mean, trying to have a company. Now, again, this is excluding the companies that are software as a medical device, or folks that are actually making software products for life sciences, which is a whole different category. Yes. But we're talking about those applications that are readily available off the shelf, you know, mostly configurable, ones that

(41:57):
most companies already have, right? Because if we look at the life science industry, a lot of the companies, I would argue, have 50-60% overlap of all the applications that they already purchase, right? So this one has the same as this one has the same as this one, right? There might be a couple of different flavors in there. But for what we're talking about, it's usually pretty much the standard same.

(42:20):
So no customization is really, truly required in order to get your work done. Yeah, we also see, I see that a lot in the SAP world, right? Where,

Bob McDowall (42:29):
yeah, well, I'll leave you to the suffering and pain. I'll keep out of that. But the one thing, if I can come back briefly to configuration versus customization: you have to be very, very careful. Because marketing departments of certain organizations have discovered,

(42:51):
we can't call it customization. Even if they give you a language, it is always "configuration". So if you go to the GAMP guide and look in Appendix M4, it says if you go to a vendor-supplied language, treat it as Category 5. So some LIMS are going to be in

(43:14):
that sort of situation. As long as you control it, it's fine.

Dori Gonzalez-Acevedo (43:20):
And tested, right? Because you're gonna have to test that configuration. Yeah, exactly. Yeah. Yeah, great points. And with that, though, that also implies a very natural critical-thinking skill set that goes along with this whole entire thing that we're talking about, right? Oh, yeah.

Bob McDowall (43:42):
Yeah. Now, talking about critical thinking. I mean, I make the point in the article: I was auditing a clinical system a few years back. And I was looking at the test script, and it started in November and it finished in February, and I'm like, why has it taken so long? We were waiting

(44:05):
for the password to expire. Yeah. Now, the normal way of getting around that: you've got, say, a 90-day standard password, which tends to be the industry standard these days, unless you've got a few people that want to be a little more lazy and push it up to 120 or even 190. But say 90 days. You would

(44:28):
normally, in the test script, go in, change the configuration to one day, and wait for it to expire. But these days, that's not seen as a good idea, because you are now messing around with the configuration, and in today's data integrity environment, that's a total no
(44:51):
no. But look and see: how does the computer actually determine time? You have a trusted time source that sets the time of the clock. And how does the computer work? Well, it's got a little piezoelectric crystal that vibrates, and

(45:12):
the computer counts; it converts the number of vibrations into time. And guess what? It's on every single computer. So if you want to wait 90 days to test the expiry of the password, that's great. On the other hand, if we come back to the assumptions,

(45:35):
exclusions, limitations: why are you testing it? If you've got a trusted time source, and you've got a piezoelectric cell on every single system, what's the point? So it is easier to exclude. But document it, that's the important thing. Because all of this is important to put down, to
(46:00):
say we're not doing certain things, or we are doing certain things, but there are limitations. And it's that thought process that is really critical. And this is where I think these assumptions, exclusions, limitations come in.
Because if you go back to 1970, when the US military were

(46:26):
asking Barry Boehm to predict what software was going to be like in 1980. And he says, I've got some good news and some bad news. So they said to him, what's the good news? There is none. He said, the bad news is the software situation is gonna get a lot worse. And he actually gave them this report: here's

(46:52):
a diagram, a simple diagram of some software. And if you could test one way through this pathway per nanosecond, and you started when Jesus was born, by the time this report was published, you might be halfway through testing it. Now, management's rather unwilling to allow that amount of time to

(47:16):
test software. Oh, and by the way, he said, this is a simple program flow segment. There's something like 10 to the 21 different pathways through it; you can't test it. So what are you going to do? Focus on what is your intended use. Yeah, and
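The arithmetic behind the anecdote above is easy to check for yourself: 10^21 paths at one path per nanosecond is on the order of tens of thousands of years, which is why exhaustive path testing is a non-starter and intended use has to drive the test scope.

```python
# Back-of-envelope check of the figure quoted above:
# 10**21 paths, tested at one path per nanosecond.
paths = 10 ** 21
seconds = paths / 1_000_000_000           # 1 ns per path
years = seconds / (365.25 * 24 * 3600)    # convert seconds to years
print(f"about {years:,.0f} years of exhaustive testing")  # about 31,689 years
```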

Dori Gonzalez-Acevedo (47:33):
so but to your point, so this, this kind
of critical thinking aboutlooking at each system, doing
that analysis, writing up yourexclusions, writing up your
limitations, all of thosethings, one take time, take, you
know, perhaps differentperspectives in order to get all
that documented? Where in yourarticle, and I also have seen

(47:57):
this over the years, right, this kind of bucketizing approach, where we want to have this tailored checklist and do it the same way every time sort of mentality, rather than having the conversation. It's almost as if people don't know how to have that conversation. I don't

(48:18):
know, maybe not know how to facilitate that conversation. What are your thoughts?

Bob McDowall (48:23):
I was at the first face-to-face meeting I've been to in two years in Italy, at the end of April. And somebody in the audience said, when we're implementing some automated training records, the QA department want to have

(48:49):
everything signed electronically. And I said, you want to fire the QA department, because they don't understand the regulations. Tell them to go and read the regs. So the first thing: read the regs, read the guidance, understand it, and then you start to work out from there what you really need to

(49:11):
do. That makes it so much easier, and so much more effective. What I would say is documenting these assumptions, exclusions, limitations is actually relatively straightforward, because as you start to design, some things are going to be fairly straightforward. You're going to

(49:32):
exclude pulling the plug out ofthe back of the computer. Right,
right. Unless you're testingunless you're testing a UPS.

Dori Gonzalez-Acevedo (49:42):
Yes, I agree.

Bob McDowall (49:44):
But there are other things as you go down, and you start to write, say, well, actually, I could do two or three things here, but what really is my intended use? And you document it at the time you write your test script. And that's where I think it is very important that you keep

(50:08):
aware of what you're trying to do, and keep in mind the requirements that you've written. I think you're slipping out of the window there. Oh, sorry. Am I putting you to sleep?

Dori Gonzalez-Acevedo (50:25):
No, it's just my half sitting half
standing at my desk throughoutthe day? Yeah, I think it's, um,
you know, having clearrequirements is also an art
form. Right? Yes, writing,writing good requirements,
writing testable requirements,writing requirements that are

(50:46):
technically understood in a way that, again, if we're looking at general software, some of those requirements are technically easy to do from a software vendor perspective, not really understood from an end user perspective, right? And really differentiating the two.

(51:07):
Clearly. The other part of that, though: when I was reading through your article, I still see folks wanting to do the full FMEA sort of, you know, assessment on standard software. And that kind of worries me, because it just

(51:28):
doesn't make any sense, rather than really taking a critical look at their user requirements from a risk perspective, and really understanding why. Yeah.

Bob McDowall (51:41):
It's interesting, and I don't wish to criticize the GAMP forum, but I will make the comment that they've even put FMEA into, certainly, the first edition of the Good Practice Guide for IT infrastructure control and compliance. Now, I never use that, because doing failure mode

(52:03):
effects analysis in an IT environment, when you've got a lot of standard stuff... On the other hand, there are two very good IT-based risk assessments, the first being an old British Standard, 7799 Part 3. Now, the first two parts have

(52:24):
migrated into ISO 17799, and then into 27001 and 27002. But the 7799 Part 3 has got a risk assessment in it. There's also a NIST Special Publication, 800, either 31 or

(52:46):
41, that is IT-related. And they focus on what's the value of the asset that you are managing. So if you have a batch of product where it's $5 million per batch, you want to make certain that your cybersecurity and all the other

(53:07):
stuff, backup, is right. And that's only one batch; how many batches of data are you keeping? Or you're keeping your data for registration, and you've got a billion-dollar-a-year product. So these are the sorts of things. So it defines the assets, and then it starts to look at the vulnerabilities.

(53:29):
What have you got in place? What are the vulnerabilities? What do you need to do? And you can conduct one of those risk assessments in half a day with a small team of people. And you don't have to worry about an FMEA where the Martians are landing next week.
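As a rough illustration of the kind of half-day, asset-value-driven exercise Bob describes: score each asset by its value and exposure and rank the results. The asset names, scales, and value x threat x vulnerability scoring below are invented for illustration; they are not taken from BS 7799-3 or NIST's text.

```python
# Illustrative asset-based risk ranking (hypothetical assets and 1-5 scales).
assets = [
    # (asset, value 1-5, threat likelihood 1-5, vulnerability 1-5)
    ("Batch record system ($5M/batch)", 5, 3, 4),
    ("Registration stability data",     5, 2, 2),
    ("Training-record system",          2, 2, 3),
]

# Rank by a simple composite score so the team sees where to focus first.
ranked = sorted(assets, key=lambda a: a[1] * a[2] * a[3], reverse=True)
for name, value, threat, vuln in ranked:
    print(f"{value * threat * vuln:>3}  {name}")
```

The output puts the high-value, high-vulnerability batch system at the top, which is exactly the prioritization a small team can agree on in an afternoon, rather than enumerating failure modes exhaustively.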

Dori Gonzalez-Acevedo (53:52):
And this makes a great point, because one of the things that I try to help my clients with is really, you know, getting to the data itself, right. And I think it's something around, when you talk about and you lead your workshops on data integrity, right, what we're reviewing and approving matters, right? And it's not just necessarily the process that we go through, but what

(54:15):
is it, right? So while a learning management system need not be, quote unquote, fully validated in this very old-school traditional way, what's probably in some ways more important is the content of the training that is being done and delivered, right, and who is reviewing and approving the training itself, rather than just posting trainings and

(54:38):
all that sort of thing. And so getting back to the focus of the actual data, right, yeah, is really where the heart of the whole conversation, I think, needs to go.

Bob McDowall (54:49):
I think that's a good point. Because if you look at the way that computer validation started, it's top-down. I always take the view that there's no point in buying a computer system and implementing it unless you get significant business benefit out of it.

(55:09):
Otherwise, give the money to my daughters and my grandson, and they will waste it far more effectively than you ever will. But the point is, so you get business benefit. And you think about compliance coming along as a secondary objective. So look at the business benefit, look at the process, the business

(55:31):
benefits from that. And then you start to identify the data, and how you manage it. What the GAMP forum did with a now out-of-date good practice guide is that they had, in 2005, a Part 11 one, I think it was, on compliant Part 11 electronic records

(55:56):
and signatures. And what they tried to do was identify the records, and then they did a bottom-up approach. But the problem with a bottom-up approach is you never get a look at the process and the process efficiencies. So it never really took off. What I would advocate is really both a top-down and a

(56:18):
bottom-up approach: top-down from the process, look at configuring the software, and then look at the data and look and see what the vulnerabilities are. Control those. And then that's what you end up validating. And again, that streamlines the business benefits. Because if

(56:42):
you spend six months, a year (I know companies spend even longer than that) validating a system, you don't get an efficient business process. You've wasted all your money.

Dori Gonzalez-Acevedo (56:57):
It's a great point. So to that point, when we talk about agile project management and agile deployment as a methodology, right, one of the things I've been advocating for a lot now is, you know, this MVP sort of way, right? And that, you know, we can validate and control things along the way; we don't have to wait this six, seven, eight

(57:20):
months process that many of these organizations want in order to see the business value of what it is that we're going to want to do, right? Because you can't assess every possible scenario. But what you can assess is some of the best, the quick hits, or the things that will improve your process immediately, right?

(57:40):
Quick wins: get those in place, put a process together, test those against intended use, and then start using the system right away, rather than having to wait. Some of these approval cycles are two, three weeks long in order to get, you know, a requirement spec or a summary report approved. And

(58:02):
is that really adding business value to the critical quality attributes that the companies actually want to document and maintain and do metrics on? And so I feel like it's a counterproductive argument. And how do the business process owners really justify waiting, or letting project teams wait, in a

(58:26):
waterfall approach in that regard?

Bob McDowall (58:31):
Yeah, I think the number of signatures on a document needs to be very much curtailed. You need technical review, technical approval, compliance review, and release: a maximum of four. And in fact,

(58:53):
when I come down to test scripts, I try and negotiate upfront with QA to have just two signatures: technical content or technical review, and

(59:15):
release. The post-execution review includes a QA review, but, differently to many people, I have an overarching test plan which is approved by QA before it goes, so QA have a

(59:36):
good oversight. But there's not a lot of value that a QA signature has on a test script. Agreed, when you want to move quickly. And if I come back to the issue I had with my learning management system a few years ago, where we had to implement things very, very quickly, having just two signatures enabled us to write the test

(01:00:00):
script, execute it, and then have it reviewed quickly, right, within that short period of time.

Dori Gonzalez-Acevedo (01:00:11):
Do you think it's a fear of failure that holds our industry back?

Bob McDowall (01:00:16):
No, I think it's basically CYA. And "if it was okay last inspection, it's okay this one". And they don't think about the C in cGMP. That's the biggest problem. And, changing tack slightly, if you look at

(01:00:41):
the CSA side, and look at some of the warning letters that are coming through, FDA are getting really cheesed off with industry. You look at the Stason and Tender warning letters from July 2020. You look at the BBC warning letter from August of last year. Here, with BBC,

(01:01:06):
you've got instruments where they have the ability to store electronic records, but the company didn't use it. They just print it out. And they were cited for that both in raw materials input and in finished product testing. Stason and

(01:01:27):
Tender were absolutely screwed to the back wall by the FDA, with two different parts of the CFR cited and identical word-for-word remediation. And it's pages of it.

Dori Gonzalez-Acevedo (01:01:47):
Yeah, I find that very disheartening, when I hear any client wanting to take a PDF of an electronic record and slot it into another system. I mean, the amount of additional noncompliance that they create by not using the systems as they are designed to be used makes them more at risk for

(01:02:12):
compliance risks in general, right. Yeah. So what do we do from here, Bob? Like, how do we continue to help an industry that clearly, I think, wants some help? Because I think, with the CSA movement in general,

(01:02:32):
people want something, right? They want to be told something; they want, it's like, a prescription, right. And ProcellaRX, my company, is a prescription for software quality. Because I feel like that's what they really want: they want a very clear roadmap of what they should do. But at the same time, that's not what the regulatory bodies want,

(01:02:54):
at all. Right?

Bob McDowall (01:02:55):
I think the key thing here is you need to have a flexible approach to computer validation. You cannot have one size fits all. You've got to be able to say, okay, if you look at the lifecycle models in GAMP

(01:03:16):
5, soon to become GAMP 5 Second Edition. If you look at those models, Category 3 is basically specify, build, test. And therefore, that's where you can condense virtually all of that into a single document with

(01:03:38):
traceability, with assumptions, exclusions, limitations, and a pro forma report, etc. And that's where I think you need to be able to have that. I think people should use prototyping a lot more. Because this says, I've got an idea,

(01:04:03):
I've got a generic spec to select this system. But that's generic, and this is system X, version Y. I've got to be able to understand how it works. So I need training. I need management to say, okay, you're gonna get training, and you now need to play with the system. And in my view, for a large system, you

(01:04:29):
really can't do it part time.

Dori Gonzalez-Acevedo (01:04:31):
No, you can't. That's right. And if you don't, it takes a long time to really understand that system in its entirety, to then get the enablement and adoption across the large organizations that we're talking about, right? We're talking about very, very large organizations. So you need that time to play without the constraints of all of that, the

(01:04:52):
heaviness of the validation that's under

Bob McDowall (01:04:56):
That's where, if we come back to CSA, that's where I would see, and documented: you play around with the system, you look at it, you see what it's like. And from there you write your second version of your URS that reflects what the system is and how you're going to use it. And you play around with the configuration settings so that

(01:05:18):
you can really get a far better understanding of how things work. Once you've got that, then you can try and reduce the amount of testing by assessing the supplier. And of course, this is a two-edged sword. If you find that they're working on

(01:05:43):
sealing wax and string, you've got a problem, or, I should say, backs of envelopes and undocumented testing, then you've got a major problem. But most companies don't do that. So even if you don't have a formally assessed or certified

(01:06:04):
QMS, if there is a QMS, then I will be quite happy. Providing it meets certain requirements, then you can start to reduce the amount of testing by saying, okay, if it's a Category 4 system, how many of my requirements are actually

(01:06:25):
Category 3 functions? And then from there, focus it on what is configured. And don't forget, 211.63 talks about adequate size. And what have you got to do? If you look at GLP, it's even better, because 58.61 talks about

(01:06:48):
adequate capacity. So where are your pinch points? Because you need to make certain you've got enough capacity or size to handle those pinch points.

Dori Gonzalez-Acevedo (01:07:00):
Yeah, that's a great point. Because in software quality testing, performance testing is actually really super important, and often not thought about when we talk about CSV in general, because it's not one of those boxes that you check off from a traditional CSV model perspective. But when we're talking about software in

(01:07:21):
general, performance, availability, these sorts of things, especially as we shift towards a SaaS model for lots of these systems that we're talking about, right? Security and performance for SaaS are much, much more important than whether or not, you know, that particular feature is in its full function. You might be in beta from

(01:07:45):
a feature perspective, but if their security and their performance are not up there for the size and volume of what you're going to do, it doesn't matter, right. So yeah, all right. So Bob, last words, last thoughts?

Bob McDowall (01:08:04):
I think the last thought would be: I would encourage people, or companies, to go back to basics. Look at the regs. And if you look between the lines of the Part 11 scope and application guidance from 2003,

(01:08:25):
one of the things that it did say was look at the regs. Not in so many words, but it says "where the predicate rule says", and that says, go back and read. So back to basics: look and see what it says, look at some of the guidance documents. And I

(01:08:45):
would advocate, in the absence of anything else, the General Principles of Software Validation. Some of the development side of things will probably be out of date, as it's 20 years out of date. But there are publications on the use of agile; I think

(01:09:06):
there's a joint FDA one, or Taylor from medical devices was involved in one of these AAMI publications,

Dori Gonzalez-Acevedo (01:09:22):
AAMI, AAMI? I have the link here. We'll add it to the show notes. Yeah.

Bob McDowall (01:09:28):
Yeah. I think that looks at agile in a regulated area for developing software. And I think it's really trying to look at what you want out of suppliers and out of systems, and part of that would be: retrain your auditors,

(01:09:53):
because some of the things that they're asking for... and I work not just for the industry, I also work for software suppliers selling into the industry. If an auditor asks for something, just say, okay, I can't understand why you're asking for this. Show me what it says, where it says it, in the

(01:10:18):
regulation. Is it the auditor's opinion? Or is it actually a regulatory requirement? Or is it in a regulatory guidance document? Show me. And I think the one thing is that the software suppliers are more interested in getting a sale, and

(01:10:39):
therefore they start to roll over. I think they need to push back a bit more, politely, not aggressively, but to really do things. And, coming back to companies: really assess what you do. Do you need screenshots for everything? No. You can use the system and the audit trail to self-document, and

(01:11:04):
validate the audit trail. It makes so much sense. Yeah, keep it simple.

Dori Gonzalez-Acevedo (01:11:13):
Keep it simple. I love it. Well, Bob, thanks so much for sharing your time and thoughts with us. I really appreciate it.

Bob McDowall (01:11:21):
Thank you for inviting me. I have enjoyed our conversation. I hope I haven't monopolized

Dori Gonzalez-Acevedo (01:11:26):
it. Not at all. Not at all. I always welcome the conversation, and I appreciate you taking the time from across the pond, so it's late at night for you, staying up for us. I appreciate that as well.

Bob McDowall (01:11:41):
Okay, all right.
Thanks very much.

Dori Gonzalez-Acevedo (01:11:43):
Well talk to you soon. Take care.

Bob McDowall (01:11:44):
Okay, that's good.

Dori Gonzalez-Acevedo (01:11:49):
Thanks for listening to Software Quality Today. If you liked what you just heard, we hope you pass along our web address, procellarx.co, to your friends and colleagues. And please leave us a positive review on iTunes. Be sure to check out our previous podcasts and check us out on LinkedIn at ProcellaRX. Join us next time for another edition of Software Quality Today.