Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Dori Gonzalez-Acevedo (00:10):
Podcasting from Alexandria, Virginia, just a few miles from Washington, DC, where we all hope doing what is right the first time is everyone's top priority. This is Software Quality Today, presented by ProcellaRX: a podcast about the trends and challenges of software quality, testing, and computerized system validation, and the people who are leading the way. Hear interviews with special guests and news from customers and vendors. I'm your
(00:31):
host, Dori Gonzalez-Acevedo, and welcome to today's episode. Welcome to another episode of Software Quality Today. I'm your host, Dori Gonzalez-Acevedo.
Today I have the pleasure of interviewing Jacqueline Davidson, owner of Davidson QA Consulting and head of regulatory intelligence and innovation at Sware. Jackie is
(00:54):
an IT quality management and compliance professional with over 25 years of experience in life sciences, spanning pharmaceuticals, biotech, medical device diagnostics, and clinical labs. Jackie enjoys helping small and large companies streamline their quality processes, especially as it pertains to IT quality assurance, software lifecycle
(01:15):
management, quality risk management, and software validation and verification. Jackie brings a wealth of expertise in CSV and regulated system deployments. Today we explore the boundaries and the levels of validation that are required for different systems, who's responsible between vendors and sponsors, and tips for how to prepare for audits, and she delves into SOUP: software of
(01:36):
unknown provenance. This might be a new term for folks, and I'm hoping that you enjoy our conversation today, to learn more about how it's playing a bigger role within our industry. What does it mean, and how do we need to apply CSV principles for industry moving forward? How can companies demystify the
(01:56):
assurance of open-source SOUP applications so that they can be inspection-ready and able to speed the pace of innovating products, to serve the higher purpose of promoting product quality and patient health and safety? So without further ado, please welcome Jackie. All right, well, welcome, Jackie. How are you today?
Jackie Davidson (02:21):
I'm great.
Dori, how are you?
Dori Gonzalez-Acevedo (02:22):
Good. Well, thanks for joining us on Software Quality Today. I'm really excited about our conversation today, because we've had some conversations over the last year getting to know each other, and I'm really happy we get this chance to record it for real.
Jackie Davidson (02:34):
Yeah, well, thank you. I'm glad to be here. This is the first time I've ever been on a podcast, so I'm a newbie to this. But I was looking forward to chatting with you and imparting some information where I can.
Dori Gonzalez-Acevedo (02:47):
Awesome. All right, so Jackie, I'm going to ask you for a brief introduction as the head of regulatory intelligence and innovation at Sware, as well as providing consulting services. Tell us what you do and how you kind of came into the space.
Jackie Davidson (03:03):
Well, I've been in the biotech, life sciences, pharma space for, well, I actually looked the other day, and I'm closing in on 30 years. I started out in medical device down in San Diego after grad school and ended up moving up
(03:25):
here in the mid-90s and working with Alza Corporation for quite some time in IT. At the time, I was doing more training and technical writing, but they were really struggling with validation, because Part 11 had just come out. Nobody knew what to do. They were really confused around testing. And even at that
(03:46):
stage, we were trying to write lifecycles and stuff like that, and come up with a sensible way to approach it as opposed to "test everything." So from the first time I saw Part 11 and saw people panicking through today, I've been trying to get people to realize that quality is about making the right thing easy. And
(04:07):
validation is not necessarily an exercise in paperwork, but an exercise in thought and compliance, and how you're going to come up with evidence that truly crystallizes what you tested and the application's suitability for intended use, as opposed to 1,500 pages.
Dori Gonzalez-Acevedo (04:31):
Yes, and therein lies the challenge, right? And over the course of our careers, we've seen, I'm sure, a lot of variety in how to get there, some more elegant than others, I should say. And along those lines, though, I know you have some passions and things that
(04:53):
you're writing today and things that you are really interested in, in sparking some interest in the community.
Jackie Davidson (05:00):
Well, I've got a couple things that I really love. I have an odd penchant for supplier audits, especially in the area of IT, because I feel like getting an understanding of your suppliers at an early stage is going to help you, in the end,
(05:22):
have a better relationship with them overall, help you have a better validation or software assurance outcome, and help you head problems off at the pass. And as part of that, very conveniently, this year GAMP came out with their second edition, which of course focuses on that. And then there's the CSA initiative, which finally came out in September. I know it's out for comment; I'm not
(05:47):
sure when we're going to expect to see that released for real, right, as final guidance.
Dori Gonzalez-Acevedo (05:56):
It sparks conversation. That's enough, I guess.
Jackie Davidson (05:58):
Yeah, yeah. And to the end of that, that's been a real source of confusion for people. And to me, I feel like it's one of those moments where all the little threads are coming together: the little threads of how do we assess risk? How do we assess suppliers? How do we know that we're getting the right amount of testing? How do we know we have the right deliverables? And that falls across the spectrum
(06:20):
for CSA. So I just finished a white paper for Sware that will be published probably after the first of the year, that discusses how CSA is a bridge to innovation for life sciences, and how we can leverage those approaches to come up with a common-sense way to show that our applications are fit for intended use and have the right
(06:43):
deliverables. And that makes people less nervous around, "Oh my God, can I really take this approach?"
Dori Gonzalez-Acevedo (06:51):
Well, one of the things that you're just saying that strikes me as interesting is that, you know, I was a chemist, and I made active pharmaceutical ingredients. So my upbringing is from manufacturing, and the relationships with our suppliers in the manufacturing area are very, very intimately tied. And I'm wondering what
(07:12):
your thoughts are around the fact that that supply-chain relationship has been happening for a long time, right? And most of the auditors that we know, you and I, professionally, are from that manufacturing mindset, but haven't yet shifted or moved towards an understanding of how to do IT vendor management or IT audits,
(07:35):
and what that relationship is between IT and the business and the sponsor. Talk about that.
Jackie Davidson (07:42):
I think you're right; that's been evolving, and that was a huge part of my job. I was with Jazz Pharma before I went to Sware, and we had a tremendous program around IT vendor auditing and IT vendor management. And then there was some struggle at points, because it was like, okay, so at what point, you know, if we're
(08:04):
bringing in consultants, at what point is that a staff augmentation versus having to qualify and train the staff? When you're bringing in an application, you know, who's controlling the application? Is it our internal IT department, and therefore, you know, do we need to do a different level of testing and training? Or is it the vendor that's managing it? You've got systems like Veeva that have three-times-yearly
(08:28):
releases, and you have to be able to get those validated in time. You know, you can't wait for that to pass, because you'll be out of compliance. So at the get-go, we were finding that we really had to vet vendors early on, that it was more than just, "Okay, we've selected this vendor, now go audit it." Because
(08:50):
that was my experience for many years. It was like, "Jackie, we signed a contract with such-and-such; go out and audit them." And I'm like, okay, you just let the horse out of the barn, because if this vendor turns out to be problematic, now we don't know; you've already signed the contract. And that had been my experience; I had seen that happen numerous times. So, pivoting to an earlier audit.
(09:14):
And that was one of the things. And, oddly enough, the pandemic really helped, because it made it easier and quicker to do an audit, because you were doing it virtually. And really, you don't need to be on site for audits, because most of them are decentralized. You're not going to look at server rooms anymore.
(09:35):
You're not looking for temperature controls. You're not looking for the same things that you would need to when I was doing audits for labs that I worked for, where we would have to go walking through the manufacturing plant, right, look at the clean rooms, you know, check pest control, check eyewashes, check spill kits. All those things, you're not
(09:56):
needing to do that. So it was a lot easier to get more proactive with the IT audits, and also, as part of that, to really develop a relationship with the vendor. And I have kind of a little different philosophy as an auditor. I'm definitely very risk-based, but I try to make sure that, well, I've always been told I'm the nicest auditor they've ever had, which is a good thing and a bad
(10:20):
thing. I mean, I'm nice because I'm nice; I'm not nice because I want something. But in the end, I get what I want. Because now nobody who ever listens to this podcast will ever want me, because they always give me a lot of information, because I get into conversations with them: "Tell me a little more about this process. Can you show me these things?" I ask for a lot of stuff up front to save time, because nobody in their
(10:44):
right mind wants to be on Zoom calls for 16 hours, you know, two days of eight-hour calls in a row. So I try to be really proactive in terms of sending a comprehensive questionnaire upfront, asking them to share as much information as they can and feel comfortable with upfront. Because another thing, as you know, that IT auditors hate, and probably the auditee hates, is
(11:05):
going step by step through procedures and making you read them on screen. So being really proactive upfront, using automation to every extent possible, whether it's a sharing platform where we can share our SOPs, share the data, share the objective evidence, or whether it's
(11:27):
being able to send them stuff to fill out proactively, so that I can then compare notes and add in. So there's all those little things that make a difference.
Dori Gonzalez-Acevedo (11:37):
That's really helpful. Do you have any tips for sponsors to prepare better for those things with their vendors? I guess there are sort of two sides: as a third-party auditor coming in, right, there's the vendor, and you're mediating that with the
(11:58):
sponsor. So which side, what do they both need to bring to the table to be effective?
Jackie Davidson (12:00):
I think that when you're a pharma company, when you're the sponsor, first of all, have a clear set of policies that tell you, okay, what kind of audit do I need? And by this I mean, because sometimes you're auditing software that's part of a device, so you're really auditing more than just
(12:21):
the software; then there are just simple software systems. So first of all, understand what you're auditing, and have a policy that helps to guide you, almost like a rubric or a flowchart. I have a flowchart where it's like, if it's this, I'm going to do this. So that's the first thing. The second thing is, I have a set of questionnaires, like questions
(12:43):
that I have, and I map them out to the relevant regulations. So I know, if I'm doing medical device, I need to look at Part 820; I need to then look at suppliers; I need to then look at all the things under that. If it's a European organization, I might need to look at Annex 11; if it's US and European, I might be looking at Annex 11 and Part 11. So understanding the sponsor,
(13:07):
understanding what they need to audit, what their relevant regulations are, what their concerns are, is one thing. And from there, having questionnaires, and you're going to need to customize those. So I have several questionnaires in my toolkit, if you will, and I customize them depending on, like, if I'm looking at a company that's just mining data. So for
(13:28):
example, we did an audit of a company down in Texas that mined data out of military databases to get, like, patient data for different types of things that were prescribed. And you can then use that for data
(13:48):
comparisons of maybe your product versus another product. It wasn't a software system, so I knew I didn't have to have all the questions about that; they weren't going to access any systems that were internal to the company that I was auditing for.
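[Editor's note: the rubric Jackie describes, mapping what you are auditing to the regulations a questionnaire should cover, could be sketched very roughly as a decision table. The function name and the mappings below are illustrative only, not an authoritative regulatory matrix or Jackie's actual flowchart.]

```python
# A hypothetical sketch of an audit-scoping rubric: given a few
# facts about the target, list the regulations the questionnaire
# should map to. Purely illustrative; real scoping needs quality
# and regulatory review.

def applicable_regulations(is_medical_device: bool,
                           us_market: bool,
                           eu_market: bool) -> list[str]:
    """Return regulations an audit questionnaire might cover."""
    regs = []
    if is_medical_device:
        regs.append("21 CFR Part 820")   # US device quality system regulation
    if us_market:
        regs.append("21 CFR Part 11")    # electronic records and signatures
    if eu_market:
        regs.append("EU Annex 11")       # computerised systems (EU GMP)
    return regs

# Example: a device sold in both the US and Europe.
print(applicable_regulations(True, True, True))
```

A data-mining service with no regulated-system access, like the Texas example, would fall outside all three branches and get a much narrower questionnaire.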
Dori Gonzalez-Acevedo (14:01):
So it sounds like you do your homework. Yeah, doing your homework ahead.
Jackie Davidson (14:05):
Knowing: are they going to touch your data? Are they going to control anything? Are we controlling it? Do we have control over when things get rolled out, or are we just mining data? So knowing the purpose, knowing the system, doing your homework ahead of time, yeah, is number one. Number two is knowing what questions to ask based on that, and being
(14:27):
wise about the time. Always talk to your auditee. Once you kind of have done this homework, talk to them, figure out the agenda. You know, make sure that the agenda is appropriate to what you're doing. Setting up fair times, all these things, being cognizant of, you know, if you're auditing
(14:48):
somebody in Bangalore, you know, come up with something that's fair to both of you, because nobody's on their game if they're up at midnight, right? And unfortunately it's always one or the other person. So there's a lot of prep work. Having somebody on your team that handles the scheduling, so you know when these audits are
(15:13):
going to happen, whether they're scheduled or whether they're for cause. You know, really talking to your internal quality team, understanding, if they're for cause, what are the outstanding issues that have driven this audit; if they are routine, also reviewing all that internal stuff. So it's really:
(15:34):
before you go out and audit, don't audit blind. And I've seen those; I've been on the other end of those, where they're just coming in, and you can tell they know nothing about what you're doing, right?
Dori Gonzalez-Acevedo (15:44):
Yep. So that's great. Thanks for sharing that, because, and I know we kind of maybe swerved a little bit, but this is a hot topic as we're shifting towards, or wanting to shift towards, better risk-based approaches. And that risk-based approach extends through our vendors, and educating our vendors as well as the sponsors
(16:07):
on what the shift is, and how we all have to come to the table and redesign what that is, right? In this new way.
Jackie Davidson (16:16):
It's not something that's new. This is the part that, kind of, back in... gosh, it's gotta be 2011 or 2012, I know...
Dori Gonzalez-Acevedo (16:27):
We're going to age ourselves here; go ahead.
Jackie Davidson (16:31):
I was at a company down in San Diego; it was a lab. And in fact, we had a document management system that we didn't like, so we were switching, and we actually ended up switching to Veeva, and we were Veeva's first installation of that kind. And it was, in fact, where I met the person
(16:53):
who's now the CEO of Sware; he was then the VP of sales. His name is Brian Ennis. And Brian came to our site, and he's like, you know, "What can we do to make this work?" At this point, I was a consultant to this little genomics lab, and I hadn't been part of the vendor audit process, because they had done that ahead of me. So: what can you show me? What do you have? What can I leverage? If
(17:16):
you're telling me that you're going to be releasing these updates three times a year, what can we leverage? Can I review it? So I kind of did a secondary mini-audit, to see what I could leverage, and actually did some testing against it in their sandbox. I wanted to see, because this was the first time we had done this; it was completely new to leverage vendor
(17:38):
documentation, right, in that way. So let me do some testing. I did an initial kind of smoke test: okay, they really did test these things; it really did work. So we created a validation plan. And that's another part: if you're going to leverage vendors, and you're going to go out and audit them, make sure you specify that in your project plans, in all
(18:00):
your vendor audit documentation, and in your validation plan. Make sure you're doing this ahead of time. If you document what you're going to do, and you do it, unless you're grossly missing something, right, you're generally going to come out fine in an inspection. In all those years that I mentioned, I
(18:23):
have never once in an inspection been called out on software validation, on leveraging vendor documentation, or on how we've managed change, because we've always clearly set it out. So if I'm auditing my vendor with the intention of using their documentation, I make sure, right, that's there. But there's
(18:44):
something else that's come to my attention recently, especially with the advent of the CSA approach, because I see that people are becoming more comfortable with that risk-based approach and leveraging the vendors. And there are certainly validation automation platforms out there that will help you manage that workload, especially if you're one person
(19:06):
managing 15 of those platforms, and knowing the stress of, okay, it's November 15th, and I've got five of these, and how am I going to get it all done, and I don't have a big team? So being able to leverage that, being able to justify what I'm doing, being able to regression-test, like, the smaller things that I might
(19:27):
have configured: those things are really important, because I've got a known vendor. And generally speaking, the more robust your vendor is, when you come out of an audit and you know this vendor is really robust, I don't have any critical or major observations, and if I've got a minor one, they're able to address it in a timely
(19:49):
manner, so I'm pretty confident that I can, you know, use that CSA approach, use that GAMP approach, to leverage what they've got. That's great. But what happens when you are a medical device company, you are a biostatistician, you're doing something completely new, and
(20:09):
you're leveraging software of unknown provenance, which we'll call SOUP, or off-the-shelf software, OSS, or as the FDA calls it, OTS? You don't get to look at the source code. You don't have all the documentation. You don't know who all has been working on it. It's kind of crowdsourced, this thing; it's
(20:33):
something like R, which is a very well-documented piece of SOUP.
Dori Gonzalez-Acevedo (20:38):
Can I slow you down for a moment? Let's circle back and educate our listeners, because this is a new area for some, and I think it's worthwhile kind of just taking a pause and going through these use cases, Jackie, that you're going to identify here. So: software of unknown provenance, or SOUP
(21:01):
software. How would you define that?
Jackie Davidson (21:04):
A lot of the time, it can be something that you're downloading from the internet. That's at the simplest level: something you're downloading from the internet, like a PDF converter. But it could be something bigger. Like, you know, I mean, R is very well documented as a statistical processing program.
(21:27):
But what people don't realize, in Big Pharma and a lot of companies... so maybe
Dori Gonzalez-Acevedo (21:34):
This is so important, because there are a lot of data scientists out there, and there's lots of software being developed in small buckets and in an open-source sort of way. Let's call them widgets, for lack of a better word, right? They're widgets that you can go and grab from the internet, right, through
(21:57):
a university or a small company, or whatever they're developing, and you can utilize that in your process. Right. And so that's what we're talking about: this is kind of like research and university stuff on steroids, right? Pulling that native technology in and actually integrating it into a medical device,
(22:21):
potentially, right?
Jackie Davidson (22:22):
And rightfully so, it makes a lot of companies squidgy about this, because you don't know the pedigree. It's kind of like going to the dog pound, and I love dogs, and getting that puppy. It looks like a Labrador; you bring it home, but three weeks later it sprouts ears and sprouts a furry tail, and you've got a Collie. And you
(22:45):
still love it just as much, but you don't know its pedigree. And this is the same thing. Sometimes you get one of these widgets; you don't know what's in the code. You don't have any control over its lifecycle. If they update it, or they patch it, they're not going to let you know. So especially if you're using one, maybe you're
(23:05):
downloading it, or it's something that's like a subscription, it could change on you unexpectedly. At the same time, I'd say that SOUP and open source get a bad rap. Because I'd like to point out, too, that at this very moment, you and I are using SOUP, and you don't even know it. If you're on the
(23:29):
internet, you're using TCP/IP, which was invented in the 1970s as part of research on packet radio. It morphed into the internet with a very small user base. And I pretty much guarantee you that no quality manager that I've ever met, including myself, has ever... maybe you have? Have you ever thought, Dori, about
(23:51):
validating TCP/IP?
Dori Gonzalez-Acevedo (23:55):
No way. Not at all.
Jackie Davidson (23:57):
Nope. So some of this stuff is so ingrained that over time, it's accepted by use, right? So to the same extent, people come to me, like, "Jackie, I don't know what to do with this." They would take one of these widgets, and they would try to pick it apart and validate it. And, you know, back in the
(24:20):
day, I remember literally sitting down with these old guys from IBM that were our validation team at Alza, and they'd be like, "Well, you know, first we have to turn the machine on and make sure it boots up." I'm like, well, that's an IQ. Let me point out that if we can't get the machine to boot, right, then it's an automatic fail, so we can't use
(24:43):
it. So let's think about getting past that IQ, especially because nowadays there is no IQ. You know, it's more like an OQ, or a performance qualification, or UAT. So the first thing that I tell people, if you're going to use any of this open-source software, is to understand
(25:05):
what kind of open-source software you're using, and understand the definition. Because in the eyes of the FDA, anything where you don't control the lifecycle is kind of off-the-shelf, open-source. Because you can't look at the source code of
(25:29):
something like MasterControl. So to an extent, off-the-shelf is a form of OSS, or SOUP, even though you're buying it from a vendor. And there's where your vendor audit comes in; go back to that vendor. Right? Yeah, I know who I'm buying it from; they're well established.
Dori Gonzalez-Acevedo (25:45):
So this then also implies why fit for purpose, or intended use, is so critically important, right? Because if those widgets are being used in a critical function, or a critical calculation, that's where the rub lies, right? If they're being used behind
(26:05):
the scenes, for all intents and purposes to facilitate operations and getting things moving, blah, blah, blah, right, that's not such a big deal. But if they're being used to make some critical decision points, right? So let's talk about that.
Jackie Davidson (26:20):
Yeah. And also, as part of that, I want to point out there's one flavor of SOUP that I kind of didn't talk about that I probably should. And that's when a device manufacturer internally develops a software component as part of a device, but they haven't documented or followed a software lifecycle process. And I've encountered this working with a couple
(26:42):
of university offshoots, where they started out largely as a research organization, and they've developed this device, and then they're like, okay... and I was brought in to help them kind of backfill the design history.
Dori Gonzalez-Acevedo (26:54):
I've seen this.
Jackie Davidson (26:57):
Oh, we've got this whole device. I walk into this lab, and there's this giant machine, literally. Like, okay, well, maybe we should start out with: what is this for? Let's write the user requirements, then trace back and get the technical specifications and develop that design history. Yeah. So I would say, let's just make sure we break out the kinds of SOUP, because not all SOUP is proprietary software that you purchased,
(27:20):
and not all...
Dori Gonzalez-Acevedo (27:21):
...is just software. So yeah, I laugh, because I've had that same experience auditing a university, and they're brilliant, brilliant scientists, right? They can find things that you had no idea even existed before, right, and they're utilizing them in very great ways. So they have the big-picture vision of what this
(27:42):
thing is going to do in the future, but they have zero documentation on how they got to where they are. And so, for everyone that's listening, you have to have a design history file for that with the FDA to get clinical trial use. If you don't have some of that information, or at least you can't source it, or you can't
(28:03):
figure out where it came from, that's a big deal, you know, depending on what the experiment or the device is that you're going to submit for.
Jackie Davidson (28:15):
Yeah. So here's my little nugget for any professors who are listening out there, because I've had professors scream, "Why do we need quality?" And I'm like: because if you ever want to sell what you're inventing right now, if you want to bring it to market, you're going to have to go through trials. So document along the way. You know, like, just write it down.
Dori Gonzalez-Acevedo (28:37):
Yeah, it could be a lab notebook, all that sort of stuff. It does not have to be fancy at all; we don't care. It's just that it would be good to know. I mean, and again, being a scientist, they do do that. It's just... I know, me translating my lab notebook into process validation documentation was an effort that I don't want anyone else to have to go through. So, yeah.
Jackie Davidson (28:59):
Let me just back up on our definition. I just wanted to give this to help people, because this is new to a lot of people. So you've got that SOUP that we just talked about that's out on the internet; you're using it, it's part of your life. It's like air; nobody is validating air. TCP/IP is the same thing. Then there's SOUP that's not off-the-shelf software: something that
(29:22):
we have developed ourselves. You know, Dori and I are inventing an invention, and we wrote all that software, but we didn't write it down. Then there's off-the-shelf software that isn't SOUP, and that can be two kinds. It's usually, like the FDA says, a generally purchased, generally available system that you're using as a device
(29:44):
manufacturer but that isn't part of the device. That can be a piece of software controlling a piece of lab equipment, like a spectrometer. It isn't included in your device, but you need to document somewhere that you're using it, that you validated it, that your device is calibrated, so you know you're getting the same results every time. That's something that people don't always think about. And then there's off-the-shelf software
(30:04):
that is SOUP: stuff that you might generally purchase or download, like those widgets, Dori, we were talking about, but that you're incorporating into the device. So it's like embedding a software library, or a system, in a device. And then, of course, lastly, there are those widgets or programs that get invented based on SOUP. And then, of course,
(30:25):
there's freeware, and we'll kind of leave that out, because that's a whole other can of worms on its own. But those are kind of your "use at your own risk" categories. So if you're downloading a program from somewhere, and this goes for anybody, anywhere in the world, doing anything: if you're downloading a free program, you want to really be
(30:47):
careful, you want to document it, you want to use it at your own risk. And if it's going into any kind of regulated product, you want to test the heck out of it. I mean, that's where risk-based comes in: my risk has just skyrocketed, because I don't know where it's coming from. You know, it's like somebody left a puppy on my doorstep. I don't know where it's from; it could be vicious, could be great. Gotta
(31:08):
check it out. So, you don't know the pedigree; the less pedigree you know, the more homework you have to do.
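[Editor's note: the taxonomy Jackie walks through could be sketched, very loosely, as a small classifier. The category labels below paraphrase her descriptions; they are not official FDA or IEC 62304 terms, and the function is an invented illustration.]

```python
# Illustrative classification of the software categories described
# in the conversation. Labels paraphrase the discussion and are
# not regulatory terminology.

def classify(developed_in_house: bool,
             lifecycle_documented: bool,
             part_of_device: bool) -> str:
    if developed_in_house:
        # In-house code without a documented lifecycle is still SOUP.
        if lifecycle_documented:
            return "in-house software (documented lifecycle)"
        return "SOUP (in-house, undocumented lifecycle)"
    if part_of_device:
        # Purchased or downloaded code embedded in the device.
        return "OTS incorporated into the device (treat as SOUP)"
    # e.g. software controlling a spectrometer: document and validate
    # its use, but it is not part of the device itself.
    return "OTS supporting tool (document use and calibration)"

print(classify(False, False, True))
```

The university-offshoot case Jackie mentions, a device component built in-house with no lifecycle records, lands in the second branch, which is why she was brought in to backfill the design history.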
Dori Gonzalez-Acevedo (31:17):
So you're seeing this as a big trend, like, because there's also more Internet of Things coming to software as a medical device, right? We're going to be seeing more and more of this as we move forward, right?
Jackie Davidson (31:33):
Yeah. And I was at a conference recently where this came up as a thing, where they were talking about the new edition of GAMP. And they were like, "Well, what do I do with these systems?" And, you know, my answer to this person asking the question (I wasn't a speaker at the conference; I was just
(31:55):
there as a participant) was: you really have to look at the level of risk. And I know that goes back... and I know that out there somewhere, I can hear people going, "It's not just about risk." But risk is a big component. There's also, you know, that level of concern; there's what happens after
(32:17):
we assess. So I thought maybe I'd take a few minutes to step you through kind of one of those "What would Jackie do?" exercises.
Dori Gonzalez-Acevedo (32:23):
Sure, that's awesome.
Jackie Davidson (32:25):
So when you look at the FDA's guidance on the use of off-the-shelf software that they brought out in 2019, you know, they're talking about using it in devices. But also, when I think about it, it's not just devices; it's companies. You know, any pharmaceutical company where you're using big data, bioinformatics, data analytics: the software industry just can't
(32:48):
move fast enough to pick up and cater to your specialized needs, so quite often you're building your own. So the first thing to do is to assess and do a risk hazard analysis, like you were saying. Is this going to critically impact patient or product safety? You know, when I mitigate these risks, does it
(33:12):
bring that level of risk down to an acceptable level of concern? What am I doing with my residual risks? Am I performing a risk review? So if I had to take my first step, it would be: I'm going to do this risk hazard analysis, and I'm going to come up with a mitigation plan.
Dori Gonzalez-Acevedo (33:31):
But
you're doing that at the system
level of the of the soup, right?
Jackie Davidson (33:36):
I'm doing it because it's the SOUP and the use, like, how am I going to
Dori Gonzalez-Acevedo (33:41):
Use it, right? But still at the high level? Because I just want to differentiate: when we start saying a more risk-based approach, folks traditionally think of that, or hear that, as FMEAs, which is not what we're talking about here. That is really specifically for medical device
(34:02):
specific design considerations. So that's not what we're talking about here. We're elevating this conversation up higher, to the use, and what is going to be impacted as a whole.
Jackie Davidson (34:17):
So what are you going to automate? You know, what's this thing here for? Why are you even putting in this piece of software? Why are you using it? Why are you building it? Is it directly impacting patient or product safety? Or is it indirect, like, I'm doing data analytics to figure out, you know, where else I can sell this, or something like
(34:38):
that. So, you know, is it more of a commercial use? Is it more of an R&D use? Is it a definite safety use? You know, is it going to be part of a submission to the FDA? My risk just went up, because I'd better get it right. Am I going to be using this as a decision to keep a product on the market or pull it
(35:00):
off? Is it safety related? Is it part of a device? So what's the safety classification of the device? You know, is it Class I, II, or III? What level of risk is the device itself? So the higher the risk classification of the device, and its concomitant software that you're putting in it, the higher the overall risk of the project. So starting
(35:21):
right there, documenting that, and then going down to your system-level assessment.
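The intake questions Jackie walks through (intended use, direct versus indirect safety impact, submission use, device classification) can be sketched as a simple scoring record. This is a hypothetical illustration only: the field names, weights, and thresholds below are invented for the sketch and are not from GAMP, the FDA guidance, or Jackie's own templates.

```python
# Hypothetical sketch of a SOUP intake assessment. Weights and buckets are
# illustrative assumptions, not a standard or a real company template.
from dataclasses import dataclass

@dataclass
class SoupIntake:
    intended_use: str
    direct_patient_impact: bool   # directly impacts patient/product safety?
    part_of_submission: bool      # will it feed an FDA submission?
    market_decision: bool         # used to keep/pull a product from market?
    device_class: int             # 1, 2, or 3 if embedded in a device, else 0

    def overall_risk(self) -> str:
        """Roll the answers up into a coarse high/medium/low bucket."""
        score = self.device_class
        score += 3 if self.direct_patient_impact else 0
        score += 2 if self.part_of_submission else 0
        score += 2 if self.market_decision else 0
        if score >= 5:
            return "high"
        if score >= 2:
            return "medium"
        return "low"

analytics = SoupIntake("commercial data analytics", False, False, False, 0)
samd = SoupIntake("dosing algorithm in a Class III device", True, True, True, 3)
print(analytics.overall_risk())  # low
print(samd.overall_risk())       # high
```

The point of the sketch is that the answers are documented and repeatable, so the same questions drive every assessment rather than being re-argued each time.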
Dori Gonzalez-Acevedo (35:26):
Yeah. So one of the other things I'm hearing in what you're saying is also part of the trends that I see in the industry as a big-picture whole, right? We're being more complex than ever before, right? We're pulling together pieces and bits and stuff from all over, not just the proprietary information that a given sponsor is making.
(35:48):
But we're also leveraging other parts of good technology, if we talk about AI and machine learning as well, right? All of these things put together, the ecosystem of what we're building today is more interdependent on all of us,
right, rather than on a single company. And so the risk from a
(36:09):
business perspective also jumps up, right? Where were we, you know, a decade ago? A lot of stuff was all done in house, right? And now, in order to innovate, right, to get to the next level, we actually have to innovate with partnerships across a variety of things.
(36:30):
Right. So our due diligence goes up. And in some ways, you know, I don't want to get into legalese speak, right, but to put the legal stuff aside, it's very exciting, because what you can create is infinite. But you need to understand. I like the word provenance, because it really does kind of tie into all of
(36:55):
this, right? Really understanding all of the pieces, and doing like a heat map of where your risks are within that whole ecosystem. Yeah.
Jackie Davidson (37:05):
Yeah. And it's not just with software. In fact, I just read an article recently, and I'll have to pull it and send you the reference. It was about literally open source pharmaceuticals, where companies are cooperating with each other to develop, you know, like,
(37:25):
software, develop something new. It's like, you've got a piece of the puzzle, I've got
Dori Gonzalez-Acevedo (37:29):
a piece of the puzzle, and, right, how can we partner together? Right. And I think that partnership is also a common theme in how to do all of this moving forward, right? Because all of us have expertise in whatever thing we make, or service we provide, right?
Jackie Davidson (37:46):
And part of this too, going back to the software piece, is: as you're performing that criticality assessment, because you're pulling these disparate pieces, you need to be sure to look at the downstream impact on other systems. Is this SOUP or open source going to be part of an entirely new system? Are you improving an existing system? What impact is this
(38:07):
going to have, potentially, on product safety, quality, availability, like supply chain? You know, is this going to affect your supply chain in some way? This probably has more to do with software as a medical device, which is a whole other topic.
But you know, basically, if you have an open source component
(38:31):
that enables you to bring this device to the supply chain, to the virtual supply chain, whereas otherwise you couldn't do it, then it does completely affect that supply chain, and may fill a need for a life-saving drug or device. Now, going back to my favorite
(38:52):
thing, which, like I said, is auditing: I'm an audit and process geek. I love writing SOPs, I love writing procedures, and I love auditing. But what if I can't? Who's going to go out and audit the R institute? Like, you can't audit Amazon. You can't audit Atlassian. Those companies are
(39:13):
huge. Also universities, things like that: they're not going to have the ability to have you do that. So it's a similar approach, where you have to pull the information that you can. Like, Amazon and Atlassian do an amazing job of giving you so much information; I can go to their websites and get all the information, understand the intended use and everything. But if I've got a widget that was built by a bunch of super
(39:35):
smart scientists at some university, I probably can't audit it. So then I've got this thing. So after I've assessed the risks of this thing, after I've come up with a level of concern and what mitigations, I've got to figure out how I'm going to test it. And there are a few things that are going to go into that.
(39:57):
So: availability of any documentation. Is there any design documentation? Is there anything I can leverage? Do they have release notes? Are there any tests that I can leverage? Did they publish anything? Do I get notified of releases? Patches? So maybe I have to talk to my IT department a little
(40:19):
bit about, you know, how can we handle this? Can we sequester the thing in a sandbox during upgrades? How am I going to make sure I have appropriate test environments for it? And reviewing along the way: as I mitigate each of the risks, doing a risk review, which is something people don't do a lot of. If I can't audit it, I need to do a really
(40:40):
comprehensive risk assessment and testing, and I need to review risk to make sure it's really been mitigated by whatever measures I take and whatever testing I do, to make sure that not only is the system fit for intended use, but it isn't going to break anything else in my ecosystem.
Dori Gonzalez-Acevedo (40:58):
Yeah, it's exciting. I mean, I think a lot of this goes to how to continue to be creative in this space, right? Because here's the other thing: we can't develop everything, right? A lot of companies, you have to be
(41:22):
able to, and I know you and I have talked about this in the past, like, the trust factor, right? We need to be able to partner and be creative, and figure out how to trust those relationships enough to be able to advance our own ideas within the companies that we work with, right? And so the combination of software as a medical device with a drug, with, you know,
(41:48):
machine learning: all of that is happening, right? There are lots of startups that are working on some really cool stuff. And if we as an industry can't shift and make use of that, that is just concerning to me, right? Like, what are we going to do about that?
Jackie Davidson (42:09):
Well, and I think some of it, when you're thinking of using this stuff, goes back to that CSA approach, that thing that's concerned people, the CSA and GAMP camps: how do I know what I'm doing is enough? You know, I mean, everybody, we all have inspection fear. Even if you're
(42:30):
an inspector, you have inspection fear. So
Dori Gonzalez-Acevedo (42:33):
I was just talking to a customer this morning about this. And, you know, again: CSV, CSA, I don't care what you call it. It's just another term, in my opinion, where we're talking about software quality testing and good software quality practices.
(42:54):
And if we bring it back to that, which many industries, right, not just the life sciences and healthcare industries, many industries do very, very well, regardless of the words that we put on top of it as a regulated industry, then how can we leverage those best practices that are being done in the world today, across the board?
(43:15):
Really cool OT and IT sort of stuff, which raises much more important things that we need to talk about, right? So maybe not today on this podcast; maybe we'll come back and talk about it another time. But cybersecurity, yeah, much, much more important than, you know, down to some of the nuts and bolts of what we're
(43:36):
talking about from a validation testing perspective.
Jackie Davidson (43:39):
And we haven't even touched on that. But of course, if you've got a well-known application from a well-known vendor, and they're using, you know, Amazon data centers or something that's very well established, your cybersecurity risk would consequently be lower than if I'm getting an application from I don't know where, you know, to
(44:00):
convert, you know, to create my tool. Or maybe I'm grabbing something that was built in Python or R from somewhere, and I don't know who built it. I don't know if they put trapdoors in it. You don't know what you're getting. So again, having a cybersecurity expert in with you, talking to your IT
(44:23):
department, if, you know, cybersecurity isn't your primary wheelhouse. I'm definitely a compliance and risk geek. What I try to do, also, in these cases, is have quantifiable assessments. So, kind of going back to that
(44:43):
whole build-it-yourself thing back in the 90s (I'll age myself): we built everything. When I was at Alpha, it wasn't available to buy. We had a huge team of software developers, because we had to build everything that we needed. We had to build our own manufacturing resource planning software, everything. And we had to test it all, consequently. But we
(45:03):
knew where it came from; we could look at the source code. Now I can buy one, and I, you know, maybe know what impact it's going to have, but I can't look at the source code. So I have to take that into consideration. And the same thing with risk meetings.
So back then, I remember sitting in these very warm rooms. It seems like it always took place in a very warm room right after lunch, which is a combination of deadly things, everybody wanting a coffee and a cookie. You'd be sitting there having people argue over these risks, and I was like: okay, why don't we have 10 questions we ask about each and every one of
these things? You're going to break it down however you want, but 10 or 15 standard questions you ask: is it going to get used in a submission? Does it directly impact patient data? Patient safety, rather, product quality? Is it, you know, going to mean we need to do a recall? What are the things that are high and low risk, and score them. And then I've got this score sheet. So when I
(46:07):
get audited, I can say: this feature was high impact, high risk, so, you know, we're going to test it more. This feature, you know, it was the color of the interface; it has no impact on product safety or product quality, so we don't really care, we can test it less. And then multiplying that: so I take
(46:27):
those risks and actually use math, where I multiply out my device classification (three to one), my risk classifications, my level of concern, my residual risk, what I know about the vendor, and I come out with the amount that I need to test each thing. So I have, for
(46:51):
myself, created a series of tables, where I, you know, look at the intended use, the risk hazard analysis, level of concern, mitigation plans, risk reviews. Then I look at my different activities based on, like: if I have a major level of concern, with a high-risk piece of software, with a vendor that I
(47:12):
don't know, I'm going to require everything: a full audit of whatever I can get, a full validation plan, full test plan, hazard assessment, you know, user requirements if I can get them, a lot more testing, a lot more verification that things work.
(47:34):
Whereas if it's low risk, kind of like, it's running a piece of equipment that's not even impacting my drug or device, you know, I can do less testing. I can have maybe not no documentation, but I can do more ad hoc testing; I can have a lighter-weight test plan. Maybe I use more
(47:56):
unscripted testing for the lower-risk features, the lower-risk SOUP. But for my critical stuff, that's for a Class III, you know, major level of concern, I'm doing a lot of scripted testing. I'm looking at any changes that come through, and I'm applying more testing to those, because I don't know what downstream impact the software changes that that
(48:20):
developer released to us are. Yeah. So I don't know if that makes sense; I'm trying to encapsulate it without illustration, because I know that we're, yeah, it's
Dori Gonzalez-Acevedo (48:29):
It makes sense to me, because I live in those matrices a lot. And I know the practicalities of some of what you're talking about do, operationally, hit a bottleneck, right? And so my cautionary tale, from an operational-side perspective, is that we all need to be more
(48:55):
familiar and more comfortable living in the gray, right? Rather than the black-or-white, this-or-that sort of bucket. Because, you know, we're not infinite; we don't have infinite resources, time, and space, and to do everything is not
(49:15):
possible. So we need to identify the risks, and at least have those documented where a business decision has been taken and made, right? So that you can then follow up on that and see, you know, a year from now: was your risk decision appropriate? Right? Because I think it goes back to, also, you know,
(49:38):
really proving effectiveness in CAPAs; it's the same sort of thing. If we're not reviewing our risk evaluation operationally, in live production use, after feedback, we're not really learning, and we're not, you know, evolving from a product perspective and a risk management
(50:00):
perspective. And that is required, and should be part of the process as well. And that's the part that I see is often forgotten. I see it forgotten with CAPAs, I see it forgotten with risk, I see it forgotten with audit management and vendor management, right? We're really good at following the process of doing something the first time,
Jackie Davidson (50:23):
Right, like one and done.
Dori Gonzalez-Acevedo (50:26):
What we're not good at, and I'm making some broad statements here, is that we tend to lack on the follow-up and the follow-through, and the benchmarking of what we are today versus what we were 12 months ago, and then being able to re-baseline on that. And I think part of the resistance to
(50:49):
that is: well, we have so much more to do, right? The things that are coming in are not slowing down; they're accelerating, right. And so having a system like ResQ be able to track the audit findings, the CAPA findings, and have periodic ways to remind me, right, that this stuff is going
(51:12):
on, in a way that adds value: I think it's critical.
Jackie Davidson (51:18):
Yes. And risk review. So, see, one of the things, and this is, right, one of my little pet peeves: people say, oh, I did my risk assessment of the project. And then they say, oh, we're going to have a post-mortem at the end of the project, but then that never happens. So you never learn from it. The other thing is, you know, going back to a lot of what people do with validation
(51:40):
records, and this is across the board, it doesn't matter if it's CSA, CSV, SOUP, not SOUP: people create these things as static records. And then they just get shoved in document control, and you can't really reuse them. You can't really access them again. It's really hard in an inspection, because
(52:01):
you get three, four weeks ahead, and they'll give you 14 folders: hey, Jackie, for each of those applications that you QA, can you give me the records for all the changes? Well, that's where a system like the ResQ validation automation platform will really give you a lot of bang for your buck. Because, first of all, everything's there, so I don't have to go searching, which saves Auntie Jackie a lot of time.
(52:23):
The second thing is, things are reusable, and your records are no longer dead and static. Like, I come along, and now I've got my, you know, release R2 or whatever. And I can look back: so here's what happened in R1, here are the changes in R2, this is where I need to look at downstream impact. Is there any
regression? Can I review any risks that came up? So my risk review is living, like a living risk review. And, I'm a distance runner, and there's a saying that if you wait until you're thirsty to drink, it's already too late. Yeah. If you wait to evaluate risk until
(53:06):
things start breaking, it's going to cost you so much more time, effort, compliance risk, inspection risk, not to mention, like, your little box of CAPAs.
Dori Gonzalez-Acevedo (53:23):
Right,
because you
Jackie Davidson (53:25):
waited too long. You didn't do your homework. So, going back to the whole thing: your vendor audit, whether it's SOUP, if you can get one, or OSS, your risk review, these are all steps in doing your homework. And to be really honest, you're not reinventing the wheel every time. So if you've got a validation automation platform that
(53:48):
automates a lot of these workflows for you, like ResQ, I'm not having to rethink it every time. You know, I can plug in, like, I can have all my questionnaires in there, and I can have that as part of my validation package, so that I know when I'm going to audit Dori that, you know, I've got kind of a set of questions for her. I can make sure all my P's and Q's are done. Yeah, I'm not forgetting anything. I
(54:11):
shouldn't say P's and Q's; it's not about being picky. I can make sure that I've got a lot of that thinking outlined, so I can really concentrate on what's different. So I'm not doing the check-the-box activity every time; I'm doing the critical thinking: what's this for? Why am I
using it? What's the intended use? Who does it impact? What workflows does it impact? Beyond the obvious, and this is the other thing, beyond my obvious user group, who else is it impacting? And I'll give you a little example. I was working with a vendor; it was a pharmacy for a highly controlled
substance. And the pharmacy, unbeknownst to the company I was working for, had changed things there: they had run out of eight-digit numbers for their serial numbers, so they'd gone up to nine-digit numbers. But in doing so, they did not update their
(55:13):
own software. Consequently, they didn't let us know that we needed to update our own software. And I was doing just an audit, like a forensic audit, to see what lots came back. And all of a sudden, I wasn't seeing lot numbers for, like, several months. I'm like, this is weird; I know we released drugs. And it turned out, when we dug into it, that it was the vendor that didn't do their
stuff. Which told us, uh, we thought we were doing everything right the whole time. We thought we were staying up to date in our software. But another totally disparate party didn't know; there wasn't that downstream impact assessment. So sometimes you have to think out of the box, because I was like, why is this broken? We can accommodate, you know, these numbers; we can accommodate 20
(55:56):
digits. It was because the vendor software could only accommodate eight, and they'd gone to nine. Yeah. So
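Jackie's serial-number story is a classic silent format assumption. One defensive pattern (a hypothetical sketch, not the pharmacy's actual interface) is to make the expected format an explicit, loudly enforced check at the point of ingestion, so a vendor widening from eight to nine digits fails immediately instead of surfacing months later in a forensic audit:

```python
# Hedged sketch: turn the serial-number format into an explicit, testable
# assumption. The format rules here are hypothetical.
import re

# Widened from exactly 8 digits to 8-or-9 after the vendor's change.
EXPECTED_SERIAL = re.compile(r"\d{8,9}")

def ingest_serial(raw: str) -> str:
    """Accept a serial number only if it matches the documented format."""
    if not EXPECTED_SERIAL.fullmatch(raw):
        # Fail loudly: format drift upstream should surface at the interface,
        # not months later as silently missing lot records.
        raise ValueError(f"unexpected serial number format: {raw!r}")
    return raw

print(ingest_serial("123456789"))  # a 9-digit serial now passes
```

The design point is that the check encodes the downstream impact assessment the two parties never did: when the vendor changes format again, the first nonconforming record raises an error instead of vanishing.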
Dori Gonzalez-Acevedo (56:04):
Now, that interconnectedness. And it's not necessarily malicious by any means; it's just that there are disconnects, of not knowing how everyone's connected, right, upstream and downstream. Yeah. Well, Jackie, it has been a pleasure. We're going to have to do this again, because I think we can talk about many
(56:28):
other topics together for quite some time. It was a pleasure to have you today on Software Quality Today, and we will stay tuned until next time.
Jackie Davidson (56:39):
And I'm looking forward to seeing you on the Women in Validation call this afternoon.
Yes,
Dori Gonzalez-Acevedo (56:43):
Absolutely. Thank you, take care. Thanks for listening to Software Quality Today. If you like what you just heard, we hope you'll pass along our web address, procellarx.co, to your friends and colleagues. And please leave us a positive review on iTunes. Be sure to check out our previous podcasts, and check us out on LinkedIn at ProcellaRX. Join us next time for another edition of
(57:06):
Software Quality Today.